Nov 22 02:52:36 crc systemd[1]: Starting Kubernetes Kubelet...
Nov 22 02:52:36 crc restorecon[4748]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 02:52:36 crc restorecon[4748]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 22 02:52:36 crc restorecon[4748]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 02:52:37 crc restorecon[4748]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 22 02:52:37 crc restorecon[4748]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 22 02:52:37 crc restorecon[4748]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 02:52:37 crc restorecon[4748]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 02:52:37 crc restorecon[4748]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 22 02:52:37 crc restorecon[4748]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 22 02:52:37 crc restorecon[4748]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 02:52:39 crc restorecon[4748]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 22 02:52:39 crc restorecon[4748]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:40 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:41 crc restorecon[4748]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:41 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 
02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc 
restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 02:52:42 crc restorecon[4748]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 22 02:52:42 crc restorecon[4748]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 22 02:52:43 crc restorecon[4748]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 22 02:52:43 crc restorecon[4748]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 22 02:52:43 crc restorecon[4748]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 22 02:52:43 crc restorecon[4748]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 22 02:52:43 crc restorecon[4748]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 22 02:52:43 crc restorecon[4748]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 22 02:52:43 crc restorecon[4748]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Nov 22 02:52:44 crc kubenswrapper[4922]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 22 02:52:44 crc kubenswrapper[4922]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Nov 22 02:52:44 crc kubenswrapper[4922]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 22 02:52:44 crc kubenswrapper[4922]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
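Note on the long restorecon run above: every path reported as "not reset as customized by admin" carries a type such as container_file_t, which the targeted SELinux policy treats as a customizable type, so a default restorecon pass deliberately leaves it alone even when the on-disk label differs from the file-contexts database. A minimal sketch of how to inspect and, only if genuinely needed, force-reset such a label (assuming the stock targeted policy; forcing relabels under /var/lib/kubelet can break running containers, so this is illustrative, not a recommended step):

    # List the types the loaded targeted policy treats as customizable
    cat /etc/selinux/targeted/contexts/customizable_types

    # Show the current label on one of the skipped paths from this log
    ls -Z /var/lib/kubelet/plugins/csi-hostpath/csi.sock

    # -F forces contexts back to the file_contexts defaults, including
    # customizable types; plain restorecon without -F skips them, which
    # is exactly the behavior logged above
    restorecon -RFv /var/lib/kubelet/plugins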
Nov 22 02:52:44 crc kubenswrapper[4922]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Nov 22 02:52:44 crc kubenswrapper[4922]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.927806 4922 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.946439 4922 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.946471 4922 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.946483 4922 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.946493 4922 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.946501 4922 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.946510 4922 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.946518 4922 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.946525 4922 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.946533 4922 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.946541 4922 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.946549 4922 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.946557 4922 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.946564 4922 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.946572 4922 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.946579 4922 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.946606 4922 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.946614 4922 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.946622 4922 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.946630 4922 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.946638 4922 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 22 02:52:44 crc 
kubenswrapper[4922]: W1122 02:52:44.946646 4922 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.946654 4922 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.946661 4922 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.946669 4922 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.946677 4922 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.946685 4922 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.946692 4922 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.946700 4922 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.946707 4922 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.946714 4922 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.946722 4922 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.946730 4922 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.946737 4922 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.946745 4922 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.946752 4922 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.946760 4922 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.946768 4922 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.946775 4922 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.946788 4922 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.946798 4922 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.946807 4922 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.946909 4922 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.947032 4922 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.947042 4922 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.947050 4922 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.947058 4922 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.947066 4922 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.947080 4922 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.947091 4922 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.947100 4922 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.947108 4922 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.947117 4922 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.947125 4922 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.947133 4922 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.947150 4922 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.947158 4922 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.947166 4922 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.947334 4922 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.947345 4922 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.947353 4922 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.947361 4922 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.947369 4922 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.947377 4922 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.947385 4922 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.947393 
4922 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.947402 4922 feature_gate.go:330] unrecognized feature gate: Example Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.947410 4922 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.947424 4922 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.947433 4922 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.947448 4922 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.947459 4922 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.950002 4922 flags.go:64] FLAG: --address="0.0.0.0" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.950056 4922 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.950591 4922 flags.go:64] FLAG: --anonymous-auth="true" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.950613 4922 flags.go:64] FLAG: --application-metrics-count-limit="100" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.950629 4922 flags.go:64] FLAG: --authentication-token-webhook="false" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.950642 4922 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.950657 4922 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.950673 4922 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.950685 4922 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.950694 4922 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.950704 4922 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.950714 4922 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.950724 4922 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.950734 4922 flags.go:64] FLAG: --cgroup-root="" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.950743 4922 flags.go:64] FLAG: --cgroups-per-qos="true" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.950753 4922 flags.go:64] FLAG: --client-ca-file="" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.950762 4922 flags.go:64] FLAG: --cloud-config="" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.950771 4922 flags.go:64] FLAG: --cloud-provider="" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.950780 4922 flags.go:64] FLAG: --cluster-dns="[]" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.950795 4922 flags.go:64] FLAG: --cluster-domain="" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.950804 4922 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.950813 4922 flags.go:64] FLAG: --config-dir="" Nov 22 02:52:44 crc 
kubenswrapper[4922]: I1122 02:52:44.950822 4922 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.950832 4922 flags.go:64] FLAG: --container-log-max-files="5" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.950873 4922 flags.go:64] FLAG: --container-log-max-size="10Mi" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.950883 4922 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.950891 4922 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.950901 4922 flags.go:64] FLAG: --containerd-namespace="k8s.io" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.950912 4922 flags.go:64] FLAG: --contention-profiling="false" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.950921 4922 flags.go:64] FLAG: --cpu-cfs-quota="true" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.950931 4922 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.950943 4922 flags.go:64] FLAG: --cpu-manager-policy="none" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.950953 4922 flags.go:64] FLAG: --cpu-manager-policy-options="" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.950999 4922 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951008 4922 flags.go:64] FLAG: --enable-controller-attach-detach="true" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951018 4922 flags.go:64] FLAG: --enable-debugging-handlers="true" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951027 4922 flags.go:64] FLAG: --enable-load-reader="false" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951036 4922 flags.go:64] FLAG: --enable-server="true" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951045 4922 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951058 4922 flags.go:64] FLAG: --event-burst="100" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951069 4922 flags.go:64] FLAG: --event-qps="50" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951078 4922 flags.go:64] FLAG: --event-storage-age-limit="default=0" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951088 4922 flags.go:64] FLAG: --event-storage-event-limit="default=0" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951097 4922 flags.go:64] FLAG: --eviction-hard="" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951108 4922 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951119 4922 flags.go:64] FLAG: --eviction-minimum-reclaim="" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951129 4922 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951138 4922 flags.go:64] FLAG: --eviction-soft="" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951147 4922 flags.go:64] FLAG: --eviction-soft-grace-period="" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951157 4922 flags.go:64] FLAG: --exit-on-lock-contention="false" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951166 4922 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951175 4922 
flags.go:64] FLAG: --experimental-mounter-path="" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951183 4922 flags.go:64] FLAG: --fail-cgroupv1="false" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951192 4922 flags.go:64] FLAG: --fail-swap-on="true" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951201 4922 flags.go:64] FLAG: --feature-gates="" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951212 4922 flags.go:64] FLAG: --file-check-frequency="20s" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951221 4922 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951230 4922 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951239 4922 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951248 4922 flags.go:64] FLAG: --healthz-port="10248" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951258 4922 flags.go:64] FLAG: --help="false" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951268 4922 flags.go:64] FLAG: --hostname-override="" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951277 4922 flags.go:64] FLAG: --housekeeping-interval="10s" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951287 4922 flags.go:64] FLAG: --http-check-frequency="20s" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951297 4922 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951309 4922 flags.go:64] FLAG: --image-credential-provider-config="" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951320 4922 flags.go:64] FLAG: --image-gc-high-threshold="85" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951331 4922 flags.go:64] FLAG: --image-gc-low-threshold="80" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951342 4922 flags.go:64] FLAG: --image-service-endpoint="" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951353 4922 flags.go:64] FLAG: --kernel-memcg-notification="false" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951365 4922 flags.go:64] FLAG: --kube-api-burst="100" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951375 4922 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951387 4922 flags.go:64] FLAG: --kube-api-qps="50" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951397 4922 flags.go:64] FLAG: --kube-reserved="" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951406 4922 flags.go:64] FLAG: --kube-reserved-cgroup="" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951415 4922 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951424 4922 flags.go:64] FLAG: --kubelet-cgroups="" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951433 4922 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951442 4922 flags.go:64] FLAG: --lock-file="" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951450 4922 flags.go:64] FLAG: --log-cadvisor-usage="false" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951461 4922 flags.go:64] FLAG: --log-flush-frequency="5s" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951470 4922 flags.go:64] FLAG: --log-json-info-buffer-size="0" Nov 22 
02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951483 4922 flags.go:64] FLAG: --log-json-split-stream="false" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951492 4922 flags.go:64] FLAG: --log-text-info-buffer-size="0" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951501 4922 flags.go:64] FLAG: --log-text-split-stream="false" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951510 4922 flags.go:64] FLAG: --logging-format="text" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951519 4922 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951529 4922 flags.go:64] FLAG: --make-iptables-util-chains="true" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951540 4922 flags.go:64] FLAG: --manifest-url="" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951549 4922 flags.go:64] FLAG: --manifest-url-header="" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951562 4922 flags.go:64] FLAG: --max-housekeeping-interval="15s" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951572 4922 flags.go:64] FLAG: --max-open-files="1000000" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951583 4922 flags.go:64] FLAG: --max-pods="110" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951592 4922 flags.go:64] FLAG: --maximum-dead-containers="-1" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951601 4922 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951610 4922 flags.go:64] FLAG: --memory-manager-policy="None" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951620 4922 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951629 4922 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951638 4922 flags.go:64] FLAG: --node-ip="192.168.126.11" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951647 4922 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951675 4922 flags.go:64] FLAG: --node-status-max-images="50" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951684 4922 flags.go:64] FLAG: --node-status-update-frequency="10s" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951693 4922 flags.go:64] FLAG: --oom-score-adj="-999" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951702 4922 flags.go:64] FLAG: --pod-cidr="" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951710 4922 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951725 4922 flags.go:64] FLAG: --pod-manifest-path="" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951734 4922 flags.go:64] FLAG: --pod-max-pids="-1" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951743 4922 flags.go:64] FLAG: --pods-per-core="0" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951752 4922 flags.go:64] FLAG: --port="10250" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951761 4922 flags.go:64] FLAG: --protect-kernel-defaults="false" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951770 4922 flags.go:64] FLAG: 
--provider-id="" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951779 4922 flags.go:64] FLAG: --qos-reserved="" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951788 4922 flags.go:64] FLAG: --read-only-port="10255" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951809 4922 flags.go:64] FLAG: --register-node="true" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951818 4922 flags.go:64] FLAG: --register-schedulable="true" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951827 4922 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951876 4922 flags.go:64] FLAG: --registry-burst="10" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951885 4922 flags.go:64] FLAG: --registry-qps="5" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951894 4922 flags.go:64] FLAG: --reserved-cpus="" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951903 4922 flags.go:64] FLAG: --reserved-memory="" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951916 4922 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951926 4922 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951935 4922 flags.go:64] FLAG: --rotate-certificates="false" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951944 4922 flags.go:64] FLAG: --rotate-server-certificates="false" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951954 4922 flags.go:64] FLAG: --runonce="false" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951964 4922 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951973 4922 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951983 4922 flags.go:64] FLAG: --seccomp-default="false" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.951993 4922 flags.go:64] FLAG: --serialize-image-pulls="true" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.952037 4922 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.952048 4922 flags.go:64] FLAG: --storage-driver-db="cadvisor" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.952058 4922 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.952068 4922 flags.go:64] FLAG: --storage-driver-password="root" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.952077 4922 flags.go:64] FLAG: --storage-driver-secure="false" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.952086 4922 flags.go:64] FLAG: --storage-driver-table="stats" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.952095 4922 flags.go:64] FLAG: --storage-driver-user="root" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.952103 4922 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.952113 4922 flags.go:64] FLAG: --sync-frequency="1m0s" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.952122 4922 flags.go:64] FLAG: --system-cgroups="" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.952139 4922 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.952155 4922 flags.go:64] FLAG: 
--system-reserved-cgroup="" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.952164 4922 flags.go:64] FLAG: --tls-cert-file="" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.952173 4922 flags.go:64] FLAG: --tls-cipher-suites="[]" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.952185 4922 flags.go:64] FLAG: --tls-min-version="" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.952194 4922 flags.go:64] FLAG: --tls-private-key-file="" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.952203 4922 flags.go:64] FLAG: --topology-manager-policy="none" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.952212 4922 flags.go:64] FLAG: --topology-manager-policy-options="" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.952220 4922 flags.go:64] FLAG: --topology-manager-scope="container" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.952230 4922 flags.go:64] FLAG: --v="2" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.952241 4922 flags.go:64] FLAG: --version="false" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.952252 4922 flags.go:64] FLAG: --vmodule="" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.952263 4922 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.952272 4922 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.952498 4922 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.952508 4922 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.952516 4922 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.952525 4922 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.952533 4922 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.952541 4922 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.952550 4922 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.952560 4922 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.952569 4922 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.952580 4922 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
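The long flags.go:64 block above is the kubelet echoing every registered flag after parsing, defaults included, which is why even flags that were never passed (--cloud-provider, --containerd, and so on) appear with their default values. A toy reproduction of that pattern, shown here with the standard library flag package (the kubelet itself uses spf13/pflag, but the walk-all-flags idea is the same):

```go
// Toy reproduction of the "FLAG: --name=\"value\"" dump above: after
// parsing, visit every registered flag and log its effective value.
package main

import (
	"flag"
	"log"
)

func main() {
	// Two example flags with defaults taken from this log.
	flag.String("node-ip", "192.168.126.11", "node IP address")
	flag.Int("max-pods", 110, "maximum pods per node")
	flag.Parse()

	// VisitAll walks every registered flag, set or not, in lexical order.
	flag.VisitAll(func(f *flag.Flag) {
		log.Printf("FLAG: --%s=%q", f.Name, f.Value.String())
	})
}
```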
Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.952590 4922 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.952599 4922 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.952608 4922 feature_gate.go:330] unrecognized feature gate: Example Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.952617 4922 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.952625 4922 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.952635 4922 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.952643 4922 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.952651 4922 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.952661 4922 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.952669 4922 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.952677 4922 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.952687 4922 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.952697 4922 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.952706 4922 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.952713 4922 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.952721 4922 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.952732 4922 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.952742 4922 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.952750 4922 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.952759 4922 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.952766 4922 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.952775 4922 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.952783 4922 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.952791 4922 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.952798 4922 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.952806 4922 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.952814 4922 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.952822 4922 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.952829 4922 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.952838 4922 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.952870 4922 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.952878 4922 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.952886 4922 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.952896 4922 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.952904 4922 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.952912 4922 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.952919 4922 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.952927 4922 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.952935 4922 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.952942 4922 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.952950 4922 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.952957 4922 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.952965 4922 
feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.952973 4922 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.952981 4922 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.952989 4922 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.952996 4922 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.953005 4922 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.953012 4922 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.953020 4922 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.953028 4922 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.953036 4922 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.953043 4922 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.953051 4922 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.953090 4922 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.953101 4922 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.953108 4922 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.953116 4922 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.953124 4922 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.953132 4922 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.953139 4922 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.953165 4922 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.988829 4922 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.988904 4922 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.988992 4922 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 22 02:52:44 crc 
kubenswrapper[4922]: W1122 02:52:44.989004 4922 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989010 4922 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989016 4922 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989021 4922 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989026 4922 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989031 4922 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989036 4922 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989061 4922 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989066 4922 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989071 4922 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989075 4922 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989079 4922 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989084 4922 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989088 4922 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989093 4922 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989098 4922 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989102 4922 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989106 4922 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989111 4922 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989115 4922 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989119 4922 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989126 4922 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
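These feature_gate.go warnings come in three shapes: names the kubelet's own gate registry does not know (the OpenShift-side gates such as RouteAdvertisements or NewOLM) are logged as "unrecognized feature gate" and skipped, while explicitly setting a GA gate (feature_gate.go:353) or a deprecated gate (feature_gate.go:351) is still applied but warns that the gate will be removed. A self-contained sketch of that observable behaviour, explicitly not the real component-base implementation:

```go
// Self-contained reconstruction of the three warning shapes emitted by
// feature_gate.go above; this mimics the observable behaviour only.
package main

import "log"

type stability int

const (
	alpha stability = iota
	beta
	ga         // graduated: setting it explicitly warns
	deprecated // scheduled for removal: setting it warns
)

// A few gates from this log; the kubelet's registry does not contain the
// OpenShift-only names, which is why they show up as "unrecognized".
var known = map[string]stability{
	"CloudDualStackNodeIPs":                  ga,
	"DisableKubeletCloudCredentialProviders": ga,
	"ValidatingAdmissionPolicy":              ga,
	"KMSv1":                                  deprecated,
	"NodeSwap":                               beta,
}

func apply(requested map[string]bool) map[string]bool {
	effective := map[string]bool{}
	for name, val := range requested {
		st, ok := known[name]
		if !ok {
			log.Printf("unrecognized feature gate: %s", name)
			continue
		}
		switch st {
		case ga:
			log.Printf("Setting GA feature gate %s=%t. It will be removed in a future release.", name, val)
		case deprecated:
			log.Printf("Setting deprecated feature gate %s=%t. It will be removed in a future release.", name, val)
		}
		effective[name] = val
	}
	return effective
}

func main() {
	apply(map[string]bool{
		"CloudDualStackNodeIPs": true,  // GA -> warn, still applied
		"KMSv1":                 true,  // deprecated -> warn, still applied
		"RouteAdvertisements":   true,  // OpenShift gate -> unrecognized
		"NodeSwap":              false, // beta -> applied silently
	})
}
```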
Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989131 4922 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989136 4922 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989140 4922 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989145 4922 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989152 4922 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989156 4922 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989161 4922 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989167 4922 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989175 4922 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989181 4922 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989186 4922 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989192 4922 feature_gate.go:330] unrecognized feature gate: Example Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989196 4922 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989201 4922 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989206 4922 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989210 4922 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989214 4922 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989219 4922 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989224 4922 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989231 4922 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989238 4922 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989244 4922 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989249 4922 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989254 4922 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989260 4922 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989265 4922 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989272 4922 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989277 4922 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989282 4922 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989288 4922 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989294 4922 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989300 4922 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989306 4922 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989311 4922 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989316 4922 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989321 4922 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989326 4922 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989331 4922 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989336 4922 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989341 4922 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989346 4922 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989351 4922 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989359 4922 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989364 4922 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989369 4922 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 
02:52:44.989374 4922 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989380 4922 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989386 4922 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.989396 4922 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989580 4922 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989589 4922 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989595 4922 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989600 4922 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989605 4922 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989611 4922 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989616 4922 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989623 4922 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
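The feature_gate.go:386 "feature gates: {map[...]}" summaries print only the gates that were explicitly requested; every other gate stays at its compiled-in default, and lookups appear to fall back accordingly (in upstream component-base the query side is FeatureGate.Enabled). A toy sketch of that resolution order, under those assumptions:

```go
// Sketch of how the "feature gates: {map[...]}" line above is resolved:
// the printed map holds the explicit overrides; lookups fall back to
// compiled-in defaults. Toy code, not the component-base implementation.
package main

import "fmt"

// Illustrative compiled-in defaults for a few gates from this log.
var defaults = map[string]bool{
	"CloudDualStackNodeIPs": true,
	"KMSv1":                 false,
	"NodeSwap":              false,
	"EventedPLEG":           false,
}

// Explicit overrides, as listed at feature_gate.go:386 above.
var overrides = map[string]bool{
	"CloudDualStackNodeIPs": true,
	"KMSv1":                 true,
	"NodeSwap":              false,
}

func enabled(name string) bool {
	if v, ok := overrides[name]; ok {
		return v
	}
	return defaults[name] // zero value: unknown gates read as disabled
}

func main() {
	fmt.Println(enabled("KMSv1"))       // true: overridden
	fmt.Println(enabled("EventedPLEG")) // false: default applies
	fmt.Println(overrides)              // the explicitly-set gates, as in the log line
}
```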
Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989630 4922 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989636 4922 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989641 4922 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989647 4922 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989653 4922 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989658 4922 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989664 4922 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989669 4922 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989675 4922 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989680 4922 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989685 4922 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989690 4922 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989695 4922 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989703 4922 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989708 4922 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989714 4922 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989720 4922 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989726 4922 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989733 4922 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989738 4922 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989744 4922 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989749 4922 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989755 4922 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989761 4922 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989767 4922 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989772 4922 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989778 4922 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989784 4922 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989789 4922 feature_gate.go:330] unrecognized feature gate: PinnedImages
Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989795 4922 feature_gate.go:330] unrecognized feature gate: Example
Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989800 4922 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989805 4922 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989810 4922 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989815 4922 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989820 4922 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989825 4922 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989830 4922 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989835 4922 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989841 4922 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989863 4922 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989869 4922 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989873 4922 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989878 4922 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989883 4922 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989888 4922 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989893 4922 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989900 4922 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989906 4922 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989914 4922 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989920 4922 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989928 4922 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989934 4922 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989939 4922 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989944 4922 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989949 4922 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989954 4922 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989959 4922 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989964 4922 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989970 4922 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989977 4922 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989982 4922 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989988 4922 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Nov 22 02:52:44 crc kubenswrapper[4922]: W1122 02:52:44.989994 4922 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.990001 4922 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Nov 22 02:52:44 crc kubenswrapper[4922]: I1122 02:52:44.993381 4922 server.go:940] "Client rotation is on, will bootstrap in background"
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.007670 4922 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.007777 4922 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.010639 4922 server.go:997] "Starting client certificate rotation"
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.010671 4922 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.010928 4922 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-15 18:08:41.155981127 +0000 UTC
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.011030 4922 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 567h15m56.144954903s for next certificate rotation
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.050009 4922 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.053657 4922 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.103904 4922 log.go:25] "Validated CRI v1 runtime API"
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.170121 4922 log.go:25] "Validated CRI v1 image API"
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.175007 4922 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.181951 4922 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-11-22-02-47-34-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.182002 4922 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:41 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:42 fsType:tmpfs blockSize:0}]
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.209608 4922 manager.go:217] Machine: {Timestamp:2025-11-22 02:52:45.207718215 +0000 UTC m=+1.246240147 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:7e87d562-50ca-4b7a-8b5f-35220f4abd2d BootID:e949e6da-04e3-4086-acd2-53671701d8c1 Filesystems:[{Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:42 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:41 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:05:0b:0f Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:05:0b:0f Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:42:8d:08 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:eb:99:53 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:c1:58:f9 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:b1:77:76 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:e6:1c:b4 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:fe:4f:46:4a:5c:1a Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:36:eb:bb:22:94:34 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.209924 4922 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.210231 4922 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.210608 4922 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.210866 4922 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.210917 4922 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.211759 4922 topology_manager.go:138] "Creating topology manager with none policy"
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.211785 4922 container_manager_linux.go:303] "Creating device plugin manager"
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.212226 4922 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.212286 4922 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.213005 4922 state_mem.go:36] "Initialized new in-memory state store"
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.213121 4922 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.217272 4922 kubelet.go:418] "Attempting to sync node with API server"
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.217298 4922 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.217317 4922 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.217330 4922 kubelet.go:324] "Adding apiserver pod source"
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.217392 4922 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.223981 4922 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.224746 4922 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.226481 4922 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Nov 22 02:52:45 crc kubenswrapper[4922]: W1122 02:52:45.228285 4922 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused
Nov 22 02:52:45 crc kubenswrapper[4922]: E1122 02:52:45.228423 4922 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.176:6443: connect: connection refused" logger="UnhandledError"
Nov 22 02:52:45 crc kubenswrapper[4922]: W1122 02:52:45.228318 4922 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused
Nov 22 02:52:45 crc kubenswrapper[4922]: E1122 02:52:45.228525 4922 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.176:6443: connect: connection refused" logger="UnhandledError"
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.229712 4922 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.229746 4922 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.229756 4922 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.229766 4922 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.229780 4922 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.229789 4922 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.229796 4922 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.229808 4922 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.229832 4922 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.229862 4922 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.229877 4922 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.229884 4922 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.232412 4922 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.232878 4922 server.go:1280] "Started kubelet"
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.234710 4922 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused
Nov 22 02:52:45 crc systemd[1]: Started Kubernetes Kubelet.
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.235780 4922 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.235839 4922 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.236668 4922 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.237669 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.237742 4922 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.238028 4922 volume_manager.go:287] "The desired_state_of_world populator starts"
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.238064 4922 volume_manager.go:289] "Starting Kubelet Volume Manager"
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.237989 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 03:09:43.836540895 +0000 UTC
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.238269 4922 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1272h16m58.598287371s for next certificate rotation
Nov 22 02:52:45 crc kubenswrapper[4922]: E1122 02:52:45.238360 4922 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.238477 4922 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Nov 22 02:52:45 crc kubenswrapper[4922]: E1122 02:52:45.239159 4922 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.176:6443: connect: connection refused" interval="200ms"
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.239600 4922 factory.go:55] Registering systemd factory
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.240012 4922 factory.go:221] Registration of the systemd container factory successfully
Nov 22 02:52:45 crc kubenswrapper[4922]: W1122 02:52:45.239519 4922 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused
Nov 22 02:52:45 crc kubenswrapper[4922]: E1122 02:52:45.240146 4922 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.176:6443: connect: connection refused" logger="UnhandledError"
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.240370 4922 factory.go:153] Registering CRI-O factory
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.240411 4922 factory.go:221] Registration of the crio container factory successfully
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.240546 4922 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.240593 4922 factory.go:103] Registering Raw factory
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.240617 4922 manager.go:1196] Started watching for new ooms in manager
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.248008 4922 manager.go:319] Starting recovery of all containers
Nov 22 02:52:45 crc kubenswrapper[4922]: E1122 02:52:45.249181 4922 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.176:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187a348891a73322 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-22 02:52:45.232821026 +0000 UTC m=+1.271342918,LastTimestamp:2025-11-22 02:52:45.232821026 +0000 UTC m=+1.271342918,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.250979 4922 server.go:460] "Adding debug handlers to kubelet server"
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.264296 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.264389 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.264417 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.264443 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.264469 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.264493 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.264516 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.264544 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.264574 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.264596 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.264618 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.264644 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.264669 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.264737 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.264762 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.264785 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.264922 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.264960 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.267196 4922 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.267309 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.267439 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.267528 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.267560 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.267636 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.267666 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.267748 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.267816 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.267903 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.267937 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.267998 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.268023 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.268079 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.268107 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.268128 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.268188 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.268275 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.268302 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.268359 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.268382 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.268406 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.268471 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.268493 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.268551 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.268572 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.268591 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.268654 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.268675 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.268836 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.269013 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.269038 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.269099 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.269127 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.269221 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.269289 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.269315 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.269375 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.269401 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.269422 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.269483 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.269503 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.269601 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.269715 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.269834 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.269931 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.270012 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.270041 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.270123 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.270192 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.270230 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.270307 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.270373 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.270412 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.270483 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.270522 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.270600 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.270671 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.270708 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.270789 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.271006 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.271099 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.271144 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.271231 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.271303 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.271342 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.271416 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.271481 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.271519 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.271587 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.271622 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.271700 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.271730 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.271790 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.271824 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.271902 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.271941 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.271969 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.271999 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.272025 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.272052 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.272079 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.272174 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.272206 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.272233 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.272261 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.272288 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.272325 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.272355 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.272384 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.272415 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.272443 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.272470 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.272554 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.272597 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.272670 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.272698 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.272730 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.272759 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.272786 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.272813 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.272870 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.272936 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.272967 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.272994 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.273021 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.273048 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.273075 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.273101 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.273129 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.273160 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.273187 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.273274 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.273311 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.273340 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.273371 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.273399 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.273427 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.273453 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.273482 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.273556 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod=""
podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.273590 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.273620 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.273648 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.273673 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.273700 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.273727 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.273752 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.273780 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.273807 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.273830 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.273891 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" 
volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.273919 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.273944 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.273971 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.273996 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.274024 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.274051 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.274080 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.274107 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.274132 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.274158 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.274187 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.274212 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.274239 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.274264 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.274293 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.274321 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.274347 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.274372 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.274397 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.274423 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.274448 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.274471 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.274497 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.274522 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.274547 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.274572 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.274664 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.274704 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.274733 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.274759 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.274784 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.274812 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.274837 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" 
volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.274918 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.274944 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.274970 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.274996 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.275024 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.275048 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.275075 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.275098 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.275123 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.275149 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.275175 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" 
volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.275204 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.275228 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.275251 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.275276 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.275300 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.275326 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.275350 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.275375 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.275399 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.275423 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.275451 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" 
volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.275479 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.275503 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.275530 4922 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.275556 4922 reconstruct.go:97] "Volume reconstruction finished" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.275573 4922 reconciler.go:26] "Reconciler: start to sync state" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.276422 4922 manager.go:324] Recovery completed Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.291922 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.294218 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.294288 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.294308 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.296532 4922 cpu_manager.go:225] "Starting CPU manager" policy="none" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.296562 4922 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.296592 4922 state_mem.go:36] "Initialized new in-memory state store" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.296823 4922 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.299147 4922 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.299218 4922 status_manager.go:217] "Starting to sync pod status with apiserver" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.299254 4922 kubelet.go:2335] "Starting kubelet main sync loop" Nov 22 02:52:45 crc kubenswrapper[4922]: E1122 02:52:45.299336 4922 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Nov 22 02:52:45 crc kubenswrapper[4922]: W1122 02:52:45.299983 4922 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused Nov 22 02:52:45 crc kubenswrapper[4922]: E1122 02:52:45.300092 4922 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.176:6443: connect: connection refused" logger="UnhandledError" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.334052 4922 policy_none.go:49] "None policy: Start" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.335482 4922 memory_manager.go:170] "Starting memorymanager" policy="None" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.335622 4922 state_mem.go:35] "Initializing new in-memory state store" Nov 22 02:52:45 crc kubenswrapper[4922]: E1122 02:52:45.339512 4922 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Nov 22 02:52:45 crc kubenswrapper[4922]: E1122 02:52:45.399633 4922 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.421661 4922 manager.go:334] "Starting Device Plugin manager" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.421985 4922 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.422662 4922 server.go:79] "Starting device plugin registration server" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.423693 4922 eviction_manager.go:189] "Eviction manager: starting control loop" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.423729 4922 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.424335 4922 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.424445 4922 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.424469 4922 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Nov 22 02:52:45 crc kubenswrapper[4922]: E1122 02:52:45.434120 4922 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Nov 22 02:52:45 crc kubenswrapper[4922]: E1122 02:52:45.440508 4922 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.176:6443: connect: connection refused" interval="400ms" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.524645 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.526424 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.526477 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.526493 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.526530 4922 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 22 02:52:45 crc kubenswrapper[4922]: E1122 02:52:45.527278 4922 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.176:6443: connect: connection refused" node="crc" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.600895 4922 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc"] Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.601173 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.604590 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.604684 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.604770 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.605119 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.605773 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.605832 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.606998 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.607083 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.607111 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.607993 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.608058 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.608084 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.608297 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.608435 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.608498 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.610255 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.610278 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.610291 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.610407 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.610457 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.610480 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.610764 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.610954 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.611022 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.612491 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.612525 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.612543 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.612606 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.612635 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.612655 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.612903 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.613333 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.613410 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.614563 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.614614 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.614639 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.614741 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.614761 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.614774 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.615166 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.615238 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.616828 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.616946 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.616976 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.682923 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.682996 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.683042 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.683081 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.683117 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.683154 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.683248 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.683300 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.683355 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.683389 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.683474 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.683530 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.683575 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.683625 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.683657 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.728382 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.730788 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.730948 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 
02:52:45.730979 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.731046 4922 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 22 02:52:45 crc kubenswrapper[4922]: E1122 02:52:45.732137 4922 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.176:6443: connect: connection refused" node="crc" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.785517 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.785583 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.785611 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.785631 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.785651 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.785706 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.785718 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.785765 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 02:52:45 crc 
kubenswrapper[4922]: I1122 02:52:45.785813 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.785833 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.785834 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.785794 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.785928 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.785887 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.785990 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.786012 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.786057 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.786081 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" 
(UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.786084 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.785878 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.786129 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.786157 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.786170 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.786162 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.786237 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.786197 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.786266 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.786231 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.786306 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.786405 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 02:52:45 crc kubenswrapper[4922]: E1122 02:52:45.842013 4922 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.176:6443: connect: connection refused" interval="800ms" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.952382 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.968716 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 22 02:52:45 crc kubenswrapper[4922]: I1122 02:52:45.992837 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Nov 22 02:52:46 crc kubenswrapper[4922]: I1122 02:52:46.005390 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 02:52:46 crc kubenswrapper[4922]: I1122 02:52:46.008207 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 02:52:46 crc kubenswrapper[4922]: W1122 02:52:46.031308 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-9e0f2834c3d75ca0a514d95ce0251e68863cb01e101985d0d315802b42fbb893 WatchSource:0}: Error finding container 9e0f2834c3d75ca0a514d95ce0251e68863cb01e101985d0d315802b42fbb893: Status 404 returned error can't find the container with id 9e0f2834c3d75ca0a514d95ce0251e68863cb01e101985d0d315802b42fbb893 Nov 22 02:52:46 crc kubenswrapper[4922]: W1122 02:52:46.032096 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-f7b32153f55a3dc38abde4b4d2c473e44b022d6164d879c7e845f8b0414e99a3 WatchSource:0}: Error finding container f7b32153f55a3dc38abde4b4d2c473e44b022d6164d879c7e845f8b0414e99a3: Status 404 returned error can't find the container with id f7b32153f55a3dc38abde4b4d2c473e44b022d6164d879c7e845f8b0414e99a3 Nov 22 02:52:46 crc kubenswrapper[4922]: W1122 02:52:46.042402 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-757b86706916a387d4887539c3007f6fd55a734ef705f71be248af4610784497 WatchSource:0}: Error finding container 757b86706916a387d4887539c3007f6fd55a734ef705f71be248af4610784497: Status 404 returned error can't find the container with id 757b86706916a387d4887539c3007f6fd55a734ef705f71be248af4610784497 Nov 22 02:52:46 crc kubenswrapper[4922]: W1122 02:52:46.047747 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-a08059e32bb18700c5c6d1eaeca3120d6030d99e19d2fa9a273123d903415f1e WatchSource:0}: Error finding container a08059e32bb18700c5c6d1eaeca3120d6030d99e19d2fa9a273123d903415f1e: Status 404 returned error can't find the container with id a08059e32bb18700c5c6d1eaeca3120d6030d99e19d2fa9a273123d903415f1e Nov 22 02:52:46 crc kubenswrapper[4922]: W1122 02:52:46.051397 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-8ce9ba1252597a804d88b7fadc9e097f38f5a59c76964a691ae7a0be2c6c3c9e WatchSource:0}: Error finding container 8ce9ba1252597a804d88b7fadc9e097f38f5a59c76964a691ae7a0be2c6c3c9e: Status 404 returned error can't find the container with id 8ce9ba1252597a804d88b7fadc9e097f38f5a59c76964a691ae7a0be2c6c3c9e Nov 22 02:52:46 crc kubenswrapper[4922]: I1122 02:52:46.132670 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:52:46 crc kubenswrapper[4922]: I1122 02:52:46.135115 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:52:46 crc kubenswrapper[4922]: I1122 02:52:46.135180 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:52:46 crc kubenswrapper[4922]: I1122 02:52:46.135202 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:52:46 crc kubenswrapper[4922]: I1122 02:52:46.135240 4922 kubelet_node_status.go:76] 
"Attempting to register node" node="crc" Nov 22 02:52:46 crc kubenswrapper[4922]: E1122 02:52:46.136031 4922 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.176:6443: connect: connection refused" node="crc" Nov 22 02:52:46 crc kubenswrapper[4922]: W1122 02:52:46.144787 4922 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused Nov 22 02:52:46 crc kubenswrapper[4922]: E1122 02:52:46.144921 4922 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.176:6443: connect: connection refused" logger="UnhandledError" Nov 22 02:52:46 crc kubenswrapper[4922]: W1122 02:52:46.149157 4922 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused Nov 22 02:52:46 crc kubenswrapper[4922]: E1122 02:52:46.149331 4922 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.176:6443: connect: connection refused" logger="UnhandledError" Nov 22 02:52:46 crc kubenswrapper[4922]: I1122 02:52:46.236666 4922 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused Nov 22 02:52:46 crc kubenswrapper[4922]: W1122 02:52:46.292031 4922 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused Nov 22 02:52:46 crc kubenswrapper[4922]: E1122 02:52:46.292138 4922 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.176:6443: connect: connection refused" logger="UnhandledError" Nov 22 02:52:46 crc kubenswrapper[4922]: I1122 02:52:46.303736 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"9e0f2834c3d75ca0a514d95ce0251e68863cb01e101985d0d315802b42fbb893"} Nov 22 02:52:46 crc kubenswrapper[4922]: I1122 02:52:46.305119 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"f7b32153f55a3dc38abde4b4d2c473e44b022d6164d879c7e845f8b0414e99a3"} Nov 22 02:52:46 crc kubenswrapper[4922]: I1122 02:52:46.306261 4922 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8ce9ba1252597a804d88b7fadc9e097f38f5a59c76964a691ae7a0be2c6c3c9e"} Nov 22 02:52:46 crc kubenswrapper[4922]: I1122 02:52:46.307248 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a08059e32bb18700c5c6d1eaeca3120d6030d99e19d2fa9a273123d903415f1e"} Nov 22 02:52:46 crc kubenswrapper[4922]: I1122 02:52:46.308119 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"757b86706916a387d4887539c3007f6fd55a734ef705f71be248af4610784497"} Nov 22 02:52:46 crc kubenswrapper[4922]: E1122 02:52:46.642892 4922 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.176:6443: connect: connection refused" interval="1.6s" Nov 22 02:52:46 crc kubenswrapper[4922]: W1122 02:52:46.807277 4922 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused Nov 22 02:52:46 crc kubenswrapper[4922]: E1122 02:52:46.807419 4922 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.176:6443: connect: connection refused" logger="UnhandledError" Nov 22 02:52:46 crc kubenswrapper[4922]: I1122 02:52:46.936662 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:52:46 crc kubenswrapper[4922]: I1122 02:52:46.938213 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:52:46 crc kubenswrapper[4922]: I1122 02:52:46.938258 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:52:46 crc kubenswrapper[4922]: I1122 02:52:46.938271 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:52:46 crc kubenswrapper[4922]: I1122 02:52:46.938302 4922 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 22 02:52:46 crc kubenswrapper[4922]: E1122 02:52:46.938909 4922 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.176:6443: connect: connection refused" node="crc" Nov 22 02:52:47 crc kubenswrapper[4922]: I1122 02:52:47.236305 4922 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused Nov 22 02:52:48 crc kubenswrapper[4922]: W1122 02:52:48.095028 4922 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get 
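Each kubenswrapper record carries a klog header: a severity letter fused with the date (I1122 = Info, Nov 22), the time with microseconds, the process ID (4922), and the source file:line that emitted it, followed by the message. A rough parsing sketch (illustrative only; the regular expression is an assumption, not part of the kubelet):

    // klogparse.go - split a klog header into its fields.
    package main

    import (
        "fmt"
        "regexp"
    )

    // <severity><MMDD> <HH:MM:SS.ffffff> <pid> <file:line>] <message>
    var klogHeader = regexp.MustCompile(`^([IWEF])(\d{4}) (\d{2}:\d{2}:\d{2}\.\d+)\s+(\d+)\s+([\w./-]+:\d+)\] (.*)$`)

    func main() {
        line := `I1122 02:52:46.938302 4922 kubelet_node_status.go:76] "Attempting to register node" node="crc"`
        if m := klogHeader.FindStringSubmatch(line); m != nil {
            fmt.Printf("severity=%s date=%s time=%s pid=%s source=%s msg=%s\n",
                m[1], m[2], m[3], m[4], m[5], m[6])
        }
    }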
"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused Nov 22 02:52:48 crc kubenswrapper[4922]: E1122 02:52:48.095135 4922 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.176:6443: connect: connection refused" logger="UnhandledError" Nov 22 02:52:48 crc kubenswrapper[4922]: I1122 02:52:48.235953 4922 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused Nov 22 02:52:48 crc kubenswrapper[4922]: E1122 02:52:48.244075 4922 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.176:6443: connect: connection refused" interval="3.2s" Nov 22 02:52:48 crc kubenswrapper[4922]: W1122 02:52:48.266778 4922 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused Nov 22 02:52:48 crc kubenswrapper[4922]: E1122 02:52:48.266902 4922 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.176:6443: connect: connection refused" logger="UnhandledError" Nov 22 02:52:48 crc kubenswrapper[4922]: I1122 02:52:48.317607 4922 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf" exitCode=0 Nov 22 02:52:48 crc kubenswrapper[4922]: I1122 02:52:48.317825 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:52:48 crc kubenswrapper[4922]: I1122 02:52:48.317817 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf"} Nov 22 02:52:48 crc kubenswrapper[4922]: I1122 02:52:48.319346 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:52:48 crc kubenswrapper[4922]: I1122 02:52:48.319396 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:52:48 crc kubenswrapper[4922]: I1122 02:52:48.319410 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:52:48 crc kubenswrapper[4922]: I1122 02:52:48.322303 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6b7d163b34cfb34723c07cfa4592a134291185548d4294cb6fcd1d5d8ab63617"} Nov 22 02:52:48 crc 
kubenswrapper[4922]: I1122 02:52:48.322371 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"eaf643c85775379271a26c718e36d87319342e6797522f91726ce4084925f311"} Nov 22 02:52:48 crc kubenswrapper[4922]: I1122 02:52:48.322400 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"205de8f1aeca28c7877b83da70915cd216631f5d7c28b0c9f3bb748a8f98547e"} Nov 22 02:52:48 crc kubenswrapper[4922]: I1122 02:52:48.322640 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:52:48 crc kubenswrapper[4922]: I1122 02:52:48.324125 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:52:48 crc kubenswrapper[4922]: I1122 02:52:48.324196 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:52:48 crc kubenswrapper[4922]: I1122 02:52:48.324219 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:52:48 crc kubenswrapper[4922]: I1122 02:52:48.324920 4922 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134" exitCode=0 Nov 22 02:52:48 crc kubenswrapper[4922]: I1122 02:52:48.324972 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134"} Nov 22 02:52:48 crc kubenswrapper[4922]: I1122 02:52:48.325094 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:52:48 crc kubenswrapper[4922]: I1122 02:52:48.326668 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:52:48 crc kubenswrapper[4922]: I1122 02:52:48.326727 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:52:48 crc kubenswrapper[4922]: I1122 02:52:48.326757 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:52:48 crc kubenswrapper[4922]: I1122 02:52:48.327897 4922 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="cd5ceb43d7f9e20cb867d08cabc1d1d37744308ef18248947c728071eb2bc47e" exitCode=0 Nov 22 02:52:48 crc kubenswrapper[4922]: I1122 02:52:48.327974 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"cd5ceb43d7f9e20cb867d08cabc1d1d37744308ef18248947c728071eb2bc47e"} Nov 22 02:52:48 crc kubenswrapper[4922]: I1122 02:52:48.328024 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:52:48 crc kubenswrapper[4922]: I1122 02:52:48.329288 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:52:48 crc kubenswrapper[4922]: I1122 02:52:48.329329 4922 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:52:48 crc kubenswrapper[4922]: I1122 02:52:48.329342 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:52:48 crc kubenswrapper[4922]: I1122 02:52:48.334865 4922 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="a43e447ea1ef6069640894c9f4a841797e8ea7a34d6df8d76f9958755e54d5b9" exitCode=0 Nov 22 02:52:48 crc kubenswrapper[4922]: I1122 02:52:48.334911 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"a43e447ea1ef6069640894c9f4a841797e8ea7a34d6df8d76f9958755e54d5b9"} Nov 22 02:52:48 crc kubenswrapper[4922]: I1122 02:52:48.335095 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:52:48 crc kubenswrapper[4922]: I1122 02:52:48.336691 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:52:48 crc kubenswrapper[4922]: I1122 02:52:48.336742 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:52:48 crc kubenswrapper[4922]: I1122 02:52:48.336761 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:52:48 crc kubenswrapper[4922]: I1122 02:52:48.539759 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:52:48 crc kubenswrapper[4922]: I1122 02:52:48.541008 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:52:48 crc kubenswrapper[4922]: I1122 02:52:48.541040 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:52:48 crc kubenswrapper[4922]: I1122 02:52:48.541049 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:52:48 crc kubenswrapper[4922]: I1122 02:52:48.541075 4922 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 22 02:52:48 crc kubenswrapper[4922]: E1122 02:52:48.541459 4922 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.176:6443: connect: connection refused" node="crc" Nov 22 02:52:48 crc kubenswrapper[4922]: W1122 02:52:48.909726 4922 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused Nov 22 02:52:48 crc kubenswrapper[4922]: E1122 02:52:48.909868 4922 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.176:6443: connect: connection refused" logger="UnhandledError" Nov 22 02:52:48 crc kubenswrapper[4922]: W1122 02:52:48.994639 4922 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get 
"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused Nov 22 02:52:48 crc kubenswrapper[4922]: E1122 02:52:48.994749 4922 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.176:6443: connect: connection refused" logger="UnhandledError" Nov 22 02:52:49 crc kubenswrapper[4922]: I1122 02:52:49.235973 4922 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused Nov 22 02:52:49 crc kubenswrapper[4922]: I1122 02:52:49.340802 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"f48e23eebd528cf7cc530b29a032d3bec8eebaa3a5fcaff1296124b9d23bc96a"} Nov 22 02:52:49 crc kubenswrapper[4922]: I1122 02:52:49.341082 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:52:49 crc kubenswrapper[4922]: I1122 02:52:49.342414 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:52:49 crc kubenswrapper[4922]: I1122 02:52:49.342448 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:52:49 crc kubenswrapper[4922]: I1122 02:52:49.342461 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:52:49 crc kubenswrapper[4922]: I1122 02:52:49.346176 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"aa47eebd3d997c52252492bb21d0357a5bcda89c3a47713b4d55c5a4e2117ba5"} Nov 22 02:52:49 crc kubenswrapper[4922]: I1122 02:52:49.346301 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"88b7c6dc11edfeb5cd97d044eba3210a471efebefaf19e244347e856860544e3"} Nov 22 02:52:49 crc kubenswrapper[4922]: I1122 02:52:49.346384 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"9273d5ec12281398617de471c700390678e37a0f25a8f419af589821bbbf82cb"} Nov 22 02:52:49 crc kubenswrapper[4922]: I1122 02:52:49.348822 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a1b62ea65c78bb4f19f5fdb2ebed6da4b5554981906058d4d33472e53eecebe7"} Nov 22 02:52:49 crc kubenswrapper[4922]: I1122 02:52:49.348886 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8aa36ff8bd81008d118d746d10130faac5df3ffdd243309f12b2444acf8e0ac8"} Nov 22 02:52:49 crc kubenswrapper[4922]: I1122 02:52:49.348907 
Nov 22 02:52:49 crc kubenswrapper[4922]: I1122 02:52:49.348907 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1364d5e8967ee57d3fe7c3ca85053dbeee65956324a6ff45f53339ee6a380e59"}
Nov 22 02:52:49 crc kubenswrapper[4922]: I1122 02:52:49.351753 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9608708be558bf83d1156f620534bd4c98b8f4fb7f96c972be8dc67524f4546c"}
Nov 22 02:52:49 crc kubenswrapper[4922]: I1122 02:52:49.351893 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 02:52:49 crc kubenswrapper[4922]: I1122 02:52:49.353446 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:52:49 crc kubenswrapper[4922]: I1122 02:52:49.353546 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:52:49 crc kubenswrapper[4922]: I1122 02:52:49.353610 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:52:49 crc kubenswrapper[4922]: I1122 02:52:49.353744 4922 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd" exitCode=0
Nov 22 02:52:49 crc kubenswrapper[4922]: I1122 02:52:49.353814 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd"}
Nov 22 02:52:49 crc kubenswrapper[4922]: I1122 02:52:49.353891 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 02:52:49 crc kubenswrapper[4922]: I1122 02:52:49.354622 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:52:49 crc kubenswrapper[4922]: I1122 02:52:49.354698 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:52:49 crc kubenswrapper[4922]: I1122 02:52:49.354713 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:52:50 crc kubenswrapper[4922]: I1122 02:52:50.236527 4922 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused
Nov 22 02:52:50 crc kubenswrapper[4922]: I1122 02:52:50.363027 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a1ba37f0d76415fadbf5fe5c3e7b0d0ac1e04923deb445e3ffebaa45f9a3284d"}
Nov 22 02:52:50 crc kubenswrapper[4922]: I1122 02:52:50.363136 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"892025699a91526b7be50d611c0c47a78d080b10f5001683ba5b6614e1168759"}
Nov 22 02:52:50 crc kubenswrapper[4922]: I1122 02:52:50.363082 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 02:52:50 crc kubenswrapper[4922]: I1122 02:52:50.364822 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:52:50 crc kubenswrapper[4922]: I1122 02:52:50.364899 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:52:50 crc kubenswrapper[4922]: I1122 02:52:50.364913 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:52:50 crc kubenswrapper[4922]: I1122 02:52:50.367614 4922 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82" exitCode=0
Nov 22 02:52:50 crc kubenswrapper[4922]: I1122 02:52:50.367717 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82"}
Nov 22 02:52:50 crc kubenswrapper[4922]: I1122 02:52:50.367755 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 02:52:50 crc kubenswrapper[4922]: I1122 02:52:50.367788 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 02:52:50 crc kubenswrapper[4922]: I1122 02:52:50.367807 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 02:52:50 crc kubenswrapper[4922]: I1122 02:52:50.367981 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 02:52:50 crc kubenswrapper[4922]: I1122 02:52:50.369744 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:52:50 crc kubenswrapper[4922]: I1122 02:52:50.369784 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:52:50 crc kubenswrapper[4922]: I1122 02:52:50.369797 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:52:50 crc kubenswrapper[4922]: I1122 02:52:50.369869 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:52:50 crc kubenswrapper[4922]: I1122 02:52:50.369913 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:52:50 crc kubenswrapper[4922]: I1122 02:52:50.369931 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:52:50 crc kubenswrapper[4922]: I1122 02:52:50.370004 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:52:50 crc kubenswrapper[4922]: I1122 02:52:50.370040 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:52:50 crc kubenswrapper[4922]: I1122 02:52:50.370063 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:52:50 crc kubenswrapper[4922]: I1122 02:52:50.371140 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:52:50 crc kubenswrapper[4922]: I1122 02:52:50.371168 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:52:50 crc kubenswrapper[4922]: I1122 02:52:50.371178 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:52:50 crc kubenswrapper[4922]: I1122 02:52:50.786212 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 22 02:52:50 crc kubenswrapper[4922]: I1122 02:52:50.830825 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 22 02:52:51 crc kubenswrapper[4922]: I1122 02:52:51.054720 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Nov 22 02:52:51 crc kubenswrapper[4922]: I1122 02:52:51.237323 4922 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused
Nov 22 02:52:51 crc kubenswrapper[4922]: I1122 02:52:51.373478 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Nov 22 02:52:51 crc kubenswrapper[4922]: I1122 02:52:51.378218 4922 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a1ba37f0d76415fadbf5fe5c3e7b0d0ac1e04923deb445e3ffebaa45f9a3284d" exitCode=255
Nov 22 02:52:51 crc kubenswrapper[4922]: I1122 02:52:51.378279 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a1ba37f0d76415fadbf5fe5c3e7b0d0ac1e04923deb445e3ffebaa45f9a3284d"}
Nov 22 02:52:51 crc kubenswrapper[4922]: I1122 02:52:51.378491 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 02:52:51 crc kubenswrapper[4922]: I1122 02:52:51.379923 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:52:51 crc kubenswrapper[4922]: I1122 02:52:51.379983 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:52:51 crc kubenswrapper[4922]: I1122 02:52:51.380005 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:52:51 crc kubenswrapper[4922]: I1122 02:52:51.381032 4922 scope.go:117] "RemoveContainer" containerID="a1ba37f0d76415fadbf5fe5c3e7b0d0ac1e04923deb445e3ffebaa45f9a3284d"
Nov 22 02:52:51 crc kubenswrapper[4922]: I1122 02:52:51.381503 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"095da4ba1e99f9564ae3e7c95e0319b9b03a655b824f73a40f378292bd16a613"}
Nov 22 02:52:51 crc kubenswrapper[4922]: I1122 02:52:51.381556 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3f7d8908fd03ef5111feb365e6882679f026d46d0b404e585b0fb248b3ffccc3"}
Nov 22 02:52:51 crc kubenswrapper[4922]: I1122 02:52:51.381656 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 02:52:51 crc kubenswrapper[4922]: I1122 02:52:51.381746 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 02:52:51 crc kubenswrapper[4922]: I1122 02:52:51.383109 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:52:51 crc kubenswrapper[4922]: I1122 02:52:51.383146 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:52:51 crc kubenswrapper[4922]: I1122 02:52:51.383158 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:52:51 crc kubenswrapper[4922]: I1122 02:52:51.384010 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:52:51 crc kubenswrapper[4922]: I1122 02:52:51.384043 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:52:51 crc kubenswrapper[4922]: I1122 02:52:51.384054 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:52:51 crc kubenswrapper[4922]: E1122 02:52:51.445997 4922 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.176:6443: connect: connection refused" interval="6.4s"
Nov 22 02:52:51 crc kubenswrapper[4922]: I1122 02:52:51.741875 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 02:52:51 crc kubenswrapper[4922]: I1122 02:52:51.743544 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:52:51 crc kubenswrapper[4922]: I1122 02:52:51.743587 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:52:51 crc kubenswrapper[4922]: I1122 02:52:51.743612 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:52:51 crc kubenswrapper[4922]: I1122 02:52:51.743658 4922 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Nov 22 02:52:51 crc kubenswrapper[4922]: E1122 02:52:51.744268 4922 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.176:6443: connect: connection refused" node="crc"
Nov 22 02:52:51 crc kubenswrapper[4922]: I1122 02:52:51.944393 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 22 02:52:52 crc kubenswrapper[4922]: I1122 02:52:52.390151 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Nov 22 02:52:52 crc kubenswrapper[4922]: I1122 02:52:52.395516 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"47fe422173d9c9619e9ccbf14fa6b45cd384fb3b32973aa7c32b7597d23818ed"}
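Note the lease controller's retry interval doubling with each consecutive failure: interval="800ms", then "1.6s", "3.2s", and now "6.4s". A small sketch of that doubling pattern (illustrative only; the kubelet's real backoff lives in its node-lease controller):

    // backoff.go - the doubling retry intervals visible in the lease errors.
    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        interval := 800 * time.Millisecond
        for attempt := 1; attempt <= 4; attempt++ {
            fmt.Printf("attempt %d: retrying in %s\n", attempt, interval)
            interval *= 2 // 800ms -> 1.6s -> 3.2s -> 6.4s, as in the log
        }
    }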
controller attach/detach" Nov 22 02:52:52 crc kubenswrapper[4922]: I1122 02:52:52.395932 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 02:52:52 crc kubenswrapper[4922]: I1122 02:52:52.397203 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:52:52 crc kubenswrapper[4922]: I1122 02:52:52.397277 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:52:52 crc kubenswrapper[4922]: I1122 02:52:52.397297 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:52:52 crc kubenswrapper[4922]: I1122 02:52:52.402995 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"5806c02c9ab10a6ae71658c5c26a959a0c1af4dc116eebb10456e842e18bc265"} Nov 22 02:52:52 crc kubenswrapper[4922]: I1122 02:52:52.403099 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c36d97db0f554e895c9a66ab9d98a846dab9ea7fdc6fb1694a1e6ba6e8e03caa"} Nov 22 02:52:52 crc kubenswrapper[4922]: I1122 02:52:52.403041 4922 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 22 02:52:52 crc kubenswrapper[4922]: I1122 02:52:52.403225 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:52:52 crc kubenswrapper[4922]: I1122 02:52:52.405098 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:52:52 crc kubenswrapper[4922]: I1122 02:52:52.405149 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:52:52 crc kubenswrapper[4922]: I1122 02:52:52.405168 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:52:52 crc kubenswrapper[4922]: I1122 02:52:52.560931 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 02:52:53 crc kubenswrapper[4922]: I1122 02:52:53.413171 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7e578ec32c48afdd4b1beffcba79284416d5d388027f807e195d552777eb3ff0"} Nov 22 02:52:53 crc kubenswrapper[4922]: I1122 02:52:53.413260 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 02:52:53 crc kubenswrapper[4922]: I1122 02:52:53.413262 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:52:53 crc kubenswrapper[4922]: I1122 02:52:53.413194 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:52:53 crc kubenswrapper[4922]: I1122 02:52:53.414916 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:52:53 crc kubenswrapper[4922]: I1122 02:52:53.414963 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:52:53 crc kubenswrapper[4922]: I1122 02:52:53.414980 4922 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:52:53 crc kubenswrapper[4922]: I1122 02:52:53.415164 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:52:53 crc kubenswrapper[4922]: I1122 02:52:53.415201 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:52:53 crc kubenswrapper[4922]: I1122 02:52:53.415217 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:52:54 crc kubenswrapper[4922]: I1122 02:52:54.107465 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 02:52:54 crc kubenswrapper[4922]: I1122 02:52:54.107718 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:52:54 crc kubenswrapper[4922]: I1122 02:52:54.109473 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:52:54 crc kubenswrapper[4922]: I1122 02:52:54.109526 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:52:54 crc kubenswrapper[4922]: I1122 02:52:54.109543 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:52:54 crc kubenswrapper[4922]: I1122 02:52:54.416385 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:52:54 crc kubenswrapper[4922]: I1122 02:52:54.416385 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:52:54 crc kubenswrapper[4922]: I1122 02:52:54.418204 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:52:54 crc kubenswrapper[4922]: I1122 02:52:54.418301 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:52:54 crc kubenswrapper[4922]: I1122 02:52:54.418331 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:52:54 crc kubenswrapper[4922]: I1122 02:52:54.418552 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:52:54 crc kubenswrapper[4922]: I1122 02:52:54.418623 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:52:54 crc kubenswrapper[4922]: I1122 02:52:54.418647 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:52:55 crc kubenswrapper[4922]: E1122 02:52:55.434360 4922 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Nov 22 02:52:55 crc kubenswrapper[4922]: I1122 02:52:55.773158 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 02:52:55 crc kubenswrapper[4922]: I1122 02:52:55.773514 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:52:55 crc kubenswrapper[4922]: I1122 02:52:55.775609 4922 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:52:55 crc kubenswrapper[4922]: I1122 02:52:55.775676 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:52:55 crc kubenswrapper[4922]: I1122 02:52:55.775696 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:52:56 crc kubenswrapper[4922]: I1122 02:52:56.056094 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 02:52:56 crc kubenswrapper[4922]: I1122 02:52:56.300527 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Nov 22 02:52:56 crc kubenswrapper[4922]: I1122 02:52:56.300773 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:52:56 crc kubenswrapper[4922]: I1122 02:52:56.302227 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:52:56 crc kubenswrapper[4922]: I1122 02:52:56.302307 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:52:56 crc kubenswrapper[4922]: I1122 02:52:56.302449 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:52:56 crc kubenswrapper[4922]: I1122 02:52:56.421146 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:52:56 crc kubenswrapper[4922]: I1122 02:52:56.422294 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:52:56 crc kubenswrapper[4922]: I1122 02:52:56.422355 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:52:56 crc kubenswrapper[4922]: I1122 02:52:56.422377 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:52:58 crc kubenswrapper[4922]: I1122 02:52:58.144950 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:52:58 crc kubenswrapper[4922]: I1122 02:52:58.146832 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:52:58 crc kubenswrapper[4922]: I1122 02:52:58.146999 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:52:58 crc kubenswrapper[4922]: I1122 02:52:58.147022 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:52:58 crc kubenswrapper[4922]: I1122 02:52:58.147071 4922 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 22 02:52:58 crc kubenswrapper[4922]: I1122 02:52:58.774111 4922 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Nov 22 02:52:58 crc kubenswrapper[4922]: I1122 02:52:58.774645 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" 
containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 22 02:52:58 crc kubenswrapper[4922]: I1122 02:52:58.926973 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Nov 22 02:52:58 crc kubenswrapper[4922]: I1122 02:52:58.927267 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:52:58 crc kubenswrapper[4922]: I1122 02:52:58.929707 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:52:58 crc kubenswrapper[4922]: I1122 02:52:58.929769 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:52:58 crc kubenswrapper[4922]: I1122 02:52:58.929780 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:01 crc kubenswrapper[4922]: I1122 02:53:01.944608 4922 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="Get \"https://192.168.126.11:6443/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Nov 22 02:53:01 crc kubenswrapper[4922]: I1122 02:53:01.945646 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Nov 22 02:53:02 crc kubenswrapper[4922]: I1122 02:53:02.080223 4922 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Nov 22 02:53:02 crc kubenswrapper[4922]: I1122 02:53:02.080535 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Nov 22 02:53:04 crc kubenswrapper[4922]: I1122 02:53:04.114599 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 02:53:04 crc kubenswrapper[4922]: I1122 02:53:04.114941 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:53:04 crc kubenswrapper[4922]: I1122 02:53:04.116532 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:04 crc kubenswrapper[4922]: I1122 02:53:04.116567 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:04 crc kubenswrapper[4922]: I1122 02:53:04.116579 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:05 crc kubenswrapper[4922]: 
E1122 02:53:05.434651 4922 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Nov 22 02:53:06 crc kubenswrapper[4922]: I1122 02:53:06.950141 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 02:53:06 crc kubenswrapper[4922]: I1122 02:53:06.950480 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:53:06 crc kubenswrapper[4922]: I1122 02:53:06.951067 4922 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Nov 22 02:53:06 crc kubenswrapper[4922]: I1122 02:53:06.951198 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Nov 22 02:53:06 crc kubenswrapper[4922]: I1122 02:53:06.952784 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:06 crc kubenswrapper[4922]: I1122 02:53:06.952880 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:06 crc kubenswrapper[4922]: I1122 02:53:06.952905 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:06 crc kubenswrapper[4922]: I1122 02:53:06.957650 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.076491 4922 trace.go:236] Trace[1011894242]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (22-Nov-2025 02:52:52.879) (total time: 14196ms): Nov 22 02:53:07 crc kubenswrapper[4922]: Trace[1011894242]: ---"Objects listed" error: 14196ms (02:53:07.076) Nov 22 02:53:07 crc kubenswrapper[4922]: Trace[1011894242]: [14.196474829s] [14.196474829s] END Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.076542 4922 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Nov 22 02:53:07 crc kubenswrapper[4922]: E1122 02:53:07.077821 4922 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.078781 4922 trace.go:236] Trace[396061267]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (22-Nov-2025 02:52:53.450) (total time: 13628ms): Nov 22 02:53:07 crc kubenswrapper[4922]: Trace[396061267]: ---"Objects listed" error: 13628ms (02:53:07.078) Nov 22 02:53:07 crc kubenswrapper[4922]: Trace[396061267]: [13.628226959s] [13.628226959s] END Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.078828 4922 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.078961 4922 trace.go:236] Trace[1327950161]: "Reflector ListAndWatch" 
name:k8s.io/client-go/informers/factory.go:160 (22-Nov-2025 02:52:53.257) (total time: 13821ms): Nov 22 02:53:07 crc kubenswrapper[4922]: Trace[1327950161]: ---"Objects listed" error: 13821ms (02:53:07.078) Nov 22 02:53:07 crc kubenswrapper[4922]: Trace[1327950161]: [13.82151883s] [13.82151883s] END Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.078984 4922 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.079196 4922 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.084811 4922 trace.go:236] Trace[1390702884]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (22-Nov-2025 02:52:53.366) (total time: 13718ms): Nov 22 02:53:07 crc kubenswrapper[4922]: Trace[1390702884]: ---"Objects listed" error: 13718ms (02:53:07.084) Nov 22 02:53:07 crc kubenswrapper[4922]: Trace[1390702884]: [13.718630868s] [13.718630868s] END Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.084886 4922 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.231369 4922 apiserver.go:52] "Watching apiserver" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.234369 4922 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.234699 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb"] Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.235123 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.235192 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.235224 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.235201 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 22 02:53:07 crc kubenswrapper[4922]: E1122 02:53:07.235300 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.235330 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 22 02:53:07 crc kubenswrapper[4922]: E1122 02:53:07.235364 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.235282 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:53:07 crc kubenswrapper[4922]: E1122 02:53:07.235893 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.237023 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.238538 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.238552 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.238604 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.238545 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.239034 4922 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.241792 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.244218 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.244495 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.244953 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.277737 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.281124 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.281198 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.281226 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.281253 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.281277 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.281299 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.281320 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.281345 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.281374 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.281396 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.281418 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.281462 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.281486 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.281511 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.281534 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.281554 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.281576 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.281722 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.281721 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.281823 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.281863 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.281856 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.281952 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.281973 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.282017 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.282065 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.282068 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.282113 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.282135 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.282156 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.282178 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.282201 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.282223 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.282244 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.282245 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.282267 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.282286 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.282288 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.282312 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.282335 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.282358 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.282350 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.282379 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.282456 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.282465 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.282479 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.282481 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.282592 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.282650 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.282661 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.282673 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.282676 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.282694 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.282681 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.282714 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.282733 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.282750 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.282768 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.282784 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" 
(UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.282801 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.282823 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.282832 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.283034 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.283095 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.283154 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.283259 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.283280 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.283320 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.283375 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.283406 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.283560 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.283605 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.283620 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.282817 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.283668 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.283686 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.284086 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.284104 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.284120 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.284140 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.284160 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.283739 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.283759 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.283873 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.284024 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.284265 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.284302 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.284324 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.284178 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.284466 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.284484 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.284503 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.284574 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.284617 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.284618 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.284640 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.284661 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.284678 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.284694 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.284722 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.284740 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.284757 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.284773 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.284788 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.284803 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: 
\"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.284819 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.284821 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.284898 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.284924 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.284947 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.284964 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.284980 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.284998 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.285014 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.285030 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" 
(UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.285032 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.285046 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.285101 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.285130 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.285158 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.285184 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.285209 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.285232 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.285256 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 22 02:53:07 crc 
kubenswrapper[4922]: I1122 02:53:07.285279 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.285303 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.285326 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.285347 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.285368 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.285394 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.285437 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.285464 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.285487 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.285510 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: 
\"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.285530 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.285552 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.285573 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.285597 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.285619 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.285642 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.285666 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.285057 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.285258 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.285310 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.285385 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.285608 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: E1122 02:53:07.285694 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:53:07.785664647 +0000 UTC m=+23.824186539 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.285777 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.285800 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.285818 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.285834 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod 
\"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.285900 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.285905 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.285923 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.285941 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.285944 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.285958 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.285963 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.285977 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.286020 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.286048 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.286069 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.286094 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.286115 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.286142 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.286164 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.286186 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.286207 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod 
\"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.286227 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.286231 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.286248 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.286245 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.286272 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.286298 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.286318 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.286353 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.286379 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: 
\"43509403-f426-496e-be36-56cef71462f5\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.286402 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.286421 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.286442 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.286466 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.286487 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.286509 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.286531 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.286553 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.286574 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.286596 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod 
\"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.286621 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.286643 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.286666 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.286688 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.286709 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.286730 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.286751 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.286778 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.286802 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.286824 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.286866 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.286888 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.286914 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.286959 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.286983 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.287007 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.287027 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.287048 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.287070 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.287092 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.287112 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.287133 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.287155 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.287177 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.287200 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.287222 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.287244 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.287264 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.287294 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.287316 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.287337 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.287357 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.287377 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.287399 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.287420 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.287442 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.287464 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.287506 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.287528 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.287549 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.287570 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.287591 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.287610 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.287634 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.287659 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.287683 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.287707 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.287731 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.287751 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 02:53:07 crc kubenswrapper[4922]: 
I1122 02:53:07.289590 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.289637 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.289663 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.289690 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.292259 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.292315 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.292345 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.292372 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.292397 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.319513 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.319588 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.319639 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.319684 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.319706 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.319732 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.319753 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.319773 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.319790 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.319887 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: 
\"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.319908 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.319929 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.319951 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.319972 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.319991 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320008 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320028 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320087 4922 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320101 4922 reconciler_common.go:293] "Volume detached for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320113 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320123 4922 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320134 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320143 4922 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320152 4922 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320162 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320173 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320185 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320195 4922 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320204 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320214 4922 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320225 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320235 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320244 4922 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320253 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320263 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320274 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320283 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320294 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320303 4922 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320313 4922 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320322 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320332 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320341 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320351 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320362 4922 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320372 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320382 4922 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320392 4922 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320401 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320412 4922 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320422 4922 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320432 4922 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320442 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320452 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320463 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320477 4922 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320489 4922 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320811 4922 
reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320825 4922 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320916 4922 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320935 4922 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320946 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320964 4922 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320977 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320989 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.321001 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.321016 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.321029 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.321042 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.321041 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
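
The burst of reconciler_common.go entries above is the kubelet's volume reconciler replaying state after a restart (the m=+23.8... monotonic offsets further down put these lines roughly 24 seconds after kubelet start): the ":218 MountVolume started" lines re-mount volumes for pods still assigned to this node, while the ":293 Volume detached" lines are final bookkeeping for pods that were removed while the kubelet was down. A rough, stdlib-only sketch for summarizing such an excerpt per pod UID (illustrative tooling, not part of OpenShift; feed it "journalctl -u kubelet" output on stdin):

    // tally.go - counts kubelet volume-reconciler events per pod UID in a
    // saved journal excerpt. Pod UIDs appear inside volume UniqueNames such
    // as kubernetes.io/configmap/<pod-uid>-<volume-name>.
    package main

    import (
        "bufio"
        "fmt"
        "os"
        "regexp"
    )

    func main() {
        uid := regexp.MustCompile(`[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}`)
        kinds := map[string]*regexp.Regexp{
            "mount-started":   regexp.MustCompile(`operationExecutor\.MountVolume started`),
            "volume-detached": regexp.MustCompile(`Volume detached for volume`),
            "teardown-ok":     regexp.MustCompile(`UnmountVolume\.TearDown succeeded`),
        }
        counts := map[string]map[string]int{}
        sc := bufio.NewScanner(os.Stdin)
        sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines are long
        for sc.Scan() {
            line := sc.Text()
            for kind, re := range kinds {
                if re.MatchString(line) {
                    if u := uid.FindString(line); u != "" {
                        if counts[u] == nil {
                            counts[u] = map[string]int{}
                        }
                        counts[u][kind]++
                    }
                }
            }
        }
        for u, c := range counts {
            fmt.Println(u, c)
        }
    }
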
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.321879 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.286313 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.286515 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.286556 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.286891 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.286901 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.286990 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.287184 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.287355 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.287686 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.287945 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.288045 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.288186 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.288364 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.288584 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.288786 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.289447 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.289662 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). 
InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.289780 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.290024 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.290079 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.290098 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.290957 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.292023 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.292129 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.292496 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.292555 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.292673 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.292922 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.292937 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.285711 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.315812 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.317522 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.319204 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.319298 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.319299 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.319698 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.319943 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320036 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320029 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320249 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320352 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320515 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320523 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320575 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.320876 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.321191 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.321264 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). 
InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.321308 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.321399 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.321425 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.321472 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.321478 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.321974 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.322050 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.322294 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.322530 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.322759 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.322825 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.322964 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.323016 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.323015 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.323101 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.323148 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.323179 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.323435 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.323536 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.323637 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: E1122 02:53:07.324188 4922 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 02:53:07 crc kubenswrapper[4922]: E1122 02:53:07.324977 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 02:53:07.824957322 +0000 UTC m=+23.863479214 (durationBeforeRetry 500ms). 
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.326025 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.327646 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.328144 4922 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.329707 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.331883 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.332115 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.332529 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.333579 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.333766 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.334492 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.334684 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.335575 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.335732 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.335921 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.336105 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.336128 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.336743 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 02:53:07 crc kubenswrapper[4922]: E1122 02:53:07.324242 4922 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Nov 22 02:53:07 crc kubenswrapper[4922]: E1122 02:53:07.339153 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 02:53:07.839125596 +0000 UTC m=+23.877647558 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Nov 22 02:53:07 crc kubenswrapper[4922]: E1122 02:53:07.339464 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Nov 22 02:53:07 crc kubenswrapper[4922]: E1122 02:53:07.339493 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Nov 22 02:53:07 crc kubenswrapper[4922]: E1122 02:53:07.339507 4922 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.339591 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 02:53:07 crc kubenswrapper[4922]: E1122 02:53:07.342814 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-22 02:53:07.842786172 +0000 UTC m=+23.881308064 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.343803 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.343996 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.344552 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.345825 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.345929 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.346806 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.347049 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.347341 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.347697 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes"
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.348926 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.349131 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.349156 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.349160 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.349181 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.349334 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.349487 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.349509 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.349659 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.349696 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.349813 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: E1122 02:53:07.349785 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 02:53:07 crc kubenswrapper[4922]: E1122 02:53:07.350631 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.349820 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 22 02:53:07 crc kubenswrapper[4922]: E1122 02:53:07.350726 4922 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.350127 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.350137 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.350163 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.350480 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.350351 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.350506 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.350541 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.351767 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.351954 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.352010 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.352256 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: E1122 02:53:07.352285 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-22 02:53:07.852247854 +0000 UTC m=+23.890769936 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.352451 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.352471 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.352656 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.353447 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.353857 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.354766 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.356035 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.356681 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.357490 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.357944 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.358004 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.358158 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.358343 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.358424 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.358696 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.358765 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.358920 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.359202 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.359201 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.359693 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.359786 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.360178 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.360940 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.360983 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.361070 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.361009 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.361046 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.361136 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.361147 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.361458 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.361720 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). 
InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.361836 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.361992 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.362022 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.361981 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.362081 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.362375 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.365392 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.365656 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.366805 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.367518 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.373600 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.375154 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.376267 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.378563 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.379406 4922 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.379581 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.383863 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.384627 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.386808 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.388874 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.390225 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.390906 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.391125 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.391697 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.392520 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.393403 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.395111 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.395772 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.397426 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.398270 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.398766 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.399589 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.400225 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.401599 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.403469 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.403743 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.405321 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.405874 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.406902 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.407452 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.408501 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.409161 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.409514 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.409658 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.414484 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.421870 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.421929 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 22 
02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.421969 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.421996 4922 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422009 4922 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422019 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422030 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422039 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422051 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422060 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422069 4922 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422079 4922 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422088 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422098 4922 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422110 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422120 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422131 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422140 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422149 4922 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422158 4922 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422166 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422174 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422183 4922 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422191 4922 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422201 4922 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422212 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422222 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422256 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422353 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422388 4922 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422404 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422416 4922 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422427 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422437 4922 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422447 4922 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422457 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422468 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422477 4922 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422487 4922 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422496 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422505 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422514 4922 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422524 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422535 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422545 4922 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422553 4922 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422563 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422573 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422583 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422591 4922 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422600 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422609 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422618 4922 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422628 4922 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422636 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422646 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422655 4922 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422665 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422675 4922 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422685 4922 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422694 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422703 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422713 4922 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422722 4922 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422731 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422743 4922 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422757 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422769 4922 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422780 4922 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422791 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422801 4922 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422810 4922 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422819 4922 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422835 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422863 4922 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422874 4922 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422883 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422892 4922 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422901 4922 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422911 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422921 4922 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422932 4922 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422942 4922 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422952 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422962 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422972 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422982 4922 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.422991 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423001 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423010 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423019 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423029 4922 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423037 4922 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423047 4922 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423057 4922 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423065 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423075 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423084 4922 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423093 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423101 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423109 4922 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423118 4922 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423127 4922 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423135 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423146 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423154 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423162 4922 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423171 4922 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423179 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423186 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423195 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423204 4922 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423212 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423220 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423229 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423238 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423246 4922 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423256 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423265 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423273 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423282 4922 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423293 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423302 4922 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423312 4922 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423321 4922 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423329 4922 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423338 4922 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423349 4922 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423357 4922 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423366 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423375 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423383 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423391 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423400 4922 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423409 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423417 4922 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423426 4922 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423434 4922 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423442 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423451 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423459 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423468 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423477 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423486 4922 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423495 4922 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.423504 4922 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\""
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.478614 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.491164 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.506225 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.521226 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.534001 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.552367 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.552657 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.559864 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.568080 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.596589 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.611331 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.624567 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"912136b8-9119-4585-8a8a-fffffe458b82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1364d5e8967ee57d3fe7c3ca85053dbeee65956324a6ff45f53339ee6a380e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1b62ea65c78bb4f19f5fdb2ebed6da4b5554981906058d4d33472e53eecebe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa36ff8bd81008d118d746d10130faac5df3ffdd243309f12b2444acf8e0ac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fe422173d9c9619e9ccbf14fa6b45cd384fb3b32973aa7c32b7597d23818ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1ba37f0d76415fadbf5fe5c3e7b0d0ac1e04923deb445e3ffebaa45f9a3284d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:52:51Z\\\",\\\"message\\\":\\\"W1122 02:52:50.548479 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 02:52:50.548907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763779970 cert, and key in /tmp/serving-cert-1581887030/serving-signer.crt, /tmp/serving-cert-1581887030/serving-signer.key\\\\nI1122 02:52:50.747646 1 observer_polling.go:159] Starting file observer\\\\nW1122 02:52:50.754705 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 02:52:50.754978 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:52:50.756448 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1581887030/tls.crt::/tmp/serving-cert-1581887030/tls.key\\\\\\\"\\\\nF1122 02:52:51.071644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://892025699a91526b7be50d611c0c47a78d080b10f5001683ba5b6614e1168759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.826322 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.826460 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 22 02:53:07 crc kubenswrapper[4922]: E1122 02:53:07.826500 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:53:08.826470501 +0000 UTC m=+24.864992393 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 02:53:07 crc kubenswrapper[4922]: E1122 02:53:07.826548 4922 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Nov 22 02:53:07 crc kubenswrapper[4922]: E1122 02:53:07.826619 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 02:53:08.826602994 +0000 UTC m=+24.865124886 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.927408 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.927458 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 22 02:53:07 crc kubenswrapper[4922]: I1122 02:53:07.927495 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 22 02:53:07 crc kubenswrapper[4922]: E1122 02:53:07.927627 4922 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Nov 22 02:53:07 crc kubenswrapper[4922]: E1122 02:53:07.927700 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 02:53:08.927672504 +0000 UTC m=+24.966194396 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Nov 22 02:53:07 crc kubenswrapper[4922]: E1122 02:53:07.927707 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Nov 22 02:53:07 crc kubenswrapper[4922]: E1122 02:53:07.927749 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Nov 22 02:53:07 crc kubenswrapper[4922]: E1122 02:53:07.927769 4922 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Nov 22 02:53:07 crc kubenswrapper[4922]: E1122 02:53:07.927865 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-22 02:53:08.927829388 +0000 UTC m=+24.966351280 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Nov 22 02:53:07 crc kubenswrapper[4922]: E1122 02:53:07.927709 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Nov 22 02:53:07 crc kubenswrapper[4922]: E1122 02:53:07.927906 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Nov 22 02:53:07 crc kubenswrapper[4922]: E1122 02:53:07.927917 4922 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Nov 22 02:53:07 crc kubenswrapper[4922]: E1122 02:53:07.927944 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-22 02:53:08.927938051 +0000 UTC m=+24.966459943 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.066605 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.070794 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.076650 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.081413 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.094744 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.105710 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.117938 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.130220 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.148834 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.164902 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"912136b8-9119-4585-8a8a-fffffe458b82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1364d5e8967ee57d3fe7c3ca85053dbeee65956324a6ff45f53339ee6a380e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1b62ea65c78bb4f19f5fdb2ebed6da4b5554981906058d4d33472e53eecebe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa36ff8bd81008d118d746d10130faac5df3ffdd243309f12b2444acf8e0ac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fe422173d9c9619e9ccbf14fa6b45cd384fb3b32973aa7c32b7597d23818ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1ba37f0d76415fadbf5fe5c3e7b0d0ac1e04923deb445e3ffebaa45f9a3284d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:52:51Z\\\",\\\"message\\\":\\\"W1122 02:52:50.548479 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 02:52:50.548907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763779970 cert, and key in /tmp/serving-cert-1581887030/serving-signer.crt, /tmp/serving-cert-1581887030/serving-signer.key\\\\nI1122 02:52:50.747646 1 observer_polling.go:159] Starting file observer\\\\nW1122 02:52:50.754705 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 02:52:50.754978 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:52:50.756448 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1581887030/tls.crt::/tmp/serving-cert-1581887030/tls.key\\\\\\\"\\\\nF1122 02:52:51.071644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://892025699a91526b7be50d611c0c47a78d080b10f5001683ba5b6614e1168759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.181183 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.191665 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.203292 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.214368 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.225499 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.238988 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"912136b8-9119-4585-8a8a-fffffe458b82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1364d5e8967ee57d3fe7c3ca85053dbeee65956324a6ff45f53339ee6a380e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1b62ea65c78bb4f19f5fdb2ebed6da4b5554981906058d4d33472e53eecebe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa36ff8bd81008d118d746d10130faac5df3ffdd243309f12b2444acf8e0ac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fe422173d9c9619e9ccbf14fa6b45cd384fb3b32973aa7c32b7597d23818ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1ba37f0d76415fadbf5fe5c3e7b0d0ac1e04923deb445e3ffebaa45f9a3284d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:52:51Z\\\",\\\"message\\\":\\\"W1122 02:52:50.548479 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 02:52:50.548907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763779970 cert, and key in /tmp/serving-cert-1581887030/serving-signer.crt, /tmp/serving-cert-1581887030/serving-signer.key\\\\nI1122 02:52:50.747646 1 observer_polling.go:159] Starting file observer\\\\nW1122 02:52:50.754705 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 02:52:50.754978 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:52:50.756448 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1581887030/tls.crt::/tmp/serving-cert-1581887030/tls.key\\\\\\\"\\\\nF1122 02:52:51.071644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://892025699a91526b7be50d611c0c47a78d080b10f5001683ba5b6614e1168759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.249103 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e5537a6-a41f-446d-ab40-ec8c7a3edc90\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaf643c85775379271a26c718e36d87319342e6797522f91726ce4084925f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205de8f1aeca28c7877b83da70915cd216631f5d7c28b0c9f3bb748a8f98547e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7d163b34cfb34723c07cfa4592a134291185548d4294cb6fcd1d5d8ab63617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9608708be558bf83d1156f620534bd4c98b8f4fb7f96c972be8dc67524f4546c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.260276 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.299882 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:53:08 crc kubenswrapper[4922]: E1122 02:53:08.300017 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.459087 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"9ac28f94efaa2e39af27f99097ae173f2b0e54a2998235485501f7aacc735712"} Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.460878 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"a581af85a69caa65abe5f7766482dd0b71eaf201d43e354b8316ff616c865193"} Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.460935 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"7f2d349f9466eb418a64dabae95c139bf23e943909a37b6b651c90e85ff55bb6"} Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.460945 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"1dfe5199b56e364240f869482de05eb4437ea3efd51332bba2a78c94e95ffd59"} Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.461922 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"9c46a871bcc893c29dd8f87750ec890ac870f3f59615c5951f473288a19b460c"} Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.461982 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"3a62714c957e4fdddc30fd868b630bd2f608c3471edff2602e635f65c415043b"} Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.477835 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a581af85a69caa65abe5f7766482dd0b71eaf201d43e354b8316ff616c865193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2d349f9466eb418a64dabae95c139bf23e943909a37b6b651c90e85ff55bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:08Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:08 crc kubenswrapper[4922]: 
I1122 02:53:08.500019 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:08Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.518736 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:08Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.533948 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:08Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.546329 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-ntq2p"] Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.546950 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-ntq2p" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.551778 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.552226 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.552434 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.559665 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:08Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.582564 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"912136b8-9119-4585-8a8a-fffffe458b82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1364d5e8967ee57d3fe7c3ca85053dbeee65956324a6ff45f53339ee6a380e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1b62ea65c78bb4f19f5fdb2ebed6da4b5554981906058d4d33472e53eecebe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa36ff8bd81008d118d746d10130faac5df3ffdd243309f12b2444acf8e0ac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fe422173d9c9619e9ccbf14fa6b45cd384fb3b32973aa7c32b7597d23818ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1ba37f0d76415fadbf5fe5c3e7b0d0ac1e04923deb445e3ffebaa45f9a3284d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:52:51Z\\\",\\\"message\\\":\\\"W1122 02:52:50.548479 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 02:52:50.548907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763779970 cert, and key in /tmp/serving-cert-1581887030/serving-signer.crt, /tmp/serving-cert-1581887030/serving-signer.key\\\\nI1122 02:52:50.747646 1 observer_polling.go:159] Starting file observer\\\\nW1122 02:52:50.754705 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 02:52:50.754978 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:52:50.756448 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1581887030/tls.crt::/tmp/serving-cert-1581887030/tls.key\\\\\\\"\\\\nF1122 02:52:51.071644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://892025699a91526b7be50d611c0c47a78d080b10f5001683ba5b6614e1168759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:08Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.598613 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e5537a6-a41f-446d-ab40-ec8c7a3edc90\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaf643c85775379271a26c718e36d87319342e6797522f91726ce4084925f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205de8f1aeca28c7877b83da70915cd216631f5d7c28b0c9f3bb748a8f98547e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7d163b34cfb34723c07cfa4592a134291185548d4294cb6fcd1d5d8ab63617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9608708be558bf83d1156f620534bd4c98b8f4fb7f96c972be8dc67524f4546c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:08Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.615583 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:08Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.628662 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a581af85a69caa65abe5f7766482dd0b71eaf201d43e354b8316ff616c865193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2d349f9466eb418a64dabae95c139bf23e943909a37b6b651c90e85ff55bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:0
7Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:08Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.632670 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/087141fd-fc9b-4685-bada-20260ee96369-hosts-file\") pod \"node-resolver-ntq2p\" (UID: \"087141fd-fc9b-4685-bada-20260ee96369\") " pod="openshift-dns/node-resolver-ntq2p" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.632711 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4fck\" (UniqueName: \"kubernetes.io/projected/087141fd-fc9b-4685-bada-20260ee96369-kube-api-access-x4fck\") pod \"node-resolver-ntq2p\" (UID: \"087141fd-fc9b-4685-bada-20260ee96369\") " pod="openshift-dns/node-resolver-ntq2p" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.641007 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:08Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.654379 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:08Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.671591 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:08Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.695679 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:08Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.708112 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ntq2p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"087141fd-fc9b-4685-bada-20260ee96369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4fck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ntq2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:08Z is 
after 2025-08-24T17:21:41Z" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.723509 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"912136b8-9119-4585-8a8a-fffffe458b82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1364d5e8967ee57d3fe7c3ca85053dbeee65956324a6ff45f53339ee6a380e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1b62ea65c78bb4f19f5fdb2ebed6da4b5554981906058d4d33472e53eecebe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa36ff8bd81008d118d746d10130faac5df3ffdd243309f12b2444acf8e0ac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fe422173d9c9619e9ccbf14fa6b45cd384fb3b32973aa7c32b7597d23818ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1ba37f0d76415fadbf5fe5c3e7b0d0ac1e04923deb445e3ffebaa45f9a3284d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:52:51Z\\\",\\\"message\\\":\\\"W1122 02:52:50.548479 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 02:52:50.548907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763779970 cert, and key in /tmp/serving-cert-1581887030/serving-signer.crt, /tmp/serving-cert-1581887030/serving-signer.key\\\\nI1122 02:52:50.747646 1 observer_polling.go:159] Starting file observer\\\\nW1122 02:52:50.754705 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 02:52:50.754978 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:52:50.756448 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1581887030/tls.crt::/tmp/serving-cert-1581887030/tls.key\\\\\\\"\\\\nF1122 02:52:51.071644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://892025699a91526b7be50d611c0c47a78d080b10f5001683ba5b6614e1168759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:08Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.733505 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4fck\" (UniqueName: \"kubernetes.io/projected/087141fd-fc9b-4685-bada-20260ee96369-kube-api-access-x4fck\") pod \"node-resolver-ntq2p\" (UID: \"087141fd-fc9b-4685-bada-20260ee96369\") " pod="openshift-dns/node-resolver-ntq2p" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.733629 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/087141fd-fc9b-4685-bada-20260ee96369-hosts-file\") pod \"node-resolver-ntq2p\" (UID: \"087141fd-fc9b-4685-bada-20260ee96369\") " pod="openshift-dns/node-resolver-ntq2p" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.733723 4922 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/087141fd-fc9b-4685-bada-20260ee96369-hosts-file\") pod \"node-resolver-ntq2p\" (UID: \"087141fd-fc9b-4685-bada-20260ee96369\") " pod="openshift-dns/node-resolver-ntq2p" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.747935 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e5537a6-a41f-446d-ab40-ec8c7a3edc90\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaf643c85775379271a26c718e36d87319342e6797522f91726ce4084925f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205de8f1aeca28c7877b83da70915cd216631f5d7c28b0c9f3bb748a8f98547e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7d163b34cfb34723c07cfa4592a134291185548d4294cb6fcd1d5d8ab63617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9608708be558bf83d1156f620534bd4c98b8f4fb7f96c972be8dc67524f4546c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:08Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.751541 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4fck\" (UniqueName: \"kubernetes.io/projected/087141fd-fc9b-4685-bada-20260ee96369-kube-api-access-x4fck\") pod \"node-resolver-ntq2p\" (UID: \"087141fd-fc9b-4685-bada-20260ee96369\") " pod="openshift-dns/node-resolver-ntq2p" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.766974 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c46a871bcc893c29dd8f87750ec890ac870f3f59615c5951f473288a19b460c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:08Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.834155 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.834250 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:53:08 crc kubenswrapper[4922]: E1122 02:53:08.834351 4922 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 02:53:08 crc kubenswrapper[4922]: E1122 02:53:08.834404 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 02:53:10.834388064 +0000 UTC m=+26.872909946 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 02:53:08 crc kubenswrapper[4922]: E1122 02:53:08.834597 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:53:10.834584769 +0000 UTC m=+26.873106661 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.858098 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-ntq2p" Nov 22 02:53:08 crc kubenswrapper[4922]: W1122 02:53:08.883302 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod087141fd_fc9b_4685_bada_20260ee96369.slice/crio-34c40b35496599db5ff37dffa70cb59699487232f394ad001c29eee8c84ad148 WatchSource:0}: Error finding container 34c40b35496599db5ff37dffa70cb59699487232f394ad001c29eee8c84ad148: Status 404 returned error can't find the container with id 34c40b35496599db5ff37dffa70cb59699487232f394ad001c29eee8c84ad148 Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.934715 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.934976 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:53:08 crc kubenswrapper[4922]: E1122 02:53:08.934923 4922 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.935122 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:53:08 crc kubenswrapper[4922]: E1122 02:53:08.935109 4922 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 02:53:08 crc kubenswrapper[4922]: E1122 02:53:08.935229 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 02:53:10.935185028 +0000 UTC m=+26.973706910 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 02:53:08 crc kubenswrapper[4922]: E1122 02:53:08.935335 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 02:53:08 crc kubenswrapper[4922]: E1122 02:53:08.935375 4922 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 02:53:08 crc kubenswrapper[4922]: E1122 02:53:08.935498 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-22 02:53:10.935461524 +0000 UTC m=+26.973983586 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 02:53:08 crc kubenswrapper[4922]: E1122 02:53:08.935597 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 02:53:08 crc kubenswrapper[4922]: E1122 02:53:08.935666 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 02:53:08 crc kubenswrapper[4922]: E1122 02:53:08.935724 4922 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 02:53:08 crc kubenswrapper[4922]: E1122 02:53:08.935855 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-22 02:53:10.935817392 +0000 UTC m=+26.974339284 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.952542 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-d4gbc"] Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.953065 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-d4gbc" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.957865 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.959053 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.960534 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.960797 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.960941 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.961210 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-n664h"] Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.962066 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-n664h" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.963628 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-b9j6n"] Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.964119 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.970048 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.972541 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.976084 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.976447 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.976475 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.976675 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.980992 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.981211 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.996547 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Nov 22 02:53:08 crc kubenswrapper[4922]: I1122 02:53:08.996588 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"912136b8-9119-4585-8a8a-fffffe458b82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1364d5e8967ee57d3fe7c3ca85053dbeee65956324a6ff45f53339ee6a380e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1b62ea65c78bb4f19f5fdb2ebed6da4b5554981906058d4d33472e53eecebe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa36ff8bd81008d118d746d10130faac5df3ffdd243309f12b2444acf8e0ac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fe422173d9c9619e9ccbf14fa6b45cd384fb3b32973aa7c32b7597d23818ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1ba37f0d76415fadbf5fe5c3e7b0d0ac1e04923deb445e3ffebaa45f9a3284d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:52:51Z\\\",\\\"message\\\":\\\"W1122 02:52:50.548479 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 02:52:50.548907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763779970 cert, and key in /tmp/serving-cert-1581887030/serving-signer.crt, /tmp/serving-cert-1581887030/serving-signer.key\\\\nI1122 02:52:50.747646 1 observer_polling.go:159] Starting file observer\\\\nW1122 02:52:50.754705 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 02:52:50.754978 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:52:50.756448 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1581887030/tls.crt::/tmp/serving-cert-1581887030/tls.key\\\\\\\"\\\\nF1122 02:52:51.071644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://892025699a91526b7be50d611c0c47a78d080b10f5001683ba5b6614e1168759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:08Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.016148 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e5537a6-a41f-446d-ab40-ec8c7a3edc90\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaf643c85775379271a26c718e36d87319342e6797522f91726ce4084925f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205de8f1aeca28c7877b83da70915cd216631f5d7c28b0c9f3bb748a8f98547e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7d163b34cfb34723c07cfa4592a134291185548d4294cb6fcd1d5d8ab63617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9608708be558bf83d1156f620534bd4c98b8f4fb7f96c972be8dc67524f4546c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.035968 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/954bb7b8-d710-4e1a-973e-78c04e685f30-hostroot\") pod \"multus-d4gbc\" (UID: \"954bb7b8-d710-4e1a-973e-78c04e685f30\") " pod="openshift-multus/multus-d4gbc" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.036011 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/826eb92e-c839-4fba-9737-3b52101fec88-cnibin\") pod \"multus-additional-cni-plugins-n664h\" (UID: \"826eb92e-c839-4fba-9737-3b52101fec88\") " pod="openshift-multus/multus-additional-cni-plugins-n664h" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.036030 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/954bb7b8-d710-4e1a-973e-78c04e685f30-system-cni-dir\") pod \"multus-d4gbc\" (UID: \"954bb7b8-d710-4e1a-973e-78c04e685f30\") " pod="openshift-multus/multus-d4gbc" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.036046 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/954bb7b8-d710-4e1a-973e-78c04e685f30-multus-socket-dir-parent\") pod \"multus-d4gbc\" (UID: \"954bb7b8-d710-4e1a-973e-78c04e685f30\") " pod="openshift-multus/multus-d4gbc" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.036068 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/954bb7b8-d710-4e1a-973e-78c04e685f30-host-run-multus-certs\") pod \"multus-d4gbc\" (UID: \"954bb7b8-d710-4e1a-973e-78c04e685f30\") " pod="openshift-multus/multus-d4gbc" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.036084 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/954bb7b8-d710-4e1a-973e-78c04e685f30-cnibin\") pod \"multus-d4gbc\" (UID: \"954bb7b8-d710-4e1a-973e-78c04e685f30\") " pod="openshift-multus/multus-d4gbc" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.036101 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/954bb7b8-d710-4e1a-973e-78c04e685f30-host-var-lib-cni-bin\") pod \"multus-d4gbc\" (UID: \"954bb7b8-d710-4e1a-973e-78c04e685f30\") " pod="openshift-multus/multus-d4gbc" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.036121 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/954bb7b8-d710-4e1a-973e-78c04e685f30-multus-daemon-config\") pod \"multus-d4gbc\" (UID: \"954bb7b8-d710-4e1a-973e-78c04e685f30\") " pod="openshift-multus/multus-d4gbc" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.036139 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/826eb92e-c839-4fba-9737-3b52101fec88-tuning-conf-dir\") pod \"multus-additional-cni-plugins-n664h\" (UID: \"826eb92e-c839-4fba-9737-3b52101fec88\") " pod="openshift-multus/multus-additional-cni-plugins-n664h" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.036324 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/826eb92e-c839-4fba-9737-3b52101fec88-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-n664h\" (UID: \"826eb92e-c839-4fba-9737-3b52101fec88\") " pod="openshift-multus/multus-additional-cni-plugins-n664h" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.036413 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/826eb92e-c839-4fba-9737-3b52101fec88-system-cni-dir\") pod \"multus-additional-cni-plugins-n664h\" (UID: \"826eb92e-c839-4fba-9737-3b52101fec88\") " pod="openshift-multus/multus-additional-cni-plugins-n664h" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.036446 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/954bb7b8-d710-4e1a-973e-78c04e685f30-os-release\") pod \"multus-d4gbc\" (UID: \"954bb7b8-d710-4e1a-973e-78c04e685f30\") " pod="openshift-multus/multus-d4gbc" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.036477 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/954bb7b8-d710-4e1a-973e-78c04e685f30-cni-binary-copy\") pod \"multus-d4gbc\" (UID: \"954bb7b8-d710-4e1a-973e-78c04e685f30\") " pod="openshift-multus/multus-d4gbc" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.036528 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/954bb7b8-d710-4e1a-973e-78c04e685f30-host-run-netns\") pod \"multus-d4gbc\" (UID: \"954bb7b8-d710-4e1a-973e-78c04e685f30\") " pod="openshift-multus/multus-d4gbc" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.036557 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/954bb7b8-d710-4e1a-973e-78c04e685f30-etc-kubernetes\") pod \"multus-d4gbc\" (UID: \"954bb7b8-d710-4e1a-973e-78c04e685f30\") " pod="openshift-multus/multus-d4gbc" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.036585 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/954bb7b8-d710-4e1a-973e-78c04e685f30-multus-cni-dir\") pod \"multus-d4gbc\" (UID: \"954bb7b8-d710-4e1a-973e-78c04e685f30\") " pod="openshift-multus/multus-d4gbc" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.036611 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/954bb7b8-d710-4e1a-973e-78c04e685f30-multus-conf-dir\") pod \"multus-d4gbc\" (UID: \"954bb7b8-d710-4e1a-973e-78c04e685f30\") " pod="openshift-multus/multus-d4gbc" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.036642 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52z2k\" (UniqueName: \"kubernetes.io/projected/954bb7b8-d710-4e1a-973e-78c04e685f30-kube-api-access-52z2k\") pod \"multus-d4gbc\" (UID: \"954bb7b8-d710-4e1a-973e-78c04e685f30\") " pod="openshift-multus/multus-d4gbc" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.036666 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/954bb7b8-d710-4e1a-973e-78c04e685f30-host-var-lib-cni-multus\") pod \"multus-d4gbc\" (UID: \"954bb7b8-d710-4e1a-973e-78c04e685f30\") " pod="openshift-multus/multus-d4gbc" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.036691 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/954bb7b8-d710-4e1a-973e-78c04e685f30-host-run-k8s-cni-cncf-io\") pod \"multus-d4gbc\" (UID: \"954bb7b8-d710-4e1a-973e-78c04e685f30\") " pod="openshift-multus/multus-d4gbc" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.036712 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/954bb7b8-d710-4e1a-973e-78c04e685f30-host-var-lib-kubelet\") pod \"multus-d4gbc\" (UID: \"954bb7b8-d710-4e1a-973e-78c04e685f30\") " pod="openshift-multus/multus-d4gbc" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.036737 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/826eb92e-c839-4fba-9737-3b52101fec88-os-release\") pod \"multus-additional-cni-plugins-n664h\" (UID: \"826eb92e-c839-4fba-9737-3b52101fec88\") " pod="openshift-multus/multus-additional-cni-plugins-n664h" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.036771 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/826eb92e-c839-4fba-9737-3b52101fec88-cni-binary-copy\") pod \"multus-additional-cni-plugins-n664h\" (UID: \"826eb92e-c839-4fba-9737-3b52101fec88\") " pod="openshift-multus/multus-additional-cni-plugins-n664h" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 
02:53:09.036810 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nws4p\" (UniqueName: \"kubernetes.io/projected/826eb92e-c839-4fba-9737-3b52101fec88-kube-api-access-nws4p\") pod \"multus-additional-cni-plugins-n664h\" (UID: \"826eb92e-c839-4fba-9737-3b52101fec88\") " pod="openshift-multus/multus-additional-cni-plugins-n664h" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.044192 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c46a871bcc893c29dd8f87750ec890ac870f3f59615c5951f473288a19b460c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.057363 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ntq2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"087141fd-fc9b-4685-bada-20260ee96369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4fck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ntq2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.086256 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.086599 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.116965 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4gbc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"954bb7b8-d710-4e1a-973e-78c04e685f30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52z2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4gbc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.136372 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.137480 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/826eb92e-c839-4fba-9737-3b52101fec88-cnibin\") pod \"multus-additional-cni-plugins-n664h\" (UID: \"826eb92e-c839-4fba-9737-3b52101fec88\") " pod="openshift-multus/multus-additional-cni-plugins-n664h" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.137512 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/954bb7b8-d710-4e1a-973e-78c04e685f30-system-cni-dir\") pod \"multus-d4gbc\" (UID: \"954bb7b8-d710-4e1a-973e-78c04e685f30\") " pod="openshift-multus/multus-d4gbc" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.137538 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/954bb7b8-d710-4e1a-973e-78c04e685f30-multus-socket-dir-parent\") pod \"multus-d4gbc\" (UID: \"954bb7b8-d710-4e1a-973e-78c04e685f30\") " pod="openshift-multus/multus-d4gbc" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.137572 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/402683b1-a29f-4a79-a36c-daf6e8068d0d-mcd-auth-proxy-config\") pod \"machine-config-daemon-b9j6n\" (UID: \"402683b1-a29f-4a79-a36c-daf6e8068d0d\") " pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.137591 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/954bb7b8-d710-4e1a-973e-78c04e685f30-host-run-multus-certs\") pod \"multus-d4gbc\" (UID: \"954bb7b8-d710-4e1a-973e-78c04e685f30\") 
" pod="openshift-multus/multus-d4gbc" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.137611 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpl9p\" (UniqueName: \"kubernetes.io/projected/402683b1-a29f-4a79-a36c-daf6e8068d0d-kube-api-access-zpl9p\") pod \"machine-config-daemon-b9j6n\" (UID: \"402683b1-a29f-4a79-a36c-daf6e8068d0d\") " pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.137634 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/954bb7b8-d710-4e1a-973e-78c04e685f30-cnibin\") pod \"multus-d4gbc\" (UID: \"954bb7b8-d710-4e1a-973e-78c04e685f30\") " pod="openshift-multus/multus-d4gbc" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.137661 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/954bb7b8-d710-4e1a-973e-78c04e685f30-host-var-lib-cni-bin\") pod \"multus-d4gbc\" (UID: \"954bb7b8-d710-4e1a-973e-78c04e685f30\") " pod="openshift-multus/multus-d4gbc" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.137679 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/954bb7b8-d710-4e1a-973e-78c04e685f30-multus-daemon-config\") pod \"multus-d4gbc\" (UID: \"954bb7b8-d710-4e1a-973e-78c04e685f30\") " pod="openshift-multus/multus-d4gbc" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.137670 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/826eb92e-c839-4fba-9737-3b52101fec88-cnibin\") pod \"multus-additional-cni-plugins-n664h\" (UID: \"826eb92e-c839-4fba-9737-3b52101fec88\") " pod="openshift-multus/multus-additional-cni-plugins-n664h" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.137711 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/954bb7b8-d710-4e1a-973e-78c04e685f30-host-run-multus-certs\") pod \"multus-d4gbc\" (UID: \"954bb7b8-d710-4e1a-973e-78c04e685f30\") " pod="openshift-multus/multus-d4gbc" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.137699 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/826eb92e-c839-4fba-9737-3b52101fec88-tuning-conf-dir\") pod \"multus-additional-cni-plugins-n664h\" (UID: \"826eb92e-c839-4fba-9737-3b52101fec88\") " pod="openshift-multus/multus-additional-cni-plugins-n664h" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.137915 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/954bb7b8-d710-4e1a-973e-78c04e685f30-cnibin\") pod \"multus-d4gbc\" (UID: \"954bb7b8-d710-4e1a-973e-78c04e685f30\") " pod="openshift-multus/multus-d4gbc" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.138005 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/826eb92e-c839-4fba-9737-3b52101fec88-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-n664h\" (UID: \"826eb92e-c839-4fba-9737-3b52101fec88\") " pod="openshift-multus/multus-additional-cni-plugins-n664h" Nov 22 02:53:09 crc kubenswrapper[4922]: 
I1122 02:53:09.138053 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/954bb7b8-d710-4e1a-973e-78c04e685f30-multus-socket-dir-parent\") pod \"multus-d4gbc\" (UID: \"954bb7b8-d710-4e1a-973e-78c04e685f30\") " pod="openshift-multus/multus-d4gbc" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.138087 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/402683b1-a29f-4a79-a36c-daf6e8068d0d-rootfs\") pod \"machine-config-daemon-b9j6n\" (UID: \"402683b1-a29f-4a79-a36c-daf6e8068d0d\") " pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.138094 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/954bb7b8-d710-4e1a-973e-78c04e685f30-system-cni-dir\") pod \"multus-d4gbc\" (UID: \"954bb7b8-d710-4e1a-973e-78c04e685f30\") " pod="openshift-multus/multus-d4gbc" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.138109 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/954bb7b8-d710-4e1a-973e-78c04e685f30-host-var-lib-cni-bin\") pod \"multus-d4gbc\" (UID: \"954bb7b8-d710-4e1a-973e-78c04e685f30\") " pod="openshift-multus/multus-d4gbc" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.138281 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/826eb92e-c839-4fba-9737-3b52101fec88-system-cni-dir\") pod \"multus-additional-cni-plugins-n664h\" (UID: \"826eb92e-c839-4fba-9737-3b52101fec88\") " pod="openshift-multus/multus-additional-cni-plugins-n664h" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.138320 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/954bb7b8-d710-4e1a-973e-78c04e685f30-host-run-netns\") pod \"multus-d4gbc\" (UID: \"954bb7b8-d710-4e1a-973e-78c04e685f30\") " pod="openshift-multus/multus-d4gbc" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.138356 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/954bb7b8-d710-4e1a-973e-78c04e685f30-etc-kubernetes\") pod \"multus-d4gbc\" (UID: \"954bb7b8-d710-4e1a-973e-78c04e685f30\") " pod="openshift-multus/multus-d4gbc" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.138390 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/954bb7b8-d710-4e1a-973e-78c04e685f30-os-release\") pod \"multus-d4gbc\" (UID: \"954bb7b8-d710-4e1a-973e-78c04e685f30\") " pod="openshift-multus/multus-d4gbc" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.138399 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/954bb7b8-d710-4e1a-973e-78c04e685f30-host-run-netns\") pod \"multus-d4gbc\" (UID: \"954bb7b8-d710-4e1a-973e-78c04e685f30\") " pod="openshift-multus/multus-d4gbc" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.138424 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/954bb7b8-d710-4e1a-973e-78c04e685f30-cni-binary-copy\") pod \"multus-d4gbc\" (UID: \"954bb7b8-d710-4e1a-973e-78c04e685f30\") " pod="openshift-multus/multus-d4gbc" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.138423 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/954bb7b8-d710-4e1a-973e-78c04e685f30-etc-kubernetes\") pod \"multus-d4gbc\" (UID: \"954bb7b8-d710-4e1a-973e-78c04e685f30\") " pod="openshift-multus/multus-d4gbc" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.138456 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/954bb7b8-d710-4e1a-973e-78c04e685f30-multus-cni-dir\") pod \"multus-d4gbc\" (UID: \"954bb7b8-d710-4e1a-973e-78c04e685f30\") " pod="openshift-multus/multus-d4gbc" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.138376 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/826eb92e-c839-4fba-9737-3b52101fec88-system-cni-dir\") pod \"multus-additional-cni-plugins-n664h\" (UID: \"826eb92e-c839-4fba-9737-3b52101fec88\") " pod="openshift-multus/multus-additional-cni-plugins-n664h" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.138484 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/954bb7b8-d710-4e1a-973e-78c04e685f30-multus-conf-dir\") pod \"multus-d4gbc\" (UID: \"954bb7b8-d710-4e1a-973e-78c04e685f30\") " pod="openshift-multus/multus-d4gbc" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.138511 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52z2k\" (UniqueName: \"kubernetes.io/projected/954bb7b8-d710-4e1a-973e-78c04e685f30-kube-api-access-52z2k\") pod \"multus-d4gbc\" (UID: \"954bb7b8-d710-4e1a-973e-78c04e685f30\") " pod="openshift-multus/multus-d4gbc" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.138522 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/954bb7b8-d710-4e1a-973e-78c04e685f30-multus-cni-dir\") pod \"multus-d4gbc\" (UID: \"954bb7b8-d710-4e1a-973e-78c04e685f30\") " pod="openshift-multus/multus-d4gbc" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.138541 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/954bb7b8-d710-4e1a-973e-78c04e685f30-host-var-lib-cni-multus\") pod \"multus-d4gbc\" (UID: \"954bb7b8-d710-4e1a-973e-78c04e685f30\") " pod="openshift-multus/multus-d4gbc" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.138575 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/954bb7b8-d710-4e1a-973e-78c04e685f30-host-run-k8s-cni-cncf-io\") pod \"multus-d4gbc\" (UID: \"954bb7b8-d710-4e1a-973e-78c04e685f30\") " pod="openshift-multus/multus-d4gbc" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.138588 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/954bb7b8-d710-4e1a-973e-78c04e685f30-multus-daemon-config\") pod \"multus-d4gbc\" (UID: \"954bb7b8-d710-4e1a-973e-78c04e685f30\") " pod="openshift-multus/multus-d4gbc" Nov 22 
02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.138601 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/954bb7b8-d710-4e1a-973e-78c04e685f30-host-var-lib-kubelet\") pod \"multus-d4gbc\" (UID: \"954bb7b8-d710-4e1a-973e-78c04e685f30\") " pod="openshift-multus/multus-d4gbc" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.138624 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/954bb7b8-d710-4e1a-973e-78c04e685f30-host-var-lib-cni-multus\") pod \"multus-d4gbc\" (UID: \"954bb7b8-d710-4e1a-973e-78c04e685f30\") " pod="openshift-multus/multus-d4gbc" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.138637 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/954bb7b8-d710-4e1a-973e-78c04e685f30-host-run-k8s-cni-cncf-io\") pod \"multus-d4gbc\" (UID: \"954bb7b8-d710-4e1a-973e-78c04e685f30\") " pod="openshift-multus/multus-d4gbc" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.138637 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/826eb92e-c839-4fba-9737-3b52101fec88-os-release\") pod \"multus-additional-cni-plugins-n664h\" (UID: \"826eb92e-c839-4fba-9737-3b52101fec88\") " pod="openshift-multus/multus-additional-cni-plugins-n664h" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.138672 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/826eb92e-c839-4fba-9737-3b52101fec88-cni-binary-copy\") pod \"multus-additional-cni-plugins-n664h\" (UID: \"826eb92e-c839-4fba-9737-3b52101fec88\") " pod="openshift-multus/multus-additional-cni-plugins-n664h" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.138696 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nws4p\" (UniqueName: \"kubernetes.io/projected/826eb92e-c839-4fba-9737-3b52101fec88-kube-api-access-nws4p\") pod \"multus-additional-cni-plugins-n664h\" (UID: \"826eb92e-c839-4fba-9737-3b52101fec88\") " pod="openshift-multus/multus-additional-cni-plugins-n664h" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.138720 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/402683b1-a29f-4a79-a36c-daf6e8068d0d-proxy-tls\") pod \"machine-config-daemon-b9j6n\" (UID: \"402683b1-a29f-4a79-a36c-daf6e8068d0d\") " pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.138726 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/954bb7b8-d710-4e1a-973e-78c04e685f30-host-var-lib-kubelet\") pod \"multus-d4gbc\" (UID: \"954bb7b8-d710-4e1a-973e-78c04e685f30\") " pod="openshift-multus/multus-d4gbc" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.138752 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/954bb7b8-d710-4e1a-973e-78c04e685f30-os-release\") pod \"multus-d4gbc\" (UID: \"954bb7b8-d710-4e1a-973e-78c04e685f30\") " pod="openshift-multus/multus-d4gbc" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.138766 
4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/954bb7b8-d710-4e1a-973e-78c04e685f30-hostroot\") pod \"multus-d4gbc\" (UID: \"954bb7b8-d710-4e1a-973e-78c04e685f30\") " pod="openshift-multus/multus-d4gbc" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.138743 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/954bb7b8-d710-4e1a-973e-78c04e685f30-hostroot\") pod \"multus-d4gbc\" (UID: \"954bb7b8-d710-4e1a-973e-78c04e685f30\") " pod="openshift-multus/multus-d4gbc" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.138901 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/826eb92e-c839-4fba-9737-3b52101fec88-os-release\") pod \"multus-additional-cni-plugins-n664h\" (UID: \"826eb92e-c839-4fba-9737-3b52101fec88\") " pod="openshift-multus/multus-additional-cni-plugins-n664h" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.138939 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/954bb7b8-d710-4e1a-973e-78c04e685f30-multus-conf-dir\") pod \"multus-d4gbc\" (UID: \"954bb7b8-d710-4e1a-973e-78c04e685f30\") " pod="openshift-multus/multus-d4gbc" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.138952 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/826eb92e-c839-4fba-9737-3b52101fec88-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-n664h\" (UID: \"826eb92e-c839-4fba-9737-3b52101fec88\") " pod="openshift-multus/multus-additional-cni-plugins-n664h" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.139138 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/954bb7b8-d710-4e1a-973e-78c04e685f30-cni-binary-copy\") pod \"multus-d4gbc\" (UID: \"954bb7b8-d710-4e1a-973e-78c04e685f30\") " pod="openshift-multus/multus-d4gbc" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.139218 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/826eb92e-c839-4fba-9737-3b52101fec88-cni-binary-copy\") pod \"multus-additional-cni-plugins-n664h\" (UID: \"826eb92e-c839-4fba-9737-3b52101fec88\") " pod="openshift-multus/multus-additional-cni-plugins-n664h" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.139413 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/826eb92e-c839-4fba-9737-3b52101fec88-tuning-conf-dir\") pod \"multus-additional-cni-plugins-n664h\" (UID: \"826eb92e-c839-4fba-9737-3b52101fec88\") " pod="openshift-multus/multus-additional-cni-plugins-n664h" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.152145 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.165394 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a581af85a69caa65abe5f7766482dd0b71eaf201d43e354b8316ff616c865193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2d349f9466eb418a64dabae95c139bf23e943909a37b6b651c90e85ff55bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.178641 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.190926 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.204710 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.219441 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.224173 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52z2k\" (UniqueName: \"kubernetes.io/projected/954bb7b8-d710-4e1a-973e-78c04e685f30-kube-api-access-52z2k\") pod \"multus-d4gbc\" (UID: \"954bb7b8-d710-4e1a-973e-78c04e685f30\") " pod="openshift-multus/multus-d4gbc" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.224331 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nws4p\" (UniqueName: \"kubernetes.io/projected/826eb92e-c839-4fba-9737-3b52101fec88-kube-api-access-nws4p\") pod \"multus-additional-cni-plugins-n664h\" (UID: \"826eb92e-c839-4fba-9737-3b52101fec88\") " pod="openshift-multus/multus-additional-cni-plugins-n664h" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.233070 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.239929 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/402683b1-a29f-4a79-a36c-daf6e8068d0d-rootfs\") pod \"machine-config-daemon-b9j6n\" (UID: \"402683b1-a29f-4a79-a36c-daf6e8068d0d\") " pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.240007 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/402683b1-a29f-4a79-a36c-daf6e8068d0d-proxy-tls\") pod \"machine-config-daemon-b9j6n\" (UID: \"402683b1-a29f-4a79-a36c-daf6e8068d0d\") " pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.240030 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/402683b1-a29f-4a79-a36c-daf6e8068d0d-mcd-auth-proxy-config\") pod \"machine-config-daemon-b9j6n\" (UID: \"402683b1-a29f-4a79-a36c-daf6e8068d0d\") " pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.240048 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpl9p\" (UniqueName: \"kubernetes.io/projected/402683b1-a29f-4a79-a36c-daf6e8068d0d-kube-api-access-zpl9p\") pod \"machine-config-daemon-b9j6n\" (UID: \"402683b1-a29f-4a79-a36c-daf6e8068d0d\") " pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.240080 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/402683b1-a29f-4a79-a36c-daf6e8068d0d-rootfs\") pod \"machine-config-daemon-b9j6n\" (UID: \"402683b1-a29f-4a79-a36c-daf6e8068d0d\") " pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.241022 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/402683b1-a29f-4a79-a36c-daf6e8068d0d-mcd-auth-proxy-config\") pod \"machine-config-daemon-b9j6n\" (UID: \"402683b1-a29f-4a79-a36c-daf6e8068d0d\") " pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" Nov 
22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.244455 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/402683b1-a29f-4a79-a36c-daf6e8068d0d-proxy-tls\") pod \"machine-config-daemon-b9j6n\" (UID: \"402683b1-a29f-4a79-a36c-daf6e8068d0d\") " pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.256807 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4gbc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"954bb7b8-d710-4e1a-973e-78c04e685f30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc
-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52z2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4gbc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.271126 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-d4gbc" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.271459 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpl9p\" (UniqueName: \"kubernetes.io/projected/402683b1-a29f-4a79-a36c-daf6e8068d0d-kube-api-access-zpl9p\") pod \"machine-config-daemon-b9j6n\" (UID: \"402683b1-a29f-4a79-a36c-daf6e8068d0d\") " pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.288283 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8552c75b-ff86-4366-ba7d-31f454e4c4d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095da4ba1e99f9564ae3e7c95e0319b9b03a655b824f73a40f378292bd16a613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c36d97db0f554e895c9a66ab9d98a846dab9ea7fdc6fb1694a1e6ba6e8e03caa\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5806c02c9ab10a6ae71658c5c26a959a0c1af4dc116eebb10456e842e18bc265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e578ec32c48afdd4b1beffcba79284416d5d388027f807e195d552777eb3ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7d8908fd03ef5111feb365e6882679f026d46d0b404e585b0fb248b3ffccc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9b
e8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:09 crc kubenswrapper[4922]: W1122 02:53:09.290714 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod954bb7b8_d710_4e1a_973e_78c04e685f30.slice/crio-bc6578f819edf768246d13174875eb6ac987ed8f4ad066698cba799ddadd56d3 WatchSource:0}: Error finding container bc6578f819edf768246d13174875eb6ac987ed8f4ad066698cba799ddadd56d3: Status 404 returned error can't find the 
container with id bc6578f819edf768246d13174875eb6ac987ed8f4ad066698cba799ddadd56d3 Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.291562 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-n664h" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.300194 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.300698 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:53:09 crc kubenswrapper[4922]: E1122 02:53:09.300820 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.301319 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:53:09 crc kubenswrapper[4922]: E1122 02:53:09.301389 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.304443 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.305366 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.307064 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"912136b8-9119-4585-8a8a-fffffe458b82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1364d5e8967ee57d3fe7c3ca85053dbeee65956324a6ff45f53339ee6a380e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1b62ea65c78bb4f19f5fdb2ebed6da4b5554981906058d4d33472e53eecebe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa36ff8bd81008d118d746d10130faac5df3ffdd243309f12b2444acf8e0ac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fe422173d9c9619e9ccbf14fa6b45cd384fb3b32973aa7c32b7597d23818ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1ba37f0d76415fadbf5fe5c3e7b0d0ac1e04923deb445e3ffebaa45f9a3284d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:52:51Z\\\",\\\"message\\\":\\\"W1122 02:52:50.548479 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 02:52:50.548907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763779970 cert, and key in /tmp/serving-cert-1581887030/serving-signer.crt, /tmp/serving-cert-1581887030/serving-signer.key\\\\nI1122 02:52:50.747646 1 observer_polling.go:159] Starting file observer\\\\nW1122 02:52:50.754705 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 02:52:50.754978 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:52:50.756448 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1581887030/tls.crt::/tmp/serving-cert-1581887030/tls.key\\\\\\\"\\\\nF1122 02:52:51.071644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://892025699a91526b7be50d611c0c47a78d080b10f5001683ba5b6614e1168759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.310038 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.310735 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.311940 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.312515 4922 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.314505 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.315142 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.316647 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.318948 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.319674 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.321181 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.322800 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.323364 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.324413 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.325011 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.327640 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.343187 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c46a871bcc893c29dd8f87750ec890ac870f3f59615c5951f473288a19b460c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.357927 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ntq2p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"087141fd-fc9b-4685-bada-20260ee96369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4fck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ntq2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.385362 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n664h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"826eb92e-c839-4fba-9737-3b52101fec88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n664h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.397793 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-c7wvg"] Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.399447 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.404478 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.404793 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.405116 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.405246 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.405381 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.405679 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.405908 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.406107 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a581af85a69caa65abe5f7766482dd0b71eaf201d43e354b8316ff616c865193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2d349f9466eb418a64dabae95c139bf23e943909a37b6b651c90e85ff55bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7
73257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.422448 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e5537a6-a41f-446d-ab40-ec8c7a3edc90\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaf643c85775379271a26c718e36d87319342e6797522f91726ce4084925f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205de8f1aeca28c7877b83da70915cd216631f5d7c28b0c9f3bb748a8f98547e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{
\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7d163b34cfb34723c07cfa4592a134291185548d4294cb6fcd1d5d8ab63617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9608708be558bf83d1156f620534bd4c98b8f4fb7f96c972be8dc67524f4546c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.434865 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"402683b1-a29f-4a79-a36c-daf6e8068d0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9j6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.482076 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d4gbc" 
event={"ID":"954bb7b8-d710-4e1a-973e-78c04e685f30","Type":"ContainerStarted","Data":"bc6578f819edf768246d13174875eb6ac987ed8f4ad066698cba799ddadd56d3"} Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.493394 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log
/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c7wvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.503717 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-ntq2p" event={"ID":"087141fd-fc9b-4685-bada-20260ee96369","Type":"ContainerStarted","Data":"cfb4eafebbd378af8579bc424a11b51eeda38398db994f81c408b9bead685713"} Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.503764 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-ntq2p" event={"ID":"087141fd-fc9b-4685-bada-20260ee96369","Type":"ContainerStarted","Data":"34c40b35496599db5ff37dffa70cb59699487232f394ad001c29eee8c84ad148"} Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.508184 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n664h" event={"ID":"826eb92e-c839-4fba-9737-3b52101fec88","Type":"ContainerStarted","Data":"d80d07db2b79c8d330a684aae53ed308b0113dcc4ccefb1904be9eb1eb8136ab"} Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.514784 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" event={"ID":"402683b1-a29f-4a79-a36c-daf6e8068d0d","Type":"ContainerStarted","Data":"1c6e79ee442013bc074ad4907c661f8093d95b2dce690296026e9823daadbe8e"} Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.520417 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e5537a6-a41f-446d-ab40-ec8c7a3edc90\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaf643c85775379271a26c718e36d87319342e6797522f91726ce4084925f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205de8f1aeca28c7877b83da70915cd216631f5d7c28b0c9f3bb748a8f98547e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7d163b34cfb34723c07cfa4592a134291185548d4294cb6fcd1d5d8ab63617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9608708be558bf83d1156f620534bd4c98b8f4fb7f96c972be8dc67524f4546c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.543373 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-host-kubelet\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.543417 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-var-lib-openvswitch\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.543435 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-host-cni-netd\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.543454 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-systemd-units\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.543469 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-run-ovn\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.543492 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-etc-openvswitch\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.543525 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-host-run-ovn-kubernetes\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.543540 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.543570 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-host-slash\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.543585 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-run-systemd\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.543602 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-host-cni-bin\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.543625 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-ovnkube-config\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.543643 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-env-overrides\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.543660 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-ovn-node-metrics-cert\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 
02:53:09.543684 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-host-run-netns\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.543709 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-log-socket\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.543727 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-run-openvswitch\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.543746 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-node-log\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.543764 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-ovnkube-script-lib\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.543783 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgzfn\" (UniqueName: \"kubernetes.io/projected/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-kube-api-access-zgzfn\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.554379 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"402683b1-a29f-4a79-a36c-daf6e8068d0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9j6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.583473 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.598751 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.613174 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.627450 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4gbc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"954bb7b8-d710-4e1a-973e-78c04e685f30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52z2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4gbc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.643667 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.644132 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-host-slash\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.644169 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-host-cni-bin\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.644198 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-run-systemd\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.644213 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-ovnkube-config\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.644229 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-env-overrides\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc 
kubenswrapper[4922]: I1122 02:53:09.644244 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-ovn-node-metrics-cert\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.644260 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-host-run-netns\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.644278 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-log-socket\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.644306 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-host-slash\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.644356 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-run-openvswitch\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.644317 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-run-openvswitch\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.644375 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-run-systemd\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.644426 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-node-log\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.644405 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-node-log\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.644465 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-ovnkube-script-lib\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.644484 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgzfn\" (UniqueName: \"kubernetes.io/projected/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-kube-api-access-zgzfn\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.644508 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-host-kubelet\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.644531 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-var-lib-openvswitch\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.644546 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-host-cni-netd\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.644534 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-log-socket\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.644578 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-systemd-units\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.644590 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-host-kubelet\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.644619 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-host-cni-netd\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.644544 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-host-cni-bin\") pod \"ovnkube-node-c7wvg\" (UID: 
\"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.644561 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-systemd-units\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.644806 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-run-ovn\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.644807 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-var-lib-openvswitch\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.644902 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-run-ovn\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.644973 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-env-overrides\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.645004 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-etc-openvswitch\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.645045 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-etc-openvswitch\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.645048 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-host-run-netns\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.645095 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-host-run-ovn-kubernetes\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: 
I1122 02:53:09.645143 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.645193 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-ovnkube-config\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.645196 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-host-run-ovn-kubernetes\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.645224 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.645298 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-ovnkube-script-lib\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.649285 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-ovn-node-metrics-cert\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.663483 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgzfn\" (UniqueName: \"kubernetes.io/projected/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-kube-api-access-zgzfn\") pod \"ovnkube-node-c7wvg\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.670995 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c46a871bcc893c29dd8f87750ec890ac870f3f59615c5951f473288a19b460c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.682163 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ntq2p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"087141fd-fc9b-4685-bada-20260ee96369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4fck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ntq2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.696389 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n664h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"826eb92e-c839-4fba-9737-3b52101fec88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n664h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.714526 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8552c75b-ff86-4366-ba7d-31f454e4c4d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095da4ba1e99f9564ae3e7c95e0319b9b03a655b824f73a40f378292bd16a613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c36d97db0f554e895c9a66ab9d98a846dab9ea7fdc6fb1694a1e6ba6e8e03caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5806c02c9ab10a6ae71658c5c26a959a0c1af4dc116eebb10456e842e18bc265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e578ec32c48afdd4b1beffcba79284416d5d38
8027f807e195d552777eb3ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7d8908fd03ef5111feb365e6882679f026d46d0b404e585b0fb248b3ffccc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.728925 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"912136b8-9119-4585-8a8a-fffffe458b82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1364d5e8967ee57d3fe7c3ca85053dbeee65956324a6ff45f53339ee6a380e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1b62ea65c78bb4f19f5fdb2ebed6da4
b5554981906058d4d33472e53eecebe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa36ff8bd81008d118d746d10130faac5df3ffdd243309f12b2444acf8e0ac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fe422173d9c9619e9ccbf14fa6b45cd384fb3b32973aa7c32b7597d23818ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1ba37f0d76415fadbf5fe5c3e7b0d0ac1e04923deb445e3ffebaa45f9a3284d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:52:51Z\\\",\\\"message\\\":\\\"W1122 02:52:50.548479 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 02:52:50.548907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763779970 cert, and key in /tmp/serving-cert-1581887030/serving-signer.crt, /tmp/serving-cert-1581887030/serving-signer.key\\\\nI1122 02:52:50.747646 1 observer_polling.go:159] Starting file observer\\\\nW1122 02:52:50.754705 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 02:52:50.754978 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:52:50.756448 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1581887030/tls.crt::/tmp/serving-cert-1581887030/tls.key\\\\\\\"\\\\nF1122 02:52:51.071644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://892025699a91526b7be50d611c0c47a78d080b10f5001683ba5b6614e1168759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.732393 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:09 crc kubenswrapper[4922]: W1122 02:53:09.745006 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2a6bcd8_bb13_463b_b112_0df3cf90b5f7.slice/crio-a7aace6d771bcddf4bbb75a7470e9ac820327bbfe0fb2b922a7aea241f3a641c WatchSource:0}: Error finding container a7aace6d771bcddf4bbb75a7470e9ac820327bbfe0fb2b922a7aea241f3a641c: Status 404 returned error can't find the container with id a7aace6d771bcddf4bbb75a7470e9ac820327bbfe0fb2b922a7aea241f3a641c Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.747613 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a581af85a69caa65abe5f7766482dd0b71eaf201d43e354b8316ff616c865193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2d349f9466eb418a64dabae95c139bf23e943909a37b6b651c90e85ff55bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.770866 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.787880 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.803107 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.814272 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.830317 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4gbc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"954bb7b8-d710-4e1a-973e-78c04e685f30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52z2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4gbc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.852427 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8552c75b-ff86-4366-ba7d-31f454e4c4d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095da4ba1e99f9564ae3e7c95e0319b9b03a655b824f73a40f378292bd16a613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c36d97db0f554e895c9a66ab9d98a846dab9ea7fdc6fb1694a1e6ba6e8e03caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5806c02c9ab10a6ae71658c5c26a959a0c1af4dc116eebb10456e842e18bc265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e578ec32c48afdd4b1beffcba79284416d5d38
8027f807e195d552777eb3ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7d8908fd03ef5111feb365e6882679f026d46d0b404e585b0fb248b3ffccc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.868862 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"912136b8-9119-4585-8a8a-fffffe458b82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1364d5e8967ee57d3fe7c3ca85053dbeee65956324a6ff45f53339ee6a380e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1b62ea65c78bb4f19f5fdb2ebed6da4
b5554981906058d4d33472e53eecebe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa36ff8bd81008d118d746d10130faac5df3ffdd243309f12b2444acf8e0ac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fe422173d9c9619e9ccbf14fa6b45cd384fb3b32973aa7c32b7597d23818ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1ba37f0d76415fadbf5fe5c3e7b0d0ac1e04923deb445e3ffebaa45f9a3284d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:52:51Z\\\",\\\"message\\\":\\\"W1122 02:52:50.548479 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 02:52:50.548907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763779970 cert, and key in /tmp/serving-cert-1581887030/serving-signer.crt, /tmp/serving-cert-1581887030/serving-signer.key\\\\nI1122 02:52:50.747646 1 observer_polling.go:159] Starting file observer\\\\nW1122 02:52:50.754705 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 02:52:50.754978 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:52:50.756448 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1581887030/tls.crt::/tmp/serving-cert-1581887030/tls.key\\\\\\\"\\\\nF1122 02:52:51.071644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://892025699a91526b7be50d611c0c47a78d080b10f5001683ba5b6614e1168759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.891110 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c46a871bcc893c29dd8f87750ec890ac870f3f59615c5951f473288a19b460c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.902888 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ntq2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"087141fd-fc9b-4685-bada-20260ee96369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb4eafebbd378af8579bc424a11b51eeda38398db994f81c408b9bead685713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4fck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ntq2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.921933 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n664h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"826eb92e-c839-4fba-9737-3b52101fec88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plu
gin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n664h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-22T02:53:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.941703 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a581af85a69caa65abe5f7766482dd0b71eaf201d43e354b8316ff616c865193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2d349f9466eb418a64dabae95c139bf23e943909a37b6b651c90e85ff55bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:09 crc kubenswrapper[4922]: I1122 02:53:09.981867 4922 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e5537a6-a41f-446d-ab40-ec8c7a3edc90\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaf643c85775379271a26c718e36d87319342e6797522f91726ce4084925f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205de8f1aeca28c7877b83da70915cd216631f5d7c28b0c9f3bb748a8f98547e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7d163b34cfb34723c07cfa4592a134291185548d4294cb6fcd1d5d8ab63617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9608708be558bf83d1156f620534bd4c98b8f4fb7f96c972be
8dc67524f4546c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:10 crc kubenswrapper[4922]: I1122 02:53:10.022030 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"402683b1-a29f-4a79-a36c-daf6e8068d0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9j6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:10Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:10 crc kubenswrapper[4922]: I1122 02:53:10.068268 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c7wvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:10Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:10 crc kubenswrapper[4922]: 
I1122 02:53:10.299667 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:53:10 crc kubenswrapper[4922]: E1122 02:53:10.299820 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:53:10 crc kubenswrapper[4922]: I1122 02:53:10.524485 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d4gbc" event={"ID":"954bb7b8-d710-4e1a-973e-78c04e685f30","Type":"ContainerStarted","Data":"f8705bc3652c9a3f838ffdc44bb26f87689d9eca09105427a3dc25b730967ccd"} Nov 22 02:53:10 crc kubenswrapper[4922]: I1122 02:53:10.527644 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" event={"ID":"402683b1-a29f-4a79-a36c-daf6e8068d0d","Type":"ContainerStarted","Data":"5cf0de2e7bcabf2c1beae2a37caaddb2de2365480082bd95a7843816ec8e3ceb"} Nov 22 02:53:10 crc kubenswrapper[4922]: I1122 02:53:10.527697 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" event={"ID":"402683b1-a29f-4a79-a36c-daf6e8068d0d","Type":"ContainerStarted","Data":"d254b4b131f36e4aae4019c29e8c53f3f8df991b85eb6f943fb6d2e9d79552f6"} Nov 22 02:53:10 crc kubenswrapper[4922]: I1122 02:53:10.529290 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"e8f0f91018d8ffcc35550e1e567a256d60f9cef7fb07a7c3c21d393e6ff5653f"} Nov 22 02:53:10 crc kubenswrapper[4922]: I1122 02:53:10.531078 4922 generic.go:334] "Generic (PLEG): container finished" podID="826eb92e-c839-4fba-9737-3b52101fec88" containerID="308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721" exitCode=0 Nov 22 02:53:10 crc kubenswrapper[4922]: I1122 02:53:10.531116 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n664h" event={"ID":"826eb92e-c839-4fba-9737-3b52101fec88","Type":"ContainerDied","Data":"308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721"} Nov 22 02:53:10 crc kubenswrapper[4922]: I1122 02:53:10.533436 4922 generic.go:334] "Generic (PLEG): container finished" podID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerID="4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2" exitCode=0 Nov 22 02:53:10 crc kubenswrapper[4922]: I1122 02:53:10.533525 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" event={"ID":"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7","Type":"ContainerDied","Data":"4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2"} Nov 22 02:53:10 crc kubenswrapper[4922]: I1122 02:53:10.533654 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" event={"ID":"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7","Type":"ContainerStarted","Data":"a7aace6d771bcddf4bbb75a7470e9ac820327bbfe0fb2b922a7aea241f3a641c"} Nov 22 02:53:10 crc kubenswrapper[4922]: I1122 02:53:10.545727 4922 status_manager.go:875] "Failed to update status for pod" 
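[Editor's note] The "network is not ready" error in the entry above comes from the container runtime finding no CNI configuration file in /etc/kubernetes/cni/net.d/, which keeps NetworkReady=false until a network plugin writes its config there. A simplified sketch of that readiness check, assuming the directory path from the log and the conventional .conf/.conflist/.json suffixes for CNI config files:

# Sketch: approximate the runtime's CNI readiness check that produces
# "no CNI configuration file in /etc/kubernetes/cni/net.d/".
# The directory comes from the log; the suffix list is the conventional
# set for CNI config files and is an assumption here.
import os
import sys

CNI_CONF_DIR = "/etc/kubernetes/cni/net.d"
CNI_SUFFIXES = (".conf", ".conflist", ".json")

try:
    confs = sorted(
        f for f in os.listdir(CNI_CONF_DIR) if f.endswith(CNI_SUFFIXES)
    )
except FileNotFoundError:
    confs = []

if confs:
    print(f"NetworkReady=true, found: {confs}")
else:
    # Matches the condition kubelet reports as NetworkPluginNotReady.
    print(f"NetworkReady=false: no CNI configuration file in {CNI_CONF_DIR}/")
    sys.exit(1)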
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e5537a6-a41f-446d-ab40-ec8c7a3edc90\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaf643c85775379271a26c718e36d87319342e6797522f91726ce4084925f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205de8f1aeca28c7877b83da70915cd216631f5d7c28b0c9f3bb748a8f98547e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7d163b34cfb34723c07cfa4592a134291185548d4294cb6fcd1d5d8ab63617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9608708be558bf83d1156f620534bd4c98b8f4fb7f96c972be
8dc67524f4546c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:10Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:10 crc kubenswrapper[4922]: I1122 02:53:10.562250 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"402683b1-a29f-4a79-a36c-daf6e8068d0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9j6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:10Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:10 crc kubenswrapper[4922]: I1122 02:53:10.585683 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c7wvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:10Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:10 crc kubenswrapper[4922]: 
I1122 02:53:10.603026 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:10Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:10 crc kubenswrapper[4922]: I1122 02:53:10.617262 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:10Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:10 crc kubenswrapper[4922]: I1122 02:53:10.632419 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:10Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:10 crc kubenswrapper[4922]: I1122 02:53:10.646740 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:10Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:10 crc kubenswrapper[4922]: I1122 02:53:10.661361 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4gbc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"954bb7b8-d710-4e1a-973e-78c04e685f30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8705bc3652c9a3f838ffdc44bb26f87689d9eca09105427a3dc25b730967ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/v
ar/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52z2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4gbc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:10Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:10 crc kubenswrapper[4922]: I1122 02:53:10.681900 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8552c75b-ff86-4366-ba7d-31f454e4c4d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095da4ba1e99f9564ae3e7c95e0319b9b03a655b824f73a40f378292bd16a613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir
\\\"}]},{\\\"containerID\\\":\\\"cri-o://c36d97db0f554e895c9a66ab9d98a846dab9ea7fdc6fb1694a1e6ba6e8e03caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5806c02c9ab10a6ae71658c5c26a959a0c1af4dc116eebb10456e842e18bc265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e578ec32c48afdd4b1beffcba79284416d5d388027f807e195d552777eb3ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7d8908fd03ef5111feb365e6882679f026d46d0b404e585b0fb248b3ffccc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5
fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:10Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:10 crc kubenswrapper[4922]: I1122 02:53:10.704405 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"912136b8-9119-4585-8a8a-fffffe458b82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1364d5e8967ee57d3fe7c3ca85053dbeee65956324a6ff45f53339ee6a380e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1b62ea65c78bb4f19f5fdb2ebed6da4b5554981906058d4d33472e53eecebe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa36ff8bd81008d118d746d10130faac5df3ffdd243309f12b2444acf8e0ac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fe422173d9c9619e9ccbf14fa6b45cd384fb3b32973aa7c32b7597d23818ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1ba37f0d76415fadbf5fe5c3e7b0d0ac1e04923deb445e3ffebaa45f9a3284d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:52:51Z\\\",\\\"message\\\":\\\"W1122 02:52:50.548479 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 02:52:50.548907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763779970 cert, and key in /tmp/serving-cert-1581887030/serving-signer.crt, /tmp/serving-cert-1581887030/serving-signer.key\\\\nI1122 02:52:50.747646 1 observer_polling.go:159] Starting file observer\\\\nW1122 02:52:50.754705 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 02:52:50.754978 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:52:50.756448 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1581887030/tls.crt::/tmp/serving-cert-1581887030/tls.key\\\\\\\"\\\\nF1122 02:52:51.071644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://892025699a91526b7be50d611c0c47a78d080b10f5001683ba5b6614e1168759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:10Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:10 crc kubenswrapper[4922]: I1122 02:53:10.719085 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c46a871bcc893c29dd8f87750ec890ac870f3f59615c5951f473288a19b460c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:10Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:10 crc kubenswrapper[4922]: I1122 02:53:10.731491 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ntq2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"087141fd-fc9b-4685-bada-20260ee96369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb4eafebbd378af8579bc424a11b51eeda38398db994f81c408b9bead685713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4fck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ntq2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:10Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:10 crc kubenswrapper[4922]: I1122 02:53:10.749175 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n664h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"826eb92e-c839-4fba-9737-3b52101fec88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plu
gin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n664h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-22T02:53:10Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:10 crc kubenswrapper[4922]: I1122 02:53:10.768375 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a581af85a69caa65abe5f7766482dd0b71eaf201d43e354b8316ff616c865193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2d349f9466eb418a64dabae95c139bf23e943909a37b6b651c90e85ff55bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:10Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:10 crc kubenswrapper[4922]: I1122 02:53:10.783623 4922 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8f0f91018d8ffcc35550e1e567a256d60f9cef7fb07a7c3c21d393e6ff5653f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:10Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:10 crc kubenswrapper[4922]: I1122 02:53:10.799953 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4gbc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"954bb7b8-d710-4e1a-973e-78c04e685f30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8705bc3652c9a3f838ffdc44bb26f87689d9eca09105427a3dc25b730967ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52z2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4gbc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:10Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:10 crc kubenswrapper[4922]: I1122 02:53:10.815075 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:10Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:10 crc kubenswrapper[4922]: I1122 02:53:10.838123 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:10Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:10 crc kubenswrapper[4922]: I1122 02:53:10.852913 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:10Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:10 crc kubenswrapper[4922]: I1122 02:53:10.858423 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:53:10 crc kubenswrapper[4922]: E1122 02:53:10.858596 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:53:14.858568227 +0000 UTC m=+30.897090109 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:53:10 crc kubenswrapper[4922]: I1122 02:53:10.858699 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:53:10 crc kubenswrapper[4922]: E1122 02:53:10.858897 4922 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 02:53:10 crc kubenswrapper[4922]: E1122 02:53:10.858941 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 02:53:14.858931776 +0000 UTC m=+30.897453668 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 02:53:10 crc kubenswrapper[4922]: I1122 02:53:10.893477 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n664h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"826eb92e-c839-4fba-9737-3b52101fec88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n664h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:10Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:10 crc 
kubenswrapper[4922]: I1122 02:53:10.936445 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8552c75b-ff86-4366-ba7d-31f454e4c4d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095da4ba1e99f9564ae3e7c95e0319b9b03a655b824f73a40f378292bd16a613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c36d97db0f554e895c9a66ab9d98a846dab9ea7fdc6fb1694a1e6ba6e8e03caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5806c02c9ab10a6ae71658c5c26a959a0c1af4dc116eebb10456e842e18bc265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\
\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e578ec32c48afdd4b1beffcba79284416d5d388027f807e195d552777eb3ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7d8908fd03ef5111feb365e6882679f026d46d0b404e585b0fb248b3ffccc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:10Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:10 crc kubenswrapper[4922]: I1122 02:53:10.959876 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:53:10 crc kubenswrapper[4922]: I1122 02:53:10.959920 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:53:10 crc kubenswrapper[4922]: I1122 02:53:10.959944 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:53:10 crc kubenswrapper[4922]: E1122 02:53:10.960065 4922 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 02:53:10 crc kubenswrapper[4922]: E1122 02:53:10.960091 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 02:53:10 crc kubenswrapper[4922]: E1122 02:53:10.960133 4922 
projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 02:53:10 crc kubenswrapper[4922]: E1122 02:53:10.960145 4922 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 02:53:10 crc kubenswrapper[4922]: E1122 02:53:10.960157 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 02:53:10 crc kubenswrapper[4922]: E1122 02:53:10.960116 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 02:53:14.960099527 +0000 UTC m=+30.998621419 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 02:53:10 crc kubenswrapper[4922]: E1122 02:53:10.960196 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 02:53:10 crc kubenswrapper[4922]: E1122 02:53:10.960213 4922 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 02:53:10 crc kubenswrapper[4922]: E1122 02:53:10.960222 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-22 02:53:14.96020274 +0000 UTC m=+30.998724632 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 02:53:10 crc kubenswrapper[4922]: E1122 02:53:10.960271 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-22 02:53:14.960250381 +0000 UTC m=+30.998772273 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 02:53:10 crc kubenswrapper[4922]: I1122 02:53:10.962482 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"912136b8-9119-4585-8a8a-fffffe458b82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1364d5e8967ee57d3fe7c3ca85053dbeee65956324a6ff45f53339ee6a380e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1b62ea65c78bb4f19f5fdb2ebed6da4b5554981906058d4d33472e53eecebe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa36ff8bd81008d118d746d10130faac5df3ffdd243309f12b2444acf8e0ac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fe422173d9c9619e9ccbf14fa6b45cd384fb3b32973aa7c32b7597d23818ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1ba37f0d76415fadbf5fe5c3e7b0d0ac1e04923deb445e3ffebaa45f9a3284d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:52:51Z\\\",\\\"message\\\":\\\"W1122 02:52:50.548479 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 02:52:50.548907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763779970 cert, and key in /tmp/serving-cert-1581887030/serving-signer.crt, /tmp/serving-cert-1581887030/serving-signer.key\\\\nI1122 02:52:50.747646 1 observer_polling.go:159] Starting file observer\\\\nW1122 02:52:50.754705 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 02:52:50.754978 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:52:50.756448 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1581887030/tls.crt::/tmp/serving-cert-1581887030/tls.key\\\\\\\"\\\\nF1122 02:52:51.071644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://892025699a91526b7be50d611c0c47a78d080b10f5001683ba5b6614e1168759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:10Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:10 crc kubenswrapper[4922]: I1122 02:53:10.986004 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c46a871bcc893c29dd8f87750ec890ac870f3f59615c5951f473288a19b460c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:10Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:11 crc kubenswrapper[4922]: I1122 02:53:11.022314 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ntq2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"087141fd-fc9b-4685-bada-20260ee96369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb4eafebbd378af8579bc424a11b51eeda38398db994f81c408b9bead685713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4fck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ntq2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:11Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:11 crc kubenswrapper[4922]: I1122 02:53:11.065883 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a581af85a69caa65abe5f7766482dd0b71eaf201d43e354b8316ff616c865193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2d349f9466eb418a64dabae95c139bf23e943909a37b6b651c90e85ff55bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:11Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:11 crc kubenswrapper[4922]: I1122 02:53:11.103693 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e5537a6-a41f-446d-ab40-ec8c7a3edc90\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaf643c85775379271a26c718e36d87319342e6797522f91726ce4084925f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205de8f1aeca28c7877b83da70915cd216631f5d7c28b0c9f3bb748a8f98547e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7d163b34cfb34723c07cfa4592a134291185548d4294cb6fcd1d5d8ab63617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9608708be558bf83d1156f620534bd4c98b8f4fb7f96c972be8dc67524f4546c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:11Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:11 crc kubenswrapper[4922]: I1122 02:53:11.145470 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"402683b1-a29f-4a79-a36c-daf6e8068d0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cf0de2e7bcabf2c1beae2a37caaddb2de2365480082bd95a7843816ec8e3ceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d254b4b131f36e4aae
4019c29e8c53f3f8df991b85eb6f943fb6d2e9d79552f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9j6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:11Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:11 crc kubenswrapper[4922]: I1122 02:53:11.194728 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c7wvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:11Z 
is after 2025-08-24T17:21:41Z" Nov 22 02:53:11 crc kubenswrapper[4922]: I1122 02:53:11.300053 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:53:11 crc kubenswrapper[4922]: I1122 02:53:11.300214 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:53:11 crc kubenswrapper[4922]: E1122 02:53:11.300601 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:53:11 crc kubenswrapper[4922]: E1122 02:53:11.300761 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:53:11 crc kubenswrapper[4922]: I1122 02:53:11.305767 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-df5ll"] Nov 22 02:53:11 crc kubenswrapper[4922]: I1122 02:53:11.306278 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-df5ll" Nov 22 02:53:11 crc kubenswrapper[4922]: I1122 02:53:11.308323 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Nov 22 02:53:11 crc kubenswrapper[4922]: I1122 02:53:11.308330 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Nov 22 02:53:11 crc kubenswrapper[4922]: I1122 02:53:11.309927 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Nov 22 02:53:11 crc kubenswrapper[4922]: I1122 02:53:11.309965 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Nov 22 02:53:11 crc kubenswrapper[4922]: I1122 02:53:11.321442 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:11Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:11 crc kubenswrapper[4922]: I1122 02:53:11.341865 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8f0f91018d8ffcc35550e1e567a256d60f9cef7fb07a7c3c21d393e6ff5653f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:11Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:11 crc kubenswrapper[4922]: I1122 02:53:11.383016 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4gbc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"954bb7b8-d710-4e1a-973e-78c04e685f30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8705bc3652c9a3f838ffdc44bb26f87689d9eca09105427a3dc25b730967ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-52z2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4gbc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:11Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:11 crc kubenswrapper[4922]: I1122 02:53:11.422756 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-df5ll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5672d56-8abd-4aa4-ac8d-1655896397f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klcgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-df5ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:11Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:11 crc kubenswrapper[4922]: I1122 02:53:11.462126 4922 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:11Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:11 crc kubenswrapper[4922]: I1122 02:53:11.466547 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klcgp\" (UniqueName: \"kubernetes.io/projected/b5672d56-8abd-4aa4-ac8d-1655896397f8-kube-api-access-klcgp\") pod \"node-ca-df5ll\" (UID: \"b5672d56-8abd-4aa4-ac8d-1655896397f8\") " pod="openshift-image-registry/node-ca-df5ll" Nov 22 02:53:11 crc kubenswrapper[4922]: I1122 02:53:11.466603 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b5672d56-8abd-4aa4-ac8d-1655896397f8-host\") pod \"node-ca-df5ll\" (UID: \"b5672d56-8abd-4aa4-ac8d-1655896397f8\") " pod="openshift-image-registry/node-ca-df5ll" Nov 22 02:53:11 crc kubenswrapper[4922]: I1122 02:53:11.466642 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b5672d56-8abd-4aa4-ac8d-1655896397f8-serviceca\") pod \"node-ca-df5ll\" (UID: \"b5672d56-8abd-4aa4-ac8d-1655896397f8\") " pod="openshift-image-registry/node-ca-df5ll" Nov 22 02:53:11 crc kubenswrapper[4922]: I1122 02:53:11.506798 4922 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:11Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:11 crc kubenswrapper[4922]: I1122 02:53:11.542614 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ntq2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"087141fd-fc9b-4685-bada-20260ee96369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb4eafebbd378af8579bc424a11b51eeda38398db994f81c408b9bead685713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4fck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ntq2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:11Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:11 crc kubenswrapper[4922]: I1122 02:53:11.546053 4922 generic.go:334] "Generic (PLEG): container finished" podID="826eb92e-c839-4fba-9737-3b52101fec88" containerID="4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954" exitCode=0 Nov 22 02:53:11 crc kubenswrapper[4922]: I1122 02:53:11.546787 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n664h" event={"ID":"826eb92e-c839-4fba-9737-3b52101fec88","Type":"ContainerDied","Data":"4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954"} Nov 22 02:53:11 crc kubenswrapper[4922]: I1122 02:53:11.551319 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" event={"ID":"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7","Type":"ContainerStarted","Data":"f9e0fc46c7e9bbed62a72d78126ea7ffcb4de50dc048c8c0fd90bd0b0d8c718f"} Nov 22 02:53:11 crc kubenswrapper[4922]: I1122 02:53:11.551375 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" 
event={"ID":"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7","Type":"ContainerStarted","Data":"7854642894dce2837e59481c400522a24a2e66c3f9eb521a370e85b203288af9"} Nov 22 02:53:11 crc kubenswrapper[4922]: I1122 02:53:11.551387 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" event={"ID":"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7","Type":"ContainerStarted","Data":"18ceb46477edc2737cbdc850d6387ba175a07e85e69c988f91fc93def44fa1bf"} Nov 22 02:53:11 crc kubenswrapper[4922]: I1122 02:53:11.569106 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klcgp\" (UniqueName: \"kubernetes.io/projected/b5672d56-8abd-4aa4-ac8d-1655896397f8-kube-api-access-klcgp\") pod \"node-ca-df5ll\" (UID: \"b5672d56-8abd-4aa4-ac8d-1655896397f8\") " pod="openshift-image-registry/node-ca-df5ll" Nov 22 02:53:11 crc kubenswrapper[4922]: I1122 02:53:11.569195 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b5672d56-8abd-4aa4-ac8d-1655896397f8-host\") pod \"node-ca-df5ll\" (UID: \"b5672d56-8abd-4aa4-ac8d-1655896397f8\") " pod="openshift-image-registry/node-ca-df5ll" Nov 22 02:53:11 crc kubenswrapper[4922]: I1122 02:53:11.569338 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b5672d56-8abd-4aa4-ac8d-1655896397f8-serviceca\") pod \"node-ca-df5ll\" (UID: \"b5672d56-8abd-4aa4-ac8d-1655896397f8\") " pod="openshift-image-registry/node-ca-df5ll" Nov 22 02:53:11 crc kubenswrapper[4922]: I1122 02:53:11.569504 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b5672d56-8abd-4aa4-ac8d-1655896397f8-host\") pod \"node-ca-df5ll\" (UID: \"b5672d56-8abd-4aa4-ac8d-1655896397f8\") " pod="openshift-image-registry/node-ca-df5ll" Nov 22 02:53:11 crc kubenswrapper[4922]: I1122 02:53:11.570878 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b5672d56-8abd-4aa4-ac8d-1655896397f8-serviceca\") pod \"node-ca-df5ll\" (UID: \"b5672d56-8abd-4aa4-ac8d-1655896397f8\") " pod="openshift-image-registry/node-ca-df5ll" Nov 22 02:53:11 crc kubenswrapper[4922]: I1122 02:53:11.586541 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n664h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"826eb92e-c839-4fba-9737-3b52101fec88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n664h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:11Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:11 crc kubenswrapper[4922]: I1122 02:53:11.622520 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klcgp\" (UniqueName: \"kubernetes.io/projected/b5672d56-8abd-4aa4-ac8d-1655896397f8-kube-api-access-klcgp\") pod \"node-ca-df5ll\" (UID: \"b5672d56-8abd-4aa4-ac8d-1655896397f8\") " pod="openshift-image-registry/node-ca-df5ll" Nov 22 02:53:11 crc kubenswrapper[4922]: I1122 02:53:11.649408 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8552c75b-ff86-4366-ba7d-31f454e4c4d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095da4ba1e99f9564ae3e7c95e0319b9b03a655b824f73a40f378292bd16a613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c36d97db0f554e895c9a66ab9d98a846dab9ea7fdc6fb1694a1e6ba6e8e03caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"contain
erID\\\":\\\"cri-o://5806c02c9ab10a6ae71658c5c26a959a0c1af4dc116eebb10456e842e18bc265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e578ec32c48afdd4b1beffcba79284416d5d388027f807e195d552777eb3ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7d8908fd03ef5111feb365e6882679f026d46d0b404e585b0fb248b3ffccc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d5
3c43386af495a39a388f89842ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:11Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:11 crc kubenswrapper[4922]: I1122 02:53:11.662687 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-df5ll" Nov 22 02:53:11 crc kubenswrapper[4922]: W1122 02:53:11.681155 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5672d56_8abd_4aa4_ac8d_1655896397f8.slice/crio-1ec03e651423399891121d5a935f2a38393a1bd924607dd65e5049484d4a5cc6 WatchSource:0}: Error finding container 1ec03e651423399891121d5a935f2a38393a1bd924607dd65e5049484d4a5cc6: Status 404 returned error can't find the container with id 1ec03e651423399891121d5a935f2a38393a1bd924607dd65e5049484d4a5cc6 Nov 22 02:53:11 crc kubenswrapper[4922]: I1122 02:53:11.687161 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"912136b8-9119-4585-8a8a-fffffe458b82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1364d5e8967ee57d3fe7c3ca85053dbeee65956324a6ff45f53339ee6a380e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1b62ea65c78bb4f19f5fdb2ebed6da4b5554981906058d4d33472e53eecebe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa36ff8bd81008d118d746d10130faac5df3ffdd243309f12b2444acf8e0ac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc4
78274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fe422173d9c9619e9ccbf14fa6b45cd384fb3b32973aa7c32b7597d23818ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1ba37f0d76415fadbf5fe5c3e7b0d0ac1e04923deb445e3ffebaa45f9a3284d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:52:51Z\\\",\\\"message\\\":\\\"W1122 02:52:50.548479 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 02:52:50.548907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763779970 cert, and key in /tmp/serving-cert-1581887030/serving-signer.crt, /tmp/serving-cert-1581887030/serving-signer.key\\\\nI1122 02:52:50.747646 1 observer_polling.go:159] Starting file observer\\\\nW1122 02:52:50.754705 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 02:52:50.754978 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:52:50.756448 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1581887030/tls.crt::/tmp/serving-cert-1581887030/tls.key\\\\\\\"\\\\nF1122 02:52:51.071644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://892025699a91526b7be50d611c0c47a78d080b10f5001683ba5b6614e1168759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:11Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:11 crc kubenswrapper[4922]: I1122 02:53:11.724886 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c46a871bcc893c29dd8f87750ec890ac870f3f59615c5951f473288a19b460c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:11Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:11 crc kubenswrapper[4922]: I1122 02:53:11.764925 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a581af85a69caa65abe5f7766482dd0b71eaf201d43e354b8316ff616c865193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2d349f9466eb418a64dabae95c139bf23e943909a37b6b651c90e85ff55bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:11Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:11 crc kubenswrapper[4922]: I1122 02:53:11.805436 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e5537a6-a41f-446d-ab40-ec8c7a3edc90\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaf643c85775379271a26c718e36d87319342e6797522f91726ce4084925f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205de8f1aeca28c7877b83da70915cd216631f5d7c28b0c9f3bb748a8f98547e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7d163b34cfb34723c07cfa4592a134291185548d4294cb6fcd1d5d8ab63617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9608708be558bf83d1156f620534bd4c98b8f4fb7f96c972be8dc67524f4546c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:11Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:11 crc kubenswrapper[4922]: I1122 02:53:11.841362 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"402683b1-a29f-4a79-a36c-daf6e8068d0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cf0de2e7bcabf2c1beae2a37caaddb2de2365480082bd95a7843816ec8e3ceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d254b4b131f36e4aae
4019c29e8c53f3f8df991b85eb6f943fb6d2e9d79552f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9j6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:11Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:11 crc kubenswrapper[4922]: I1122 02:53:11.887753 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c7wvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:11Z 
is after 2025-08-24T17:21:41Z" Nov 22 02:53:11 crc kubenswrapper[4922]: I1122 02:53:11.931637 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8552c75b-ff86-4366-ba7d-31f454e4c4d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095da4ba1e99f9564ae3e7c95e0319b9b03a655b824f73a40f378292bd16a613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c36d97db0f554e895c9a66ab9d98a846dab9ea7fdc6fb1694a1e6ba6e8e03caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5806c02c9ab10a6ae71658c5c26a959a0c1af4dc116eebb10456e842e18bc265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"
/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e578ec32c48afdd4b1beffcba79284416d5d388027f807e195d552777eb3ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7d8908fd03ef5111feb365e6882679f026d46d0b404e585b0fb248b3ffccc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\"
:\\\"2025-11-22T02:52:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:11Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:11 crc kubenswrapper[4922]: I1122 02:53:11.966906 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"912136b8-9119-4585-8a8a-fffffe458b82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1364d5e8967ee57d3fe7c3ca85053dbeee65956324a6ff45f53339ee6a380e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1b62ea65c78bb4f19f5fdb2ebed6da4b5554981906058d4d33472e53eecebe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa36ff8bd81008d118d746d10130faac5df3ffdd243309f12b2444acf8e0ac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fe422173d9c9619e9ccbf14fa6b45cd384fb3b32973aa7c32b7597d23818ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1ba37f0d76415fadbf5fe5c3e7b0d0ac1e04923deb445e3ffebaa45f9a3284d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:52:51Z\\\",\\\"message\\\":\\\"W1122 02:52:50.548479 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 02:52:50.548907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763779970 cert, and key in /tmp/serving-cert-1581887030/serving-signer.crt, /tmp/serving-cert-1581887030/serving-signer.key\\\\nI1122 02:52:50.747646 1 observer_polling.go:159] Starting file observer\\\\nW1122 02:52:50.754705 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 02:52:50.754978 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:52:50.756448 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1581887030/tls.crt::/tmp/serving-cert-1581887030/tls.key\\\\\\\"\\\\nF1122 02:52:51.071644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://892025699a91526b7be50d611c0c47a78d080b10f5001683ba5b6614e1168759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:11Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:12 crc kubenswrapper[4922]: I1122 02:53:12.002833 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c46a871bcc893c29dd8f87750ec890ac870f3f59615c5951f473288a19b460c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:12Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:12 crc kubenswrapper[4922]: I1122 02:53:12.044491 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ntq2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"087141fd-fc9b-4685-bada-20260ee96369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb4eafebbd378af8579bc424a11b51eeda38398db994f81c408b9bead685713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4fck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ntq2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:12Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:12 crc kubenswrapper[4922]: I1122 02:53:12.085631 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n664h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"826eb92e-c839-4fba-9737-3b52101fec88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":
{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n664h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:12Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:12 crc kubenswrapper[4922]: I1122 02:53:12.123301 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a581af85a69caa65abe5f7766482dd0b71eaf201d43e354b8316ff616c865193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2d349f9466eb418a64dabae95c139bf23e943909a37b6b651c90e85ff55bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\
\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:12Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:12 crc kubenswrapper[4922]: I1122 02:53:12.163819 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e5537a6-a41f-446d-ab40-ec8c7a3edc90\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaf643c85775379271a26c718e36d87319342e6797522f91726ce4084925f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205de8f1aeca28c7877b83da70915cd216631f5d7c28b0c9f3bb748a8f98547e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7d163b34cfb34723c07cfa4592a134291185548d4294cb6fcd1d5d8ab63617\\\",\\\"image\\\":\\\"quay.io/crcont/op
enshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9608708be558bf83d1156f620534bd4c98b8f4fb7f96c972be8dc67524f4546c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:12Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:12 crc kubenswrapper[4922]: I1122 02:53:12.204192 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"402683b1-a29f-4a79-a36c-daf6e8068d0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cf0de2e7bcabf2c1beae2a37caaddb2de2365480082bd95a7843816ec8e3ceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d254b4b131f36e4aae4019c29e8c53f3f8df991b85eb6f943fb6d2e9d79552f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9j6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:12Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:12 crc kubenswrapper[4922]: I1122 02:53:12.257208 4922 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",
\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c7wvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:12Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:12 crc kubenswrapper[4922]: I1122 02:53:12.283220 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-df5ll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5672d56-8abd-4aa4-ac8d-1655896397f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klcgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-df5ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:12Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:12 crc kubenswrapper[4922]: I1122 02:53:12.299590 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:53:12 crc kubenswrapper[4922]: E1122 02:53:12.299820 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
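
The repeated "Failed to update status for pod" entries above and below all reduce to one TLS problem: the network-node-identity admission webhook at https://127.0.0.1:9743 presents a serving certificate whose NotAfter (2025-08-24T17:21:41Z) precedes the node's current clock (2025-11-22T02:53:12Z), so Go's crypto/x509 verifier rejects the handshake and every kubelet status PATCH that must pass through the webhook fails. The interleaved "no CNI configuration file in /etc/kubernetes/cni/net.d/" error is consistent with this: the ovnkube-node containers are still in PodInitializing, so no CNI config has been written yet. Below is a minimal sketch (not part of any cluster tooling) of the validity-window check that produces this exact error string; the PEM path is hypothetical and should point at the webhook's serving certificate:

    // verify_cert_window.go — illustrative only; mirrors the time check
    // inside crypto/x509 that yields "certificate has expired or is not
    // yet valid" during TLS verification.
    package main

    import (
        "crypto/x509"
        "encoding/pem"
        "fmt"
        "os"
        "time"
    )

    func main() {
        // Hypothetical path; substitute the webhook's serving cert.
        data, err := os.ReadFile("/path/to/webhook-serving-cert.pem")
        if err != nil {
            fmt.Fprintln(os.Stderr, err)
            os.Exit(1)
        }
        block, _ := pem.Decode(data)
        if block == nil {
            fmt.Fprintln(os.Stderr, "no PEM block found")
            os.Exit(1)
        }
        cert, err := x509.ParseCertificate(block.Bytes)
        if err != nil {
            fmt.Fprintln(os.Stderr, err)
            os.Exit(1)
        }
        now := time.Now().UTC()
        // A cert is rejected if now < NotBefore or now > NotAfter.
        switch {
        case now.Before(cert.NotBefore):
            fmt.Printf("certificate not yet valid: current time %s is before %s\n",
                now.Format(time.RFC3339), cert.NotBefore.UTC().Format(time.RFC3339))
        case now.After(cert.NotAfter):
            fmt.Printf("certificate has expired: current time %s is after %s\n",
                now.Format(time.RFC3339), cert.NotAfter.UTC().Format(time.RFC3339))
        default:
            fmt.Printf("certificate valid until %s\n", cert.NotAfter.UTC().Format(time.RFC3339))
        }
    }

Until the certificate is rotated (or the node clock corrected), every status patch routed through this webhook will keep failing with the same message, which is why the identical error tail recurs on each entry that follows.
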
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:53:12 crc kubenswrapper[4922]: I1122 02:53:12.322482 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:12Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:12 crc kubenswrapper[4922]: I1122 02:53:12.362437 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:12Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:12 crc kubenswrapper[4922]: I1122 02:53:12.406200 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:12Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:12 crc kubenswrapper[4922]: I1122 02:53:12.444027 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8f0f91018d8ffcc35550e1e567a256d60f9cef7fb07a7c3c21d393e6ff5653f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:12Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:12 crc kubenswrapper[4922]: I1122 02:53:12.503437 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4gbc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"954bb7b8-d710-4e1a-973e-78c04e685f30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8705bc3652c9a3f838ffdc44bb26f87689d9eca09105427a3dc25b730967ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52z2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4gbc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:12Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:12 crc kubenswrapper[4922]: I1122 02:53:12.556185 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-df5ll" event={"ID":"b5672d56-8abd-4aa4-ac8d-1655896397f8","Type":"ContainerStarted","Data":"62a64d211a2e298509a2677a0d64bf8205f1a62e952ee786f34571d4c1e962b8"} Nov 22 02:53:12 crc kubenswrapper[4922]: I1122 02:53:12.556256 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-df5ll" event={"ID":"b5672d56-8abd-4aa4-ac8d-1655896397f8","Type":"ContainerStarted","Data":"1ec03e651423399891121d5a935f2a38393a1bd924607dd65e5049484d4a5cc6"} Nov 22 02:53:12 crc kubenswrapper[4922]: I1122 02:53:12.559500 4922 generic.go:334] "Generic (PLEG): container finished" podID="826eb92e-c839-4fba-9737-3b52101fec88" containerID="4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7" exitCode=0 Nov 22 02:53:12 crc kubenswrapper[4922]: I1122 02:53:12.559573 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n664h" event={"ID":"826eb92e-c839-4fba-9737-3b52101fec88","Type":"ContainerDied","Data":"4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7"} Nov 22 02:53:12 crc kubenswrapper[4922]: I1122 02:53:12.569767 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" event={"ID":"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7","Type":"ContainerStarted","Data":"091e9164e566847ab0605abc3b8f32027bcccff041c48fbf251d28690dcc132d"} Nov 22 02:53:12 crc kubenswrapper[4922]: I1122 02:53:12.569825 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" event={"ID":"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7","Type":"ContainerStarted","Data":"5603a0776e71297b57ae6a438254d6d8abb317252617fbe3b28707e9f96424d7"} Nov 22 02:53:12 crc kubenswrapper[4922]: I1122 02:53:12.569853 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" event={"ID":"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7","Type":"ContainerStarted","Data":"cd701fe5abb8c240fb633df69fbda2e7fb1ac4c76195dae8dfd5b1db811c1d5f"} Nov 22 02:53:12 crc kubenswrapper[4922]: I1122 02:53:12.576327 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:12Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:12 crc kubenswrapper[4922]: I1122 02:53:12.592118 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:12Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:12 crc kubenswrapper[4922]: I1122 02:53:12.604293 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8f0f91018d8ffcc35550e1e567a256d60f9cef7fb07a7c3c21d393e6ff5653f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:12Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:12 crc kubenswrapper[4922]: I1122 02:53:12.645926 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4gbc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"954bb7b8-d710-4e1a-973e-78c04e685f30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8705bc3652c9a3f838ffdc44bb26f87689d9eca09105427a3dc25b730967ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52z2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4gbc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:12Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:12 crc kubenswrapper[4922]: I1122 02:53:12.680870 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-df5ll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5672d56-8abd-4aa4-ac8d-1655896397f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a64d211a2e298509a2677a0d64bf8205f1a62e952ee786f34571d4c1e962b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klcgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-df5ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:12Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:12 crc kubenswrapper[4922]: I1122 02:53:12.723649 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:12Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:12 crc kubenswrapper[4922]: I1122 02:53:12.762817 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c46a871bcc893c29dd8f87750ec890ac870f3f59615c5951f473288a19b460c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:12Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:12 crc kubenswrapper[4922]: I1122 02:53:12.801458 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ntq2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"087141fd-fc9b-4685-bada-20260ee96369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb4eafebbd378af8579bc424a11b51eeda38398db994f81c408b9bead685713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4fck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ntq2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:12Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:12 crc kubenswrapper[4922]: I1122 02:53:12.849420 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n664h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"826eb92e-c839-4fba-9737-3b52101fec88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":
{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n664h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:12Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:12 crc kubenswrapper[4922]: I1122 02:53:12.897913 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8552c75b-ff86-4366-ba7d-31f454e4c4d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095da4ba1e99f9564ae3e7c95e0319b9b03a655b824f73a40f378292bd16a613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c36d97db0f554e895c9a66ab9d98a846dab9ea7fdc6fb1694a1e6ba6e8e03caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5806c02c9ab10a6ae71658c5c26a959a0c1af4dc116eebb10456e842e18bc265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e578ec32c48afdd4b1beffcba79284416d5d388027f807e195d552777eb3ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7d8908fd03ef5111feb365e6882679f026d46d0b404e585b0fb248b3ffccc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"exitCo
de\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:12Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:12 crc kubenswrapper[4922]: I1122 02:53:12.928413 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"912136b8-9119-4585-8a8a-fffffe458b82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1364d5e8967ee57d3fe7c3ca85053dbeee65956324a6ff45f53339ee6a380e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1b62ea65c78bb4f19f5fdb2ebed6da4b5554981906058d4d33472e53eecebe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa36ff8bd81008d118d746d10130faac5df3ffdd243309f12b2444acf8e0ac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fe422173d9c9619e9ccbf14fa6b45cd384fb3b32973aa7c32b7597d23818ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1ba37f0d76415fadbf5fe5c3e7b0d0ac1e04923deb445e3ffebaa45f9a3284d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:52:51Z\\\",\\\"message\\\":\\\"W1122 02:52:50.548479 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 02:52:50.548907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763779970 cert, and key in /tmp/serving-cert-1581887030/serving-signer.crt, /tmp/serving-cert-1581887030/serving-signer.key\\\\nI1122 02:52:50.747646 1 observer_polling.go:159] Starting file observer\\\\nW1122 02:52:50.754705 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 02:52:50.754978 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:52:50.756448 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1581887030/tls.crt::/tmp/serving-cert-1581887030/tls.key\\\\\\\"\\\\nF1122 02:52:51.071644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://892025699a91526b7be50d611c0c47a78d080b10f5001683ba5b6614e1168759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:12Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:12 crc kubenswrapper[4922]: I1122 02:53:12.967933 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a581af85a69caa65abe5f7766482dd0b71eaf201d43e354b8316ff616c865193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2d349f9466eb418a64dabae95c139bf23e943909a37b6b651c90e85ff55bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:12Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:13 crc kubenswrapper[4922]: I1122 02:53:13.012457 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c7wvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:13Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:13 crc kubenswrapper[4922]: I1122 02:53:13.049990 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e5537a6-a41f-446d-ab40-ec8c7a3edc90\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaf643c85775379271a26c718e36d87319342e6797522f91726ce4084925f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205de8f1aeca28c7877b83da70915cd216631f5d7c28b0c9f3bb748a8f98547e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de25971
26bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7d163b34cfb34723c07cfa4592a134291185548d4294cb6fcd1d5d8ab63617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9608708be558bf83d1156f620534bd4c98b8f4fb7f96c972be8dc67524f4546c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:13Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:13 crc kubenswrapper[4922]: I1122 02:53:13.087660 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"402683b1-a29f-4a79-a36c-daf6e8068d0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cf0de2e7bcabf2c1beae2a37caaddb2de2365480082bd95a7843816ec8e3ceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d254b4b131f36e4aae4019c29e8c53f3f8df991b85eb6f943fb6d2e9d79552f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9j6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:13Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:13 crc kubenswrapper[4922]: I1122 02:53:13.122053 4922 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-ntq2p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"087141fd-fc9b-4685-bada-20260ee96369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb4eafebbd378af8579bc424a11b51eeda38398db994f81c408b9bead685713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4fck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ntq2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:13Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:13 crc kubenswrapper[4922]: I1122 02:53:13.169679 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n664h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"826eb92e-c839-4fba-9737-3b52101fec88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n664h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:13Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:13 crc kubenswrapper[4922]: I1122 02:53:13.230344 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8552c75b-ff86-4366-ba7d-31f454e4c4d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095da4ba1e99f9564ae3e7c95e0319b9b03a655b824f73a40f378292bd16a613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c36d97db0f554e895c9a66ab9d98a846dab9ea7fdc6fb1694a1e6ba6e8e03caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5806c02c9ab10a6ae71658c5c26a959a0c1af4dc116eebb10456e842e18bc265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e578ec32c48afdd4b1beffcba79284416d5d388027f807e195d552777eb3ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7d8908fd03ef5111feb365e6882679f026d46d0b404e585b0fb248b3ffccc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b900922
72e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:13Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:13 crc kubenswrapper[4922]: I1122 02:53:13.250640 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"912136b8-9119-4585-8a8a-fffffe458b82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1364d5e8967ee57d3fe7c3ca85053dbeee65956324a6ff45f53339ee6a380e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1b62ea65c78bb4f19f5fdb2ebed6da4b5554981906058d4d33472e53eecebe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa36ff8bd81008d118d746d10130faac5df3ffdd243309f12b2444acf8e0ac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fe422173d9c9619e9ccbf14fa6b45cd384fb3b32973aa7c32b7597d23818ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1ba37f0d76415fadbf5fe5c3e7b0d0ac1e04923deb445e3ffebaa45f9a3284d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:52:51Z\\\",\\\"message\\\":\\\"W1122 02:52:50.548479 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 02:52:50.548907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763779970 cert, and key in /tmp/serving-cert-1581887030/serving-signer.crt, /tmp/serving-cert-1581887030/serving-signer.key\\\\nI1122 02:52:50.747646 1 observer_polling.go:159] Starting file observer\\\\nW1122 02:52:50.754705 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 02:52:50.754978 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:52:50.756448 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1581887030/tls.crt::/tmp/serving-cert-1581887030/tls.key\\\\\\\"\\\\nF1122 02:52:51.071644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://892025699a91526b7be50d611c0c47a78d080b10f5001683ba5b6614e1168759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:13Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:13 crc kubenswrapper[4922]: I1122 02:53:13.293711 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c46a871bcc893c29dd8f87750ec890ac870f3f59615c5951f473288a19b460c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:13Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:13 crc kubenswrapper[4922]: I1122 02:53:13.299573 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:53:13 crc kubenswrapper[4922]: I1122 02:53:13.299607 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:53:13 crc kubenswrapper[4922]: E1122 02:53:13.299747 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:53:13 crc kubenswrapper[4922]: E1122 02:53:13.299940 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:53:13 crc kubenswrapper[4922]: I1122 02:53:13.327632 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a581af85a69caa65abe5f7766482dd0b71eaf201d43e354b8316ff616c865193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2d349f9466eb418a64dabae95c139bf23e943909a37b6b651c90e85ff55bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:13Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:13 crc kubenswrapper[4922]: I1122 02:53:13.366007 4922 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e5537a6-a41f-446d-ab40-ec8c7a3edc90\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaf643c85775379271a26c718e36d87319342e6797522f91726ce4084925f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205de8f1aeca28c7877b83da70915cd216631f5d7c28b0c9f3bb748a8f98547e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7d163b34cfb34723c07cfa4592a134291185548d4294cb6fcd1d5d8ab63617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9608708be558bf83d1156f620534bd4c98b8f4fb7f96c972be
8dc67524f4546c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:13Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:13 crc kubenswrapper[4922]: I1122 02:53:13.403501 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"402683b1-a29f-4a79-a36c-daf6e8068d0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cf0de2e7bcabf2c1beae2a37caaddb2de2365480082bd95a7843816ec8e3ceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d254b4b131f36e4aae4019c29e8c53f3f8df991b85eb6f943fb6d2e9d79552f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9j6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:13Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:13 crc kubenswrapper[4922]: I1122 02:53:13.460788 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c7wvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:13Z 
is after 2025-08-24T17:21:41Z" Nov 22 02:53:13 crc kubenswrapper[4922]: I1122 02:53:13.487045 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:13Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:13 crc kubenswrapper[4922]: I1122 02:53:13.527620 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8f0f91018d8ffcc35550e1e567a256d60f9cef7fb07a7c3c21d393e6ff5653f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:13Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:13 crc kubenswrapper[4922]: I1122 02:53:13.572345 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4gbc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"954bb7b8-d710-4e1a-973e-78c04e685f30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8705bc3652c9a3f838ffdc44bb26f87689d9eca09105427a3dc25b730967ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52z2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4gbc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:13Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:13 crc kubenswrapper[4922]: I1122 02:53:13.585136 4922 generic.go:334] "Generic (PLEG): container finished" podID="826eb92e-c839-4fba-9737-3b52101fec88" containerID="0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9" exitCode=0 Nov 22 02:53:13 crc kubenswrapper[4922]: I1122 02:53:13.585193 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n664h" event={"ID":"826eb92e-c839-4fba-9737-3b52101fec88","Type":"ContainerDied","Data":"0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9"} Nov 22 02:53:13 crc kubenswrapper[4922]: I1122 02:53:13.610894 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-df5ll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5672d56-8abd-4aa4-ac8d-1655896397f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a64d211a2e298509a2677a0d64bf8205f1a62e952ee786f34571d4c1e962b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klcgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-df5ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:13Z is 
after 2025-08-24T17:21:41Z" Nov 22 02:53:13 crc kubenswrapper[4922]: I1122 02:53:13.652181 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:13Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:13 crc kubenswrapper[4922]: I1122 02:53:13.691660 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:13Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:13 crc kubenswrapper[4922]: I1122 02:53:13.730974 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a581af85a69caa65abe5f7766482dd0b71eaf201d43e354b8316ff616c865193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2d349f9466eb418a64dabae95c139bf23e943909a37b6b651c90e85ff55bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:13Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:13 crc kubenswrapper[4922]: I1122 02:53:13.770079 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e5537a6-a41f-446d-ab40-ec8c7a3edc90\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaf643c85775379271a26c718e36d87319342e6797522f91726ce4084925f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205de8f1aeca28c7877b83da70915cd216631f5d7c28b0c9f3bb748a8f98547e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7d163b34cfb34723c07cfa4592a134291185548d4294cb6fcd1d5d8ab63617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9608708be558bf83d1156f620534bd4c98b8f4fb7f96c972be8dc67524f4546c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:13Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:13 crc kubenswrapper[4922]: I1122 02:53:13.810398 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"402683b1-a29f-4a79-a36c-daf6e8068d0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cf0de2e7bcabf2c1beae2a37caaddb2de2365480082bd95a7843816ec8e3ceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d254b4b131f36e4aae4019c29e8c53f3f8df991b85eb6f943fb6d2e9d79552f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9j6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:13Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:13 crc kubenswrapper[4922]: I1122 02:53:13.854725 4922 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",
\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c7wvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:13Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:13 crc kubenswrapper[4922]: I1122 02:53:13.884980 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4gbc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"954bb7b8-d710-4e1a-973e-78c04e685f30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8705bc3652c9a3f838ffdc44bb26f87689d9eca09105427a3dc25b730967ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":
\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52z2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4gbc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:13Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:13 crc kubenswrapper[4922]: I1122 02:53:13.921602 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-df5ll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5672d56-8abd-4aa4-ac8d-1655896397f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a64d211a2e298509a2677a0d64bf8205f1a62e952ee786f34571d4c1e962b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klcgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-df5ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:13Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:13 crc kubenswrapper[4922]: I1122 02:53:13.965780 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:13Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.003862 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:14Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.044376 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:14Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.078895 4922 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.081544 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.081598 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.081616 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.081778 4922 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.087126 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8f0f91018d8ffcc35550e1e567a256d60f9cef7fb07a7c3c21d393e6ff5653f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:14Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.135353 4922 kubelet_node_status.go:115] "Node was previously registered" node="crc" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.135657 4922 kubelet_node_status.go:79] "Successfully registered node" node="crc" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.137366 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.137442 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.137465 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.137493 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.137512 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:14Z","lastTransitionTime":"2025-11-22T02:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:14 crc kubenswrapper[4922]: E1122 02:53:14.158375 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e949e6da-04e3-4086-acd2-53671701d8c1\\\",\\\"systemUUID\\\":\\\"7e87d562-50ca-4b7a-8b5f-35220f4abd2d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:14Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.165464 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.165501 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.165516 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.165540 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.165553 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:14Z","lastTransitionTime":"2025-11-22T02:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.171363 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8552c75b-ff86-4366-ba7d-31f454e4c4d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095da4ba1e99f9564ae3e7c95e0319b9b03a655b824f73a40f378292bd16a613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c36d97db0f554e895c9a66ab9d98a846dab9ea7fdc6fb1694a1e6ba6e8e03caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resourc
es\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5806c02c9ab10a6ae71658c5c26a959a0c1af4dc116eebb10456e842e18bc265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e578ec32c48afdd4b1beffcba79284416d5d388027f807e195d552777eb3ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7d8908fd03ef5111feb365e6882679f026d46d0b404e585b0fb248b3ffccc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:14Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:14 crc kubenswrapper[4922]: E1122 02:53:14.183167 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e949e6da-04e3-4086-acd2-53671701d8c1\\\",\\\"systemUUID\\\":\\\"7e87d562-50ca-4b7a-8b5f-35220f4abd2d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:14Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.187899 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.187984 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.188010 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.188042 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.188065 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:14Z","lastTransitionTime":"2025-11-22T02:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:14 crc kubenswrapper[4922]: E1122 02:53:14.204668 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e949e6da-04e3-4086-acd2-53671701d8c1\\\",\\\"systemUUID\\\":\\\"7e87d562-50ca-4b7a-8b5f-35220f4abd2d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:14Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.206736 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"912136b8-9119-4585-8a8a-fffffe458b82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1364d5e8967ee57d3fe7c3ca85053dbeee65956324a6ff45f53339ee6a380e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1b62ea65c78bb4f19f5fdb2ebed6da4b5554981906058d4d33472e53eecebe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa36ff8bd81008d118d746d10130faac5df3ffdd243309f12b2444acf8e0ac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fe422173d9c9619e9ccbf14fa6b45cd384fb3b32973aa7c32b7597d23818ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1ba37f0d76415fadbf5fe5c3e7b0d0ac1e04923deb445e3ffebaa45f9a3284d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:52:51Z\\\",\\\"message\\\":\\\"W1122 02:52:50.548479 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 02:52:50.548907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763779970 cert, and key in /tmp/serving-cert-1581887030/serving-signer.crt, /tmp/serving-cert-1581887030/serving-signer.key\\\\nI1122 02:52:50.747646 1 observer_polling.go:159] Starting file observer\\\\nW1122 02:52:50.754705 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 02:52:50.754978 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:52:50.756448 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1581887030/tls.crt::/tmp/serving-cert-1581887030/tls.key\\\\\\\"\\\\nF1122 02:52:51.071644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://892025699a91526b7be50d611c0c47a78d080b10f5001683ba5b6614e1168759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:14Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.210066 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.210119 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.210146 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.210179 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.210201 4922 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:14Z","lastTransitionTime":"2025-11-22T02:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:14 crc kubenswrapper[4922]: E1122 02:53:14.226394 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e949e6da-04e3-4086-acd2-53671701d8c1\\\",\\\"systemUUID\\\":\\\"7e87d562-50ca-4b7a-8b5f-35220f4abd2d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:14Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.231376 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.231417 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.231428 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.231449 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.231462 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:14Z","lastTransitionTime":"2025-11-22T02:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.248653 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c46a871bcc893c29dd8f87750ec890ac870f3f59615c5951f473288a19b460c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:14Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:14 crc kubenswrapper[4922]: E1122 02:53:14.251660 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae
669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e949e6da-04e3-4086-acd2-53671701d8c1\\\",\\\"systemUUID\\\":\\\"7e87d562-50ca-4b7a-8b5f-35220f4abd2d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:14Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:14 crc kubenswrapper[4922]: E1122 02:53:14.251913 4922 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.254128 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.254180 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.254201 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.254226 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.254246 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:14Z","lastTransitionTime":"2025-11-22T02:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.282240 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ntq2p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"087141fd-fc9b-4685-bada-20260ee96369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb4eafebbd378af8579bc424a11b51eeda38398db994f81c408b9bead685713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4fck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ntq2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:14Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.299454 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:53:14 crc kubenswrapper[4922]: E1122 02:53:14.299559 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.331623 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n664h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"826eb92e-c839-4fba-9737-3b52101fec88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPat
h\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580
bb19b1409f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n664h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:14Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.357272 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.357349 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.357368 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.357399 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.357425 4922 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:14Z","lastTransitionTime":"2025-11-22T02:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.460941 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.461005 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.461018 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.461038 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.461053 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:14Z","lastTransitionTime":"2025-11-22T02:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.564607 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.564670 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.564688 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.564716 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.564733 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:14Z","lastTransitionTime":"2025-11-22T02:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.603040 4922 generic.go:334] "Generic (PLEG): container finished" podID="826eb92e-c839-4fba-9737-3b52101fec88" containerID="02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4" exitCode=0 Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.603223 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n664h" event={"ID":"826eb92e-c839-4fba-9737-3b52101fec88","Type":"ContainerDied","Data":"02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4"} Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.615900 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" event={"ID":"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7","Type":"ContainerStarted","Data":"183f71d77b7a1768eee27187f0c1e4a893f5661fa42cf671332e081118ae7dea"} Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.625612 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a581af85a69caa65abe5f7766482dd0b71eaf201d43e354b8316ff616c865193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2d349f9466eb418a64dabae95c139bf23e943909a37b6b651c90e85ff55bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mo
untPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:14Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.645883 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"402683b1-a29f-4a79-a36c-daf6e8068d0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cf0de2e7bcabf2c1beae2a37caaddb2de2365480082bd95a7843816ec8e3ceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d254b4b131f36e4aae4019c29e8c53f3f8df991b85eb6f943fb6d2e9d79552f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9j6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:14Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.666293 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c7wvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:14Z 
is after 2025-08-24T17:21:41Z" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.668066 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.668110 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.668119 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.668135 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.668146 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:14Z","lastTransitionTime":"2025-11-22T02:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.686535 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e5537a6-a41f-446d-ab40-ec8c7a3edc90\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaf643c85775379271a26c718e36d87319342e6797522f91726ce4084925f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205de8f1aeca28c7877b83da70915cd216631f5d7c28b0c9f3bb748a8f98547e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7d163b34cfb34723c07cfa4592a134291185548d4294cb6fcd1d5d8ab63617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9608708be558bf83d1156f620534bd4c98b8f4fb7f96c972be8dc67524f4546c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:14Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.705422 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:14Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.722284 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:14Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.738962 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:14Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.760427 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8f0f91018d8ffcc35550e1e567a256d60f9cef7fb07a7c3c21d393e6ff5653f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:14Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.770676 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:14 
crc kubenswrapper[4922]: I1122 02:53:14.770730 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.770743 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.770762 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.770776 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:14Z","lastTransitionTime":"2025-11-22T02:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.782838 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4gbc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"954bb7b8-d710-4e1a-973e-78c04e685f30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8705bc3652c9a3f838ffdc44bb26f87689d9eca09105427a3dc25b730967ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\"
,\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52z2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4gbc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:14Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.800089 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-df5ll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5672d56-8abd-4aa4-ac8d-1655896397f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a64d211a2e298509a2677a0d64bf8205f1a62e952ee786f34571d4c1e962b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klcgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.16
8.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-df5ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:14Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.828255 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"912136b8-9119-4585-8a8a-fffffe458b82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1364d5e8967ee57d3fe7c3ca85053dbeee65956324a6ff45f53339ee6a380e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1b62ea65c78bb4f19f5fdb2ebed6da4b5554981906058d4d33472e53eecebe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa36ff8bd81008d118d746d10130faac5df3ffdd243309f12b2444acf8e0ac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc
478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fe422173d9c9619e9ccbf14fa6b45cd384fb3b32973aa7c32b7597d23818ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1ba37f0d76415fadbf5fe5c3e7b0d0ac1e04923deb445e3ffebaa45f9a3284d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:52:51Z\\\",\\\"message\\\":\\\"W1122 02:52:50.548479 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 02:52:50.548907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763779970 cert, and key in /tmp/serving-cert-1581887030/serving-signer.crt, /tmp/serving-cert-1581887030/serving-signer.key\\\\nI1122 02:52:50.747646 1 observer_polling.go:159] Starting file observer\\\\nW1122 02:52:50.754705 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 02:52:50.754978 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:52:50.756448 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1581887030/tls.crt::/tmp/serving-cert-1581887030/tls.key\\\\\\\"\\\\nF1122 02:52:51.071644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://892025699a91526b7be50d611c0c47a78d080b10f5001683ba5b6614e1168759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:14Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.849811 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c46a871bcc893c29dd8f87750ec890ac870f3f59615c5951f473288a19b460c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:14Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.862596 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ntq2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"087141fd-fc9b-4685-bada-20260ee96369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb4eafebbd378af8579bc424a11b51eeda38398db994f81c408b9bead685713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4fck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ntq2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:14Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.873755 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.874078 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.874224 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.874399 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.874536 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:14Z","lastTransitionTime":"2025-11-22T02:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.890922 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n664h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"826eb92e-c839-4fba-9737-3b52101fec88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.i
o/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"exitCode\\\":0,\\\"
finishedAt\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n664h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:14Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.911040 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.911201 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:53:14 crc kubenswrapper[4922]: E1122 02:53:14.911329 4922 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 02:53:14 crc kubenswrapper[4922]: E1122 02:53:14.911413 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 02:53:22.911391987 +0000 UTC m=+38.949913909 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 02:53:14 crc kubenswrapper[4922]: E1122 02:53:14.911675 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:53:22.911638483 +0000 UTC m=+38.950160415 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.927211 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8552c75b-ff86-4366-ba7d-31f454e4c4d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095da4ba1e99f9564ae3e7c95e0319b9b03a655b824f73a40f378292bd16a613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c36d97db0f554e895c9a66ab9d98a846dab9ea7fdc6fb1694a1e6ba6e8e03caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5806c02c9ab10a6ae71658c5c26a959a0c1af4dc116eebb10456e842e18bc265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e578ec32c48afdd4b1beffcba79284416d5d388027f807e195d552777eb3ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7d8908fd03ef5111feb365e6882679f026d46d0b404e585b0fb248b3ffccc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:14Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.978388 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.978452 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.978470 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.978495 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:14 crc kubenswrapper[4922]: I1122 02:53:14.978515 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:14Z","lastTransitionTime":"2025-11-22T02:53:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.012266 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.012341 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.012399 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:53:15 crc kubenswrapper[4922]: E1122 02:53:15.012491 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 02:53:15 crc kubenswrapper[4922]: E1122 02:53:15.012524 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 02:53:15 crc kubenswrapper[4922]: E1122 02:53:15.012540 4922 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 02:53:15 crc kubenswrapper[4922]: E1122 02:53:15.012610 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-22 02:53:23.012587421 +0000 UTC m=+39.051109323 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 02:53:15 crc kubenswrapper[4922]: E1122 02:53:15.012663 4922 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 02:53:15 crc kubenswrapper[4922]: E1122 02:53:15.012683 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 02:53:15 crc kubenswrapper[4922]: E1122 02:53:15.012742 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 02:53:15 crc kubenswrapper[4922]: E1122 02:53:15.012769 4922 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 02:53:15 crc kubenswrapper[4922]: E1122 02:53:15.012745 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 02:53:23.012721034 +0000 UTC m=+39.051242956 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 02:53:15 crc kubenswrapper[4922]: E1122 02:53:15.012965 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-22 02:53:23.012916918 +0000 UTC m=+39.051438870 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.082070 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.082136 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.082155 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.082214 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.082235 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:15Z","lastTransitionTime":"2025-11-22T02:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.186800 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.186940 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.186976 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.187009 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.187030 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:15Z","lastTransitionTime":"2025-11-22T02:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.290504 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.290553 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.290564 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.290582 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.290592 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:15Z","lastTransitionTime":"2025-11-22T02:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.300097 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.300129 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:53:15 crc kubenswrapper[4922]: E1122 02:53:15.300227 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:53:15 crc kubenswrapper[4922]: E1122 02:53:15.300360 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.321425 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a581af85a69caa65abe5f7766482dd0b71eaf201d43e354b8316ff616c865193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2d349f9466eb418a64dabae95c139bf23e943909a37b6b651c90e85ff55bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:15Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.339688 4922 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"402683b1-a29f-4a79-a36c-daf6e8068d0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cf0de2e7bcabf2c1beae2a37caaddb2de2365480082bd95a7843816ec8e3ceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d254b4b131f36e4aae4019c29e8c53f3f8df991b85eb6f943fb6d2e9d79552f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9j6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:15Z is after 2025-08-24T17:21:41Z" Nov 22 
02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.375342 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reas
on\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"image\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c7wvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:15Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.393609 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.393677 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.393702 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.393736 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.393755 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:15Z","lastTransitionTime":"2025-11-22T02:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.404666 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e5537a6-a41f-446d-ab40-ec8c7a3edc90\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaf643c85775379271a26c718e36d87319342e6797522f91726ce4084925f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205de8f1aeca28c7877b83da70915cd216631f5d7c28b0c9f3bb748a8f98547e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7d163b34cfb34723c07cfa4592a134291185548d4294cb6fcd1d5d8ab63617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9608708be558bf83d1156f620534bd4c98b8f4fb7f96c972be8dc67524f4546c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:15Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.421743 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:15Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.441129 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:15Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.457412 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:15Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.474244 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8f0f91018d8ffcc35550e1e567a256d60f9cef7fb07a7c3c21d393e6ff5653f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:15Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.497918 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4gbc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"954bb7b8-d710-4e1a-973e-78c04e685f30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8705bc3652c9a3f838ffdc44bb26f87689d9eca09105427a3dc25b730967ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52z2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4gbc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:15Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.498193 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.498232 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.498245 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.498265 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.498277 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:15Z","lastTransitionTime":"2025-11-22T02:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.510522 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-df5ll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5672d56-8abd-4aa4-ac8d-1655896397f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a64d211a2e298509a2677a0d64bf8205f1a62e952ee786f34571d4c1e962b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klcgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"h
ostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-df5ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:15Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.529962 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"912136b8-9119-4585-8a8a-fffffe458b82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1364d5e8967ee57d3fe7c3ca85053dbeee65956324a6ff45f53339ee6a380e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1b62ea65c78bb4f19f5fdb2ebed6da4b5554981906058d4d33472e53eecebe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa36ff8bd81008d118d746d10130faac5df3ffdd243309f12b2444acf8e0ac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha2
56:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fe422173d9c9619e9ccbf14fa6b45cd384fb3b32973aa7c32b7597d23818ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1ba37f0d76415fadbf5fe5c3e7b0d0ac1e04923deb445e3ffebaa45f9a3284d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:52:51Z\\\",\\\"message\\\":\\\"W1122 02:52:50.548479 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 02:52:50.548907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763779970 cert, and key in /tmp/serving-cert-1581887030/serving-signer.crt, /tmp/serving-cert-1581887030/serving-signer.key\\\\nI1122 02:52:50.747646 1 observer_polling.go:159] Starting file observer\\\\nW1122 02:52:50.754705 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 02:52:50.754978 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:52:50.756448 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1581887030/tls.crt::/tmp/serving-cert-1581887030/tls.key\\\\\\\"\\\\nF1122 02:52:51.071644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://892025699a91526b7be50d611c0c47a78d080b10f5001683ba5b6614e1168759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:15Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.543584 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c46a871bcc893c29dd8f87750ec890ac870f3f59615c5951f473288a19b460c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:15Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.557398 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ntq2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"087141fd-fc9b-4685-bada-20260ee96369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb4eafebbd378af8579bc424a11b51eeda38398db994f81c408b9bead685713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4fck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ntq2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:15Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.577334 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n664h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"826eb92e-c839-4fba-9737-3b52101fec88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\
\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n664h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:15Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.602498 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.602561 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.602583 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.602611 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.602630 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:15Z","lastTransitionTime":"2025-11-22T02:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.606734 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8552c75b-ff86-4366-ba7d-31f454e4c4d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095da4ba1e99f9564ae3e7c95e0319b9b03a655b824f73a40f378292bd16a613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c36d97db0f554e895c9a66ab9d98a846dab9ea7fdc6fb1694a1e6ba6e8e03caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5806c02c9ab10a6ae71658c5c26a959a0c1af4dc116eebb10456e842e18bc265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e578ec32c48afdd4b1beffcba79284416d5d388027f807e195d552777eb3ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7d8908fd03ef5111feb365e6882679f026d46d0b404e585b0fb248b3ffccc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:15Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.626107 4922 generic.go:334] "Generic (PLEG): container finished" podID="826eb92e-c839-4fba-9737-3b52101fec88" containerID="f0125b85a532a9445c19a61f7bdd47ccd48ee48f067df5c041c6bf98daae9630" exitCode=0 Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.626181 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n664h" event={"ID":"826eb92e-c839-4fba-9737-3b52101fec88","Type":"ContainerDied","Data":"f0125b85a532a9445c19a61f7bdd47ccd48ee48f067df5c041c6bf98daae9630"} Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.650988 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8552c75b-ff86-4366-ba7d-31f454e4c4d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095da4ba1e99f9564ae3e7c95e0319b9b03a655b824f73a40f378292bd16a613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c36d97db0f554e895c9a66ab9d98a846dab9ea7fdc6fb1694a1e6ba6e8e03caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5806c02c9ab10a6ae71658c5c26a959a0c1af4dc116eebb10456e842e18bc265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e578ec32c48afdd4b1beffcba79284416d5d38
8027f807e195d552777eb3ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7d8908fd03ef5111feb365e6882679f026d46d0b404e585b0fb248b3ffccc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:15Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.669729 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"912136b8-9119-4585-8a8a-fffffe458b82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1364d5e8967ee57d3fe7c3ca85053dbeee65956324a6ff45f53339ee6a380e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1b62ea65c78bb4f19f5fdb2ebed6da4
b5554981906058d4d33472e53eecebe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa36ff8bd81008d118d746d10130faac5df3ffdd243309f12b2444acf8e0ac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fe422173d9c9619e9ccbf14fa6b45cd384fb3b32973aa7c32b7597d23818ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1ba37f0d76415fadbf5fe5c3e7b0d0ac1e04923deb445e3ffebaa45f9a3284d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:52:51Z\\\",\\\"message\\\":\\\"W1122 02:52:50.548479 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 02:52:50.548907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763779970 cert, and key in /tmp/serving-cert-1581887030/serving-signer.crt, /tmp/serving-cert-1581887030/serving-signer.key\\\\nI1122 02:52:50.747646 1 observer_polling.go:159] Starting file observer\\\\nW1122 02:52:50.754705 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 02:52:50.754978 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:52:50.756448 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1581887030/tls.crt::/tmp/serving-cert-1581887030/tls.key\\\\\\\"\\\\nF1122 02:52:51.071644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://892025699a91526b7be50d611c0c47a78d080b10f5001683ba5b6614e1168759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:15Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.693478 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c46a871bcc893c29dd8f87750ec890ac870f3f59615c5951f473288a19b460c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:15Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.706627 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.706782 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.706907 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.706941 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.707006 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:15Z","lastTransitionTime":"2025-11-22T02:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.714829 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ntq2p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"087141fd-fc9b-4685-bada-20260ee96369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb4eafebbd378af8579bc424a11b51eeda38398db994f81c408b9bead685713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4fck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ntq2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:15Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.738162 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n664h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"826eb92e-c839-4fba-9737-3b52101fec88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0125b85a532a9445c19a61f7bdd47ccd48ee48f067df5c041c6bf98daae9630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0125b85a532a9445c19a61f7bdd47ccd48ee48f067df5c041c6bf98daae9630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n664h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:15Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.767728 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a581af85a69caa65abe5f7766482dd0b71eaf201d43e354b8316ff616c865193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2d349f9466eb418a64dabae95c139bf23e943909a37b6b651c90e85ff55bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:15Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.808722 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e5537a6-a41f-446d-ab40-ec8c7a3edc90\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaf643c85775379271a26c718e36d87319342e6797522f91726ce4084925f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205de8f1aeca28c7877b83da70915cd216631f5d7c28b0c9f3bb748a8f98547e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7d163b34cfb34723c07cfa4592a134291185548d4294cb6fcd1d5d8ab63617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9608708be558bf83d1156f620534bd4c98b8f4fb7f96c972be8dc67524f4546c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:15Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.810897 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.810947 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.810957 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.810975 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.810989 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:15Z","lastTransitionTime":"2025-11-22T02:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.846217 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"402683b1-a29f-4a79-a36c-daf6e8068d0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cf0de2e7bcabf2c1beae2a37caaddb2de2365480082bd95a7843816ec8e3ceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d254b4b131f36e4aae4019c29e8c53f3f8df991b85eb6f943fb6d2e9d79552f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9j6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:15Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.891737 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"n
ame\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIP
s\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c7wvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:15Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.914720 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.914779 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.914793 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.914817 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.914830 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:15Z","lastTransitionTime":"2025-11-22T02:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.928271 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4gbc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"954bb7b8-d710-4e1a-973e-78c04e685f30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8705bc3652c9a3f838ffdc44bb26f87689d9eca09105427a3dc25b730967ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52z2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4gbc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:15Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:15 crc kubenswrapper[4922]: I1122 02:53:15.965724 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-df5ll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5672d56-8abd-4aa4-ac8d-1655896397f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a64d211a2e298509a2677a0d64bf8205f1a62e952ee786f34571d4c1e962b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klcgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-df5ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:15Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.010364 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:16Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.017876 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.017912 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.017922 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.017940 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.017957 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:16Z","lastTransitionTime":"2025-11-22T02:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.044509 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:16Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.088394 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:16Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.122089 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.122535 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.122723 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.123009 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.123200 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:16Z","lastTransitionTime":"2025-11-22T02:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.127069 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8f0f91018d8ffcc35550e1e567a256d60f9cef7fb07a7c3c21d393e6ff5653f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:16Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.226732 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.227403 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.227425 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.227456 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.227474 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:16Z","lastTransitionTime":"2025-11-22T02:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.299486 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:53:16 crc kubenswrapper[4922]: E1122 02:53:16.299654 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.329755 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.329803 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.329815 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.329837 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.329871 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:16Z","lastTransitionTime":"2025-11-22T02:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.432961 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.433031 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.433051 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.433082 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.433105 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:16Z","lastTransitionTime":"2025-11-22T02:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.535757 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.535915 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.535945 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.535981 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.536007 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:16Z","lastTransitionTime":"2025-11-22T02:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.638724 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.638773 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n664h" event={"ID":"826eb92e-c839-4fba-9737-3b52101fec88","Type":"ContainerStarted","Data":"bccfd5290a47556c0ac2088ad8d34ab89870aec1d03311431a50dd857a43e5dd"} Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.638794 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.638897 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.638922 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.638941 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:16Z","lastTransitionTime":"2025-11-22T02:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.663269 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:16Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.682599 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8f0f91018d8ffcc35550e1e567a256d60f9cef7fb07a7c3c21d393e6ff5653f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:16Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.703663 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4gbc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"954bb7b8-d710-4e1a-973e-78c04e685f30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8705bc3652c9a3f838ffdc44bb26f87689d9eca09105427a3dc25b730967ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52z2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4gbc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:16Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.718520 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-df5ll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5672d56-8abd-4aa4-ac8d-1655896397f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a64d211a2e298509a2677a0d64bf8205f1a62e952ee786f34571d4c1e962b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klcgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-df5ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:16Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.734334 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:16Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.742273 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.742330 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.742345 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.742371 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.742389 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:16Z","lastTransitionTime":"2025-11-22T02:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.751437 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:16Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.766471 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ntq2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"087141fd-fc9b-4685-bada-20260ee96369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb4eafebbd378af8579bc424a11b51eeda38398db994f81c408b9bead685713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4fck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ntq2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:16Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.788375 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n664h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"826eb92e-c839-4fba-9737-3b52101fec88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccfd5290a47556c0ac2088ad8d34ab89870aec1d03311431a50dd857a43e5dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0125b85a532a9445c19a61f7bdd47ccd48ee48f067df5c041c6bf98daae9630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0125b85a532a9445c19a61f7bdd47ccd48ee48f067df5c041c6bf98daae9630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n664h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:16Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.826651 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8552c75b-ff86-4366-ba7d-31f454e4c4d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095da4ba1e99f9564ae3e7c95e0319b9b03a655b824f73a40f378292bd16a613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c36d97db0f554e895c9a66ab9d98a846dab9ea7fdc6fb1694a1e6ba6e8e03caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5806c02c9ab10a6ae71658c5c26a959a0c1af4dc116eebb10456e842e18bc265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e578ec32c48afdd4b1beffcba79284416d5d38
8027f807e195d552777eb3ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7d8908fd03ef5111feb365e6882679f026d46d0b404e585b0fb248b3ffccc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:16Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.845890 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.845944 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.845959 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.845983 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.846001 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:16Z","lastTransitionTime":"2025-11-22T02:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.848834 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"912136b8-9119-4585-8a8a-fffffe458b82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1364d5e8967ee57d3fe7c3ca85053dbeee65956324a6ff45f53339ee6a380e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1b62ea65c78bb4f19f5fdb2ebed6da4b5554981906058d4d33472e53eecebe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa36ff8bd81008d118d746d10130faac5df3ffdd243309f12b2444acf8e0ac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fe422173d9c9619e9ccbf14fa6b45cd384fb3b32973aa7c32b7597d23818ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1ba37f0d76415fadbf5fe5c3e7b0d0ac1e04923deb445e3ffebaa45f9a3284d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:52:51Z\\\",\\\"message\\\":\\\"W1122 02:52:50.548479 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 02:52:50.548907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763779970 cert, and key in /tmp/serving-cert-1581887030/serving-signer.crt, /tmp/serving-cert-1581887030/serving-signer.key\\\\nI1122 02:52:50.747646 1 observer_polling.go:159] Starting file observer\\\\nW1122 02:52:50.754705 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 02:52:50.754978 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:52:50.756448 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1581887030/tls.crt::/tmp/serving-cert-1581887030/tls.key\\\\\\\"\\\\nF1122 02:52:51.071644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://892025699a91526b7be50d611c0c47a78d080b10f5001683ba5b6614e1168759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:16Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.864355 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c46a871bcc893c29dd8f87750ec890ac870f3f59615c5951f473288a19b460c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:16Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.881355 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a581af85a69caa65abe5f7766482dd0b71eaf201d43e354b8316ff616c865193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2d349f9466eb418a64dabae95c139bf23e943909a37b6b651c90e85ff55bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:16Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.896907 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e5537a6-a41f-446d-ab40-ec8c7a3edc90\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaf643c85775379271a26c718e36d87319342e6797522f91726ce4084925f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205de8f1aeca28c7877b83da70915cd216631f5d7c28b0c9f3bb748a8f98547e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7d163b34cfb34723c07cfa4592a134291185548d4294cb6fcd1d5d8ab63617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9608708be558bf83d1156f620534bd4c98b8f4fb7f96c972be8dc67524f4546c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:16Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.910566 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"402683b1-a29f-4a79-a36c-daf6e8068d0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cf0de2e7bcabf2c1beae2a37caaddb2de2365480082bd95a7843816ec8e3ceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d254b4b131f36e4aae
4019c29e8c53f3f8df991b85eb6f943fb6d2e9d79552f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9j6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:16Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.932808 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c7wvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:16Z 
is after 2025-08-24T17:21:41Z" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.948706 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.948758 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.948781 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.948806 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:16 crc kubenswrapper[4922]: I1122 02:53:16.948825 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:16Z","lastTransitionTime":"2025-11-22T02:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.052352 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.052468 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.052493 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.052519 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.052537 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:17Z","lastTransitionTime":"2025-11-22T02:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.156481 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.156554 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.156575 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.156605 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.156629 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:17Z","lastTransitionTime":"2025-11-22T02:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.261023 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.261073 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.261084 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.261106 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.261139 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:17Z","lastTransitionTime":"2025-11-22T02:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.299574 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.299600 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:53:17 crc kubenswrapper[4922]: E1122 02:53:17.299767 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:53:17 crc kubenswrapper[4922]: E1122 02:53:17.300054 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.363612 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.363676 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.363695 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.363722 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.363741 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:17Z","lastTransitionTime":"2025-11-22T02:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.467253 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.467321 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.467357 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.467392 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.467408 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:17Z","lastTransitionTime":"2025-11-22T02:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.570025 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.570075 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.570091 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.570111 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.570127 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:17Z","lastTransitionTime":"2025-11-22T02:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.666144 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" event={"ID":"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7","Type":"ContainerStarted","Data":"1b96c28595639547cc0b90244e7855e63664694e0074005a310637ce4ad5b72e"} Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.666667 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.672795 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.672916 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.672954 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.672997 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.673028 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:17Z","lastTransitionTime":"2025-11-22T02:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.693992 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8552c75b-ff86-4366-ba7d-31f454e4c4d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095da4ba1e99f9564ae3e7c95e0319b9b03a655b824f73a40f378292bd16a613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c36d97db0f554e895c9a66ab9d98a846dab9ea7fdc6fb1694a1e6ba6e8e03caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5806c02c9ab10a6ae71658c5c26a959a0c1af4dc116eebb10456e842e18bc265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e578ec32c48afdd4b1beffcba79284416d5d388027f807e195d552777eb3ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7d8908fd03ef5111feb365e6882679f026d46d0b404e585b0fb248b3ffccc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:17Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.704593 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.710753 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"912136b8-9119-4585-8a8a-fffffe458b82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1364d5e8967ee57d3fe7c3ca85053dbeee65956324a6ff45f53339ee6a380e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1b62ea65c78bb4f19f5fdb2ebed6da4b5554981906058d4d33472e53eecebe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa36ff8bd81008d118d746d10130faac5df3ffdd243309f12b2444acf8e0ac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fe422173d9c9619e9ccbf14fa6b45cd384fb3b32973aa7c32b7597d23818ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1ba37f0d76415fadbf5fe5c3e7b0d0ac1e04923deb445e3ffebaa45f9a3284d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:52:51Z\\\",\\\"message\\\":\\\"W1122 02:52:50.548479 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 02:52:50.548907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763779970 cert, and key in /tmp/serving-cert-1581887030/serving-signer.crt, /tmp/serving-cert-1581887030/serving-signer.key\\\\nI1122 02:52:50.747646 1 observer_polling.go:159] Starting file observer\\\\nW1122 02:52:50.754705 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 02:52:50.754978 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:52:50.756448 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1581887030/tls.crt::/tmp/serving-cert-1581887030/tls.key\\\\\\\"\\\\nF1122 02:52:51.071644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://892025699a91526b7be50d611c0c47a78d080b10f5001683ba5b6614e1168759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:17Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.726408 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c46a871bcc893c29dd8f87750ec890ac870f3f59615c5951f473288a19b460c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:17Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.743354 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ntq2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"087141fd-fc9b-4685-bada-20260ee96369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb4eafebbd378af8579bc424a11b51eeda38398db994f81c408b9bead685713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4fck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ntq2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:17Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.762394 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n664h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"826eb92e-c839-4fba-9737-3b52101fec88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccfd5290a47556c0ac2088ad8d34ab89870aec1d03311431a50dd857a43e5dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0125b85a532a9445c19a61f7bdd47ccd48ee48f067df5c041c6bf98daae9630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0125b85a532a9445c19a61f7bdd47ccd48ee48f067df5c041c6bf98daae9630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n664h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:17Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.775601 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.775640 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:17 crc 
kubenswrapper[4922]: I1122 02:53:17.775650 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.775668 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.775679 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:17Z","lastTransitionTime":"2025-11-22T02:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.787619 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a581af85a69caa65abe5f7766482dd0b71eaf201d43e354b8316ff616c865193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2d349f9466eb418a64dabae95c139bf23e943909a37b6b651c90e85ff55bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:17Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.804104 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e5537a6-a41f-446d-ab40-ec8c7a3edc90\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaf643c85775379271a26c718e36d87319342e6797522f91726ce4084925f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205de8f1aeca28c7877b83da70915cd216631f5d7c28b0c9f3bb748a8f98547e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7d163b34cfb34723c07cfa4592a134291185548d4294cb6fcd1d5d8ab63617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"
imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9608708be558bf83d1156f620534bd4c98b8f4fb7f96c972be8dc67524f4546c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:17Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.822202 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"402683b1-a29f-4a79-a36c-daf6e8068d0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cf0de2e7bcabf2c1beae2a37caaddb2de2365480082bd95a7843816ec8e3ceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d254b4b131f36e4aae4019c29e8c53f3f8df991b85eb6f943fb6d2e9d79552f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9j6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:17Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.850378 4922 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e0fc46c7e9bbed62a72d78126ea7ffcb4de50dc048c8c0fd90bd0b0d8c718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd701fe5abb8c240fb633df69fbda2e7fb1ac4c76195dae8dfd5b1db811c1d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://091e9164e566847ab0605abc3b8f32027bcccff041c48fbf251d28690dcc132d\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5603a0776e71297b57ae6a438254d6d8abb317252617fbe3b28707e9f96424d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7854642894dce2837e59481c400522a24a2e66c3f9eb521a370e85b203288af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ceb46477edc2737cbdc850d6387ba175a07e85e69c988f91fc93def44fa1bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b96c28595639547cc0b90244e7855e63664694e0074005a310637ce4ad5b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuberne
tes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://183f71d77b7a1768eee27187f0c1e4a893f5661fa42cf671332e081118ae7dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c7wvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:17Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.872743 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:17Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.878348 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.878383 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.878395 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.878413 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.878425 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:17Z","lastTransitionTime":"2025-11-22T02:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.888171 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:17Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.902872 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:17Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.917113 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8f0f91018d8ffcc35550e1e567a256d60f9cef7fb07a7c3c21d393e6ff5653f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:17Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.934487 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4gbc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"954bb7b8-d710-4e1a-973e-78c04e685f30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8705bc3652c9a3f838ffdc44bb26f87689d9eca09105427a3dc25b730967ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-52z2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4gbc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:17Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.947370 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-df5ll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5672d56-8abd-4aa4-ac8d-1655896397f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a64d211a2e298509a2677a0d64bf8205f1a62e952ee786f34571d4c1e962b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klcgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-df5ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:17Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.965691 4922 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a581af85a69caa65abe5f7766482dd0b71eaf201d43e354b8316ff616c865193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2d349f9466eb418a64dabae95c139bf23e943909a37b6b651c90e85ff55bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:17Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.980761 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.980812 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.980829 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.980868 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.980908 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:17Z","lastTransitionTime":"2025-11-22T02:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.982597 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e5537a6-a41f-446d-ab40-ec8c7a3edc90\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaf643c85775379271a26c718e36d87319342e6797522f91726ce4084925f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205de8f1aeca28c7877b83da70915cd216631f5d7c28b0c9f3bb748a8f98547e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-c
erts\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7d163b34cfb34723c07cfa4592a134291185548d4294cb6fcd1d5d8ab63617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9608708be558bf83d1156f620534bd4c98b8f4fb7f96c972be8dc67524f4546c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:17Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:17 crc kubenswrapper[4922]: I1122 02:53:17.999447 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"402683b1-a29f-4a79-a36c-daf6e8068d0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cf0de2e7bcabf2c1beae2a37caaddb2de2365480082bd95a7843816ec8e3ceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d254b4b131f36e4aae4019c29e8c53f3f8df991b85eb6f943fb6d2e9d79552f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9j6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:17Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.020685 4922 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e0fc46c7e9bbed62a72d78126ea7ffcb4de50dc048c8c0fd90bd0b0d8c718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd701fe5abb8c240fb633df69fbda2e7fb1ac4c76195dae8dfd5b1db811c1d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://091e9164e566847ab0605abc3b8f32027bcccff041c48fbf251d28690dcc132d\\\",\\\"image\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5603a0776e71297b57ae6a438254d6d8abb317252617fbe3b28707e9f96424d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7854642894dce2837e59481c400522a24a2e66c3f9eb521a370e85b203288af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ceb46477edc2737cbdc850d6387ba175a07e85e69c988f91fc93def44fa1bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastSta
te\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b96c28595639547cc0b90244e7855e63664694e0074005a310637ce4ad5b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://183f71d77b7a1768eee27187f0c1e4a893f5661fa42cf671332e081118ae7dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c7wvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:18Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.037082 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:18Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.049574 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8f0f91018d8ffcc35550e1e567a256d60f9cef7fb07a7c3c21d393e6ff5653f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:18Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.065383 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4gbc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"954bb7b8-d710-4e1a-973e-78c04e685f30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8705bc3652c9a3f838ffdc44bb26f87689d9eca09105427a3dc25b730967ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52z2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4gbc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:18Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.077265 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-df5ll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5672d56-8abd-4aa4-ac8d-1655896397f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a64d211a2e298509a2677a0d64bf8205f1a62e952ee786f34571d4c1e962b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klcgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-df5ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:18Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.083118 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.083156 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.083169 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:18 crc 
kubenswrapper[4922]: I1122 02:53:18.083191 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.083205 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:18Z","lastTransitionTime":"2025-11-22T02:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.092994 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:18Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.108903 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:18Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.124919 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ntq2p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"087141fd-fc9b-4685-bada-20260ee96369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb4eafebbd378af8579bc424a11b51eeda38398db994f81c408b9bead685713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4fck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ntq2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-11-22T02:53:18Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.142352 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n664h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"826eb92e-c839-4fba-9737-3b52101fec88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccfd5290a47556c0ac2088ad8d34ab89870aec1d03311431a50dd857a43e5dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693
c5404bbcdd2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-
copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0125b85a532a9445c19a61f7bdd47ccd48ee48f067df5c041c6bf98daae9630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0125b85a532a9445c19a61f7bdd47ccd48ee48f067df5c041c6bf98daae9630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n664h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:18Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.164994 4922 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8552c75b-ff86-4366-ba7d-31f454e4c4d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095da4ba1e99f9564ae3e7c95e0319b9b03a655b824f73a40f378292bd16a613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c36d97db0f554e895c9a66ab9d98a846dab9ea7fdc6fb1694a1e6ba6e8e03caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5806c02c9ab10a6ae71658c5c26a959a0c1af4dc116eebb10456e842e18bc265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"cont
ainerID\\\":\\\"cri-o://7e578ec32c48afdd4b1beffcba79284416d5d388027f807e195d552777eb3ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7d8908fd03ef5111feb365e6882679f026d46d0b404e585b0fb248b3ffccc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f8989
3174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:18Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.183075 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"912136b8-9119-4585-8a8a-fffffe458b82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1364d5e8967ee57d3fe7c3ca85053dbeee65956324a6ff45f53339ee6a380e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1b62ea65c78bb4f19f5fdb2ebed6da4b5554981906058d4d33472e53eecebe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa36ff8bd81008d118d746d10130faac5df3ffdd243309f12b2444acf8e0ac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fe422173d9c9619e9ccbf14fa6b45cd384fb3b32973aa7c32b7597d23818ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1ba37f0d76415fadbf5fe5c3e7b0d0ac1e04923deb445e3ffebaa45f9a3284d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:52:51Z\\\",\\\"message\\\":\\\"W1122 02:52:50.548479 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 02:52:50.548907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763779970 cert, and key in /tmp/serving-cert-1581887030/serving-signer.crt, /tmp/serving-cert-1581887030/serving-signer.key\\\\nI1122 02:52:50.747646 1 observer_polling.go:159] Starting file observer\\\\nW1122 02:52:50.754705 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 02:52:50.754978 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:52:50.756448 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1581887030/tls.crt::/tmp/serving-cert-1581887030/tls.key\\\\\\\"\\\\nF1122 02:52:51.071644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://892025699a91526b7be50d611c0c47a78d080b10f5001683ba5b6614e1168759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:18Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.186306 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.186378 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.186397 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.186426 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.186455 4922 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:18Z","lastTransitionTime":"2025-11-22T02:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.198197 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c46a871bcc893c29dd8f87750ec890ac870f3f59615c5951f473288a19b460c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:18Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.289344 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.289403 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.289413 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.289429 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.289440 4922 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:18Z","lastTransitionTime":"2025-11-22T02:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.299816 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:53:18 crc kubenswrapper[4922]: E1122 02:53:18.300099 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.393063 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.393128 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.393146 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.393176 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.393195 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:18Z","lastTransitionTime":"2025-11-22T02:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.495716 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.495804 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.495824 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.495898 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.495927 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:18Z","lastTransitionTime":"2025-11-22T02:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.598986 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.599063 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.599082 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.599110 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.599129 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:18Z","lastTransitionTime":"2025-11-22T02:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.670045 4922 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.670806 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.702389 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.702447 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.702465 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.702489 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.702509 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:18Z","lastTransitionTime":"2025-11-22T02:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.703138 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.725535 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a581af85a69caa65abe5f7766482dd0b71eaf201d43e354b8316ff616c865193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2d349f9466eb418a64dabae95c139bf23e943909a37b6b651c90e85ff55bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:18Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:18 crc 
kubenswrapper[4922]: I1122 02:53:18.748410 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e5537a6-a41f-446d-ab40-ec8c7a3edc90\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaf643c85775379271a26c718e36d87319342e6797522f91726ce4084925f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205de8f1aeca28c7877b83da70915cd216631f5d7c28b0c9f3bb748a8f98547e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7d163b34cfb34723c07cfa4592a134291185548d4294cb6fcd1d5d8ab63617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\
":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9608708be558bf83d1156f620534bd4c98b8f4fb7f96c972be8dc67524f4546c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:18Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.772241 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"402683b1-a29f-4a79-a36c-daf6e8068d0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cf0de2e7bcabf2c1beae2a37caaddb2de2365480082bd95a7843816ec8e3ceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d254b4b131f36e4aae4019c29e8c53f3f8df991b85eb6f943fb6d2e9d79552f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9j6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:18Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.793418 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e0fc46c7e9bbed62a72d78126ea7ffcb4de50dc048c8c0fd90bd0b0d8c718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd701fe5abb8c240fb633df69fbda2e7fb1ac4c76195dae8dfd5b1db811c1d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://091e9164e566847ab0605abc3b8f32027bcccff041c48fbf251d28690dcc132d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5603a0776e71297b57ae6a438254d6d8abb317252617fbe3b28707e9f96424d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7854642894dce2837e59481c400522a24a2e66c3f9eb521a370e85b203288af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ceb46477edc2737cbdc850d6387ba175a07e85e69c988f91fc93def44fa1bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b96c28595639547cc0b90244e7855e63664694e
0074005a310637ce4ad5b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://183f71d77b7a1768eee27187f0c1e4a893f5661fa42cf671332e081118ae7dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c7wvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:18Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.806119 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.806162 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.806175 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.806193 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.806210 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:18Z","lastTransitionTime":"2025-11-22T02:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.816415 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4gbc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"954bb7b8-d710-4e1a-973e-78c04e685f30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8705bc3652c9a3f838ffdc44bb26f87689d9eca09105427a3dc25b730967ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52z2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4gbc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:18Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.833182 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-df5ll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5672d56-8abd-4aa4-ac8d-1655896397f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a64d211a2e298509a2677a0d64bf8205f1a62e952ee786f34571d4c1e962b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klcgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-df5ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:18Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.852291 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:18Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.867377 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:18Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.887648 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:18Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.908076 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8f0f91018d8ffcc35550e1e567a256d60f9cef7fb07a7c3c21d393e6ff5653f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:18Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.909189 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:18 
crc kubenswrapper[4922]: I1122 02:53:18.909231 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.909245 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.909262 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.909274 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:18Z","lastTransitionTime":"2025-11-22T02:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.938353 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8552c75b-ff86-4366-ba7d-31f454e4c4d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095da4ba1e99f9564ae3e7c95e0319b9b03a655b824f73a40f378292bd16a613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c36d97db0f554e895c9a66ab9d98a846dab9ea7fdc6fb1694a1e6ba6e8e03caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\
\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5806c02c9ab10a6ae71658c5c26a959a0c1af4dc116eebb10456e842e18bc265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e578ec32c48afdd4b1beffcba79284416d5d388027f807e195d552777eb3ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7d8908fd03ef5111feb365e6882679f026d46d0b404e585b0fb248b3ffccc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d0
9d30f88115099e6e99bc1ddc6134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:18Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.964027 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"912136b8-9119-4585-8a8a-fffffe458b82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1364d5e8967ee57d3fe7c3ca85053dbeee65956324a6ff45f53339ee6a380e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1b62ea65c78bb4f19f5fdb2ebed6da4b5554981906058d4d33472e53eecebe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa36ff8bd81008d118d746d10130faac5df3ffdd243309f12b2444acf8e0ac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fe422173d9c9619e9ccbf14fa6b45cd384fb3b32973aa7c32b7597d23818ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1ba37f0d76415fadbf5fe5c3e7b0d0ac1e04923deb445e3ffebaa45f9a3284d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:52:51Z\\\",\\\"message\\\":\\\"W1122 02:52:50.548479 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 02:52:50.548907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763779970 cert, and key in /tmp/serving-cert-1581887030/serving-signer.crt, /tmp/serving-cert-1581887030/serving-signer.key\\\\nI1122 02:52:50.747646 1 observer_polling.go:159] Starting file observer\\\\nW1122 02:52:50.754705 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 02:52:50.754978 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:52:50.756448 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1581887030/tls.crt::/tmp/serving-cert-1581887030/tls.key\\\\\\\"\\\\nF1122 02:52:51.071644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://892025699a91526b7be50d611c0c47a78d080b10f5001683ba5b6614e1168759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:18Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.979924 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c46a871bcc893c29dd8f87750ec890ac870f3f59615c5951f473288a19b460c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:18Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:18 crc kubenswrapper[4922]: I1122 02:53:18.992362 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ntq2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"087141fd-fc9b-4685-bada-20260ee96369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb4eafebbd378af8579bc424a11b51eeda38398db994f81c408b9bead685713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4fck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ntq2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:18Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:19 crc kubenswrapper[4922]: I1122 02:53:19.009498 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n664h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"826eb92e-c839-4fba-9737-3b52101fec88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccfd5290a47556c0ac2088ad8d34ab89870aec1d03311431a50dd857a43e5dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0125b85a532a9445c19a61f7bdd47ccd48ee48f067df5c041c6bf98daae9630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0125b85a532a9445c19a61f7bdd47ccd48ee48f067df5c041c6bf98daae9630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n664h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:19Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:19 crc kubenswrapper[4922]: I1122 02:53:19.011600 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:19 crc kubenswrapper[4922]: I1122 02:53:19.011651 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:19 crc 
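
Every "Failed to update status for pod" record above ends with the same root cause: the kubelet cannot deliver the status patch because the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 presents a serving certificate with NotAfter 2025-08-24T17:21:41Z, while the node clock reads 2025-11-22. Below is a minimal Go sketch of the same validity check against that endpoint; it is illustrative ad-hoc tooling, not part of the cluster, and it only reproduces the diagnosis, not a fix.

package main

import (
	"crypto/tls"
	"fmt"
	"time"
)

func main() {
	// Handshake with verification disabled so we can read the peer
	// certificate that normal verification rejects.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		fmt.Println("dial:", err)
		return
	}
	defer conn.Close()

	cert := conn.ConnectionState().PeerCertificates[0]
	now := time.Now().UTC()
	fmt.Println("notBefore:", cert.NotBefore.UTC().Format(time.RFC3339))
	fmt.Println("notAfter: ", cert.NotAfter.UTC().Format(time.RFC3339))
	if now.After(cert.NotAfter) {
		// Same condition the kubelet reports:
		// "current time ... is after 2025-08-24T17:21:41Z".
		fmt.Printf("expired: current time %s is after %s\n",
			now.Format(time.RFC3339), cert.NotAfter.UTC().Format(time.RFC3339))
	}
}

InsecureSkipVerify is what lets the handshake complete far enough to inspect the certificate; the kubelet's webhook client, verifying normally, fails exactly where the log says it does.
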
kubenswrapper[4922]: I1122 02:53:19.011669 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:19 crc kubenswrapper[4922]: I1122 02:53:19.011692 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:19 crc kubenswrapper[4922]: I1122 02:53:19.011711 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:19Z","lastTransitionTime":"2025-11-22T02:53:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:19 crc kubenswrapper[4922]: I1122 02:53:19.115019 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:19 crc kubenswrapper[4922]: I1122 02:53:19.115070 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:19 crc kubenswrapper[4922]: I1122 02:53:19.115082 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:19 crc kubenswrapper[4922]: I1122 02:53:19.115099 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:19 crc kubenswrapper[4922]: I1122 02:53:19.115109 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:19Z","lastTransitionTime":"2025-11-22T02:53:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:19 crc kubenswrapper[4922]: I1122 02:53:19.217643 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:19 crc kubenswrapper[4922]: I1122 02:53:19.217731 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:19 crc kubenswrapper[4922]: I1122 02:53:19.217749 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:19 crc kubenswrapper[4922]: I1122 02:53:19.217772 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:19 crc kubenswrapper[4922]: I1122 02:53:19.217809 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:19Z","lastTransitionTime":"2025-11-22T02:53:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:19 crc kubenswrapper[4922]: I1122 02:53:19.299969 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:53:19 crc kubenswrapper[4922]: I1122 02:53:19.300007 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:53:19 crc kubenswrapper[4922]: E1122 02:53:19.300110 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:53:19 crc kubenswrapper[4922]: E1122 02:53:19.300220 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:53:19 crc kubenswrapper[4922]: I1122 02:53:19.320800 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:19 crc kubenswrapper[4922]: I1122 02:53:19.320838 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:19 crc kubenswrapper[4922]: I1122 02:53:19.320868 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:19 crc kubenswrapper[4922]: I1122 02:53:19.320884 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:19 crc kubenswrapper[4922]: I1122 02:53:19.320894 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:19Z","lastTransitionTime":"2025-11-22T02:53:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:19 crc kubenswrapper[4922]: I1122 02:53:19.423734 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:19 crc kubenswrapper[4922]: I1122 02:53:19.423772 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:19 crc kubenswrapper[4922]: I1122 02:53:19.423783 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:19 crc kubenswrapper[4922]: I1122 02:53:19.423800 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:19 crc kubenswrapper[4922]: I1122 02:53:19.423811 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:19Z","lastTransitionTime":"2025-11-22T02:53:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:19 crc kubenswrapper[4922]: I1122 02:53:19.526538 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:19 crc kubenswrapper[4922]: I1122 02:53:19.527271 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:19 crc kubenswrapper[4922]: I1122 02:53:19.527293 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:19 crc kubenswrapper[4922]: I1122 02:53:19.527350 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:19 crc kubenswrapper[4922]: I1122 02:53:19.527374 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:19Z","lastTransitionTime":"2025-11-22T02:53:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:19 crc kubenswrapper[4922]: I1122 02:53:19.630807 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:19 crc kubenswrapper[4922]: I1122 02:53:19.630921 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:19 crc kubenswrapper[4922]: I1122 02:53:19.630951 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:19 crc kubenswrapper[4922]: I1122 02:53:19.630982 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:19 crc kubenswrapper[4922]: I1122 02:53:19.631005 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:19Z","lastTransitionTime":"2025-11-22T02:53:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:19 crc kubenswrapper[4922]: I1122 02:53:19.674402 4922 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 22 02:53:19 crc kubenswrapper[4922]: I1122 02:53:19.734346 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:19 crc kubenswrapper[4922]: I1122 02:53:19.734429 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:19 crc kubenswrapper[4922]: I1122 02:53:19.734449 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:19 crc kubenswrapper[4922]: I1122 02:53:19.734480 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:19 crc kubenswrapper[4922]: I1122 02:53:19.734500 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:19Z","lastTransitionTime":"2025-11-22T02:53:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:19 crc kubenswrapper[4922]: I1122 02:53:19.837909 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:19 crc kubenswrapper[4922]: I1122 02:53:19.837978 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:19 crc kubenswrapper[4922]: I1122 02:53:19.837997 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:19 crc kubenswrapper[4922]: I1122 02:53:19.838026 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:19 crc kubenswrapper[4922]: I1122 02:53:19.838046 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:19Z","lastTransitionTime":"2025-11-22T02:53:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:19 crc kubenswrapper[4922]: I1122 02:53:19.941371 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:19 crc kubenswrapper[4922]: I1122 02:53:19.941451 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:19 crc kubenswrapper[4922]: I1122 02:53:19.941478 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:19 crc kubenswrapper[4922]: I1122 02:53:19.941509 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:19 crc kubenswrapper[4922]: I1122 02:53:19.941529 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:19Z","lastTransitionTime":"2025-11-22T02:53:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.044779 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.044906 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.044929 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.044957 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.044977 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:20Z","lastTransitionTime":"2025-11-22T02:53:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.148879 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.148923 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.148941 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.148965 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.148978 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:20Z","lastTransitionTime":"2025-11-22T02:53:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.252398 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.252512 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.252535 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.252566 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.252594 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:20Z","lastTransitionTime":"2025-11-22T02:53:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.300448 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:53:20 crc kubenswrapper[4922]: E1122 02:53:20.300639 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.356139 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.356214 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.356227 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.356257 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.356271 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:20Z","lastTransitionTime":"2025-11-22T02:53:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.459378 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.459449 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.459468 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.459498 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.459518 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:20Z","lastTransitionTime":"2025-11-22T02:53:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.563432 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.563507 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.563524 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.563550 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.563568 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:20Z","lastTransitionTime":"2025-11-22T02:53:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.666438 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.666475 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.666484 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.666502 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.666512 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:20Z","lastTransitionTime":"2025-11-22T02:53:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
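
The "Node became not ready" records repeat about every 100 ms above and will keep repeating until a network plugin writes a configuration into /etc/kubernetes/cni/net.d/; the same missing-config state is why the network-check and console-plugin pods are skipped with "network is not ready". The Go sketch below is a rough standalone approximation of that readiness test, assuming a libcni-style loader that looks for *.conf, *.conflist, or *.json files in the directory named in the log; it is not the kubelet's actual code path.

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	confDir := "/etc/kubernetes/cni/net.d" // directory named in the log records
	var found []string
	// Assumed extension list for a libcni-style config loader.
	for _, pat := range []string{"*.conf", "*.conflist", "*.json"} {
		m, err := filepath.Glob(filepath.Join(confDir, pat))
		if err != nil {
			continue // only possible with a malformed pattern
		}
		found = append(found, m...)
	}
	if len(found) == 0 {
		// The state the log keeps reporting while ovn-kubernetes starts up.
		fmt.Fprintln(os.Stderr, "no CNI configuration file in", confDir)
		os.Exit(1)
	}
	for _, f := range found {
		fmt.Println("found CNI config:", f)
	}
}
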
Has your network provider started?"} Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.680540 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c7wvg_c2a6bcd8-bb13-463b-b112-0df3cf90b5f7/ovnkube-controller/0.log" Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.683914 4922 generic.go:334] "Generic (PLEG): container finished" podID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerID="1b96c28595639547cc0b90244e7855e63664694e0074005a310637ce4ad5b72e" exitCode=1 Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.683977 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" event={"ID":"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7","Type":"ContainerDied","Data":"1b96c28595639547cc0b90244e7855e63664694e0074005a310637ce4ad5b72e"} Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.685154 4922 scope.go:117] "RemoveContainer" containerID="1b96c28595639547cc0b90244e7855e63664694e0074005a310637ce4ad5b72e" Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.710734 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8552c75b-ff86-4366-ba7d-31f454e4c4d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095da4ba1e99f9564ae3e7c95e0319b9b03a655b824f73a40f378292bd16a613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c36d97db0f554e895c9a66ab9d98a846dab9ea7fdc6fb1694a1e6ba6e8e03caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"s
tate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5806c02c9ab10a6ae71658c5c26a959a0c1af4dc116eebb10456e842e18bc265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e578ec32c48afdd4b1beffcba79284416d5d388027f807e195d552777eb3ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7d8908fd03ef5111feb365e6882679f026d46d0b404e585b0fb248b3ffccc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b67b302
d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:20Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.732525 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
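
The patch payloads in these records, like the one that follows for kube-apiserver-crc, are hard to read because they are quoted twice: klog quotes the whole err value, and the patch inside it is itself a quoted JSON string. Assuming klog's escaping matches Go's strconv (%q-style) rules, two Unquote passes recover plain JSON. The snippet below is ad-hoc reading aid for this log, shown with a shortened stand-in payload rather than a full record.

package main

import (
	"encoding/json"
	"fmt"
	"strconv"
	"strings"
)

func main() {
	// Shortened stand-in for one err="..." value from the records above.
	raw := `"failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"912136b8-9119-4585-8a8a-fffffe458b82\\\"}}\" for pod ..."`

	msg, err := strconv.Unquote(raw) // undo klog's outer quoting
	if err != nil {
		panic(err)
	}
	// The patch is the quoted span between the prefix and `" for pod`.
	start := strings.Index(msg, `"`)
	end := strings.LastIndex(msg, `" for pod`)
	if start < 0 || end <= start {
		panic("unexpected message shape")
	}
	patch, err := strconv.Unquote(msg[start : end+1]) // undo inner quoting
	if err != nil {
		panic(err)
	}

	var obj map[string]any
	if err := json.Unmarshal([]byte(patch), &obj); err != nil {
		panic(err)
	}
	out, _ := json.MarshalIndent(obj, "", "  ")
	fmt.Println(string(out)) // plain, indented status patch
}
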
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"912136b8-9119-4585-8a8a-fffffe458b82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1364d5e8967ee57d3fe7c3ca85053dbeee65956324a6ff45f53339ee6a380e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1b62ea65c78bb4f19f5fdb2ebed6da4b5554981906058d4d33472e53eecebe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa36ff8bd81008d118d746d10130faac5df3ffdd243309f12b2444acf8e0ac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fe422173d9c9619e9ccbf14fa6b45cd384fb3b32973aa7c32b7597d23818ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1ba37f0d76415fadbf5fe5c3e7b0d0ac1e04923deb445e3ffebaa45f9a3284d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:52:51Z\\\",\\\"message\\\":\\\"W1122 02:52:50.548479 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 02:52:50.548907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763779970 cert, and key in /tmp/serving-cert-1581887030/serving-signer.crt, /tmp/serving-cert-1581887030/serving-signer.key\\\\nI1122 02:52:50.747646 1 observer_polling.go:159] Starting file observer\\\\nW1122 02:52:50.754705 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 02:52:50.754978 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:52:50.756448 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1581887030/tls.crt::/tmp/serving-cert-1581887030/tls.key\\\\\\\"\\\\nF1122 02:52:51.071644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://892025699a91526b7be50d611c0c47a78d080b10f5001683ba5b6614e1168759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:20Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.752005 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c46a871bcc893c29dd8f87750ec890ac870f3f59615c5951f473288a19b460c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:20Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.766145 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ntq2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"087141fd-fc9b-4685-bada-20260ee96369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb4eafebbd378af8579bc424a11b51eeda38398db994f81c408b9bead685713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4fck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ntq2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:20Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.768903 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.769010 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.769025 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.769042 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.769058 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:20Z","lastTransitionTime":"2025-11-22T02:53:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.788261 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n664h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"826eb92e-c839-4fba-9737-3b52101fec88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccfd5290a47556c0ac2088ad8d34ab89870aec1d03311431a50dd857a43e5dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b083638
0bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
ntrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0125b85a532a9445c19a61f7bdd47ccd48ee48f067df5c041c6bf98daae9630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0125b85a532a9445c19a61f7bdd47ccd48ee48f067df5c041c6bf98daae9630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n664h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:20Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.806968 4922 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a581af85a69caa65abe5f7766482dd0b71eaf201d43e354b8316ff616c865193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2d349f9466eb418a64dabae95c139bf23e943909a37b6b651c90e85ff55bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:20Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.821147 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e5537a6-a41f-446d-ab40-ec8c7a3edc90\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaf643c85775379271a26c718e36d87319342e6797522f91726ce4084925f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205de8f1aeca28c7877b83da70915cd216631f5d7c28b0c9f3bb748a8f98547e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7d163b34cfb34723c07cfa4592a134291185548d4294cb6fcd1d5d8ab63617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9608708be558bf83d1156f620534bd4c98b8f4fb7f96c972be8dc67524f4546c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:20Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.837301 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"402683b1-a29f-4a79-a36c-daf6e8068d0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cf0de2e7bcabf2c1beae2a37caaddb2de2365480082bd95a7843816ec8e3ceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d254b4b131f36e4aae
4019c29e8c53f3f8df991b85eb6f943fb6d2e9d79552f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9j6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:20Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.872405 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.872487 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.872501 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.872521 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.872535 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:20Z","lastTransitionTime":"2025-11-22T02:53:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.873691 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e0fc46c7e9bbed62a72d78126ea7ffcb4de50dc048c8c0fd90bd0b0d8c718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd701fe5abb8c240fb633df69fbda2e7fb1ac4c76195dae8dfd5b1db811c1d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://091e9164e566847ab0605abc3b8f32027bcccff041c48fbf251d28690dcc132d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5603a0776e71297b57ae6a438254d6d8abb317252617fbe3b28707e9f96424d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7854642894dce2837e59481c400522a24a2e66c3f9eb521a370e85b203288af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ceb46477edc2737cbdc850d6387ba175a07e85e69c988f91fc93def44fa1bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b96c28595639547cc0b90244e7855e63664694e0074005a310637ce4ad5b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b96c28595639547cc0b90244e7855e63664694e0074005a310637ce4ad5b72e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:53:19Z\\\",\\\"message\\\":\\\"I1122 02:53:19.655290 6237 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1122 02:53:19.655314 6237 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1122 02:53:19.655369 6237 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1122 02:53:19.655379 6237 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1122 02:53:19.655460 6237 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1122 02:53:19.655472 6237 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1122 02:53:19.656061 6237 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1122 02:53:19.656276 6237 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 02:53:19.656355 6237 handler.go:208] Removed *v1.Node event handler 7\\\\nI1122 02:53:19.656291 6237 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1122 02:53:19.656291 6237 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1122 02:53:19.656490 6237 factory.go:656] Stopping watch factory\\\\nI1122 02:53:19.656492 6237 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1122 02:53:19.656527 6237 ovnkube.go:599] Stopped ovnkube\\\\nI1122 02:53:19.656292 6237 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1122 02:53:19.656304 6237 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1122 
02:53:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://183f71d77b7a1768eee27187f0c1e4a893f5661fa42cf671332e081118ae7dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0
d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c7wvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:20Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.891480 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-df5ll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5672d56-8abd-4aa4-ac8d-1655896397f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a64d211a2e298509a2677a0d64bf8205f1a62e952ee786f34571d4c1e962b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klcgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.1
68.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-df5ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:20Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.909555 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:20Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.932395 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:20Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.950770 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:20Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.966384 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8f0f91018d8ffcc35550e1e567a256d60f9cef7fb07a7c3c21d393e6ff5653f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:20Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.975602 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:20 
crc kubenswrapper[4922]: I1122 02:53:20.975669 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.975687 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.975718 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.975741 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:20Z","lastTransitionTime":"2025-11-22T02:53:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:20 crc kubenswrapper[4922]: I1122 02:53:20.984636 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4gbc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"954bb7b8-d710-4e1a-973e-78c04e685f30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8705bc3652c9a3f838ffdc44bb26f87689d9eca09105427a3dc25b730967ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\"
,\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52z2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4gbc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:20Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.079648 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.079694 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.079709 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.079726 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.079739 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:21Z","lastTransitionTime":"2025-11-22T02:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.183000 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.183043 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.183052 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.183068 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.183079 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:21Z","lastTransitionTime":"2025-11-22T02:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.290759 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.290891 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.290913 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.290943 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.290970 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:21Z","lastTransitionTime":"2025-11-22T02:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.299726 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.299797 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
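The repeated KubeletNotReady condition above names an empty CNI config directory: OVN-Kubernetes has not yet written its config, so the node stays NotReady and no new sandboxes can be created. Below is a minimal Go sketch of the same check the kubelet's network plugin performs, listing candidate CNI config files in the directory named by the message (path taken from the log; assumes it runs on the node itself):

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	// Directory named in the kubelet message; CRC configures the kubelet
	// with this non-default CNI conf dir instead of /etc/cni/net.d.
	dir := "/etc/kubernetes/cni/net.d"
	entries, err := os.ReadDir(dir)
	if err != nil {
		// A missing or unreadable dir produces the same symptom as an empty one.
		fmt.Println("cannot read CNI conf dir:", err)
		return
	}
	found := false
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json": // extensions libcni treats as config files
			fmt.Println("CNI config present:", filepath.Join(dir, e.Name()))
			found = true
		}
	}
	if !found {
		fmt.Println("no CNI configuration file in", dir, "- network plugin not ready")
	}
}

An empty listing here while ovnkube-node is still starting matches the "No sandbox for pod can be found" and "Error syncing pod, skipping" records surrounding this point in the log.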
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:53:21 crc kubenswrapper[4922]: E1122 02:53:21.300324 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.393723 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.393790 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.393809 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.393834 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.393889 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:21Z","lastTransitionTime":"2025-11-22T02:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.496990 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.497063 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.497081 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.497108 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.497128 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:21Z","lastTransitionTime":"2025-11-22T02:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.545043 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v8z92"] Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.545909 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v8z92" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.549103 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.549149 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.571348 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c46a871bcc893c29dd8f87750ec890ac870f3f59615c5951f473288a19b460c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:21Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.588699 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ntq2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"087141fd-fc9b-4685-bada-20260ee96369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb4eafebbd378af8579bc424a11b51eeda38398db994f81c408b9bead685713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4fck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ntq2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:21Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.594387 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7cbt\" (UniqueName: \"kubernetes.io/projected/da6031ac-2dd7-43c9-b9e2-4c6394e23c1f-kube-api-access-f7cbt\") pod \"ovnkube-control-plane-749d76644c-v8z92\" (UID: \"da6031ac-2dd7-43c9-b9e2-4c6394e23c1f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v8z92" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.594482 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/da6031ac-2dd7-43c9-b9e2-4c6394e23c1f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-v8z92\" (UID: \"da6031ac-2dd7-43c9-b9e2-4c6394e23c1f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v8z92" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.594522 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/da6031ac-2dd7-43c9-b9e2-4c6394e23c1f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-v8z92\" (UID: \"da6031ac-2dd7-43c9-b9e2-4c6394e23c1f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v8z92" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.594607 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/da6031ac-2dd7-43c9-b9e2-4c6394e23c1f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-v8z92\" (UID: \"da6031ac-2dd7-43c9-b9e2-4c6394e23c1f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v8z92" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.600480 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.600526 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.600535 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.600552 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.600563 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:21Z","lastTransitionTime":"2025-11-22T02:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.614063 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n664h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"826eb92e-c839-4fba-9737-3b52101fec88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccfd5290a47556c0ac2088ad8d34ab89870aec1d03311431a50dd857a43e5dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0125b85a532a9445c19a61f7bdd47ccd48ee48f067df5c041c6bf98daae9630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0125b85a532a9445c19a61f7bdd47ccd48ee48f067df5c041c6bf98daae9630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n664h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:21Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.633277 4922 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v8z92" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da6031ac-2dd7-43c9-b9e2-4c6394e23c1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7cbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7cbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v8z92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:21Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.661908 4922 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8552c75b-ff86-4366-ba7d-31f454e4c4d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095da4ba1e99f9564ae3e7c95e0319b9b03a655b824f73a40f378292bd16a613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c36d97db0f554e895c9a66ab9d98a846dab9ea7fdc6fb1694a1e6ba6e8e03caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5806c02c9ab10a6ae71658c5c26a959a0c1af4dc116eebb10456e842e18bc265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e578ec32c48afdd4b1beffcba79284416d5d388027f807e195d552777eb3ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7d8908fd03ef5111feb365e6882679f026d46d0b404e585b0fb248b3ffccc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}}},{\\\"containerID\\\"
:\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:21Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.683799 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"912136b8-9119-4585-8a8a-fffffe458b82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1364d5e8967ee57d3fe7c3ca85053dbeee65956324a6ff45f53339ee6a380e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1b62ea65c78bb4f19f5fdb2ebed6da4b5554981906058d4d33472e53eecebe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa36ff8bd81008d118d746d10130faac5df3ffdd243309f12b2444acf8e0ac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fe422173d9c9619e9ccbf14fa6b45cd384fb3b32973aa7c32b7597d23818ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1ba37f0d76415fadbf5fe5c3e7b0d0ac1e04923deb445e3ffebaa45f9a3284d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:52:51Z\\\",\\\"message\\\":\\\"W1122 02:52:50.548479 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 02:52:50.548907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763779970 cert, and key in /tmp/serving-cert-1581887030/serving-signer.crt, /tmp/serving-cert-1581887030/serving-signer.key\\\\nI1122 02:52:50.747646 1 observer_polling.go:159] Starting file observer\\\\nW1122 02:52:50.754705 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 02:52:50.754978 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:52:50.756448 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1581887030/tls.crt::/tmp/serving-cert-1581887030/tls.key\\\\\\\"\\\\nF1122 02:52:51.071644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://892025699a91526b7be50d611c0c47a78d080b10f5001683ba5b6614e1168759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:21Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.689452 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c7wvg_c2a6bcd8-bb13-463b-b112-0df3cf90b5f7/ovnkube-controller/0.log" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.693229 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" event={"ID":"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7","Type":"ContainerStarted","Data":"81ae17a6779a8d3de7ab3da7eb5399094b7fde374d0b866b515c9768de61ed89"} Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.693398 4922 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.695542 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/da6031ac-2dd7-43c9-b9e2-4c6394e23c1f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-v8z92\" (UID: \"da6031ac-2dd7-43c9-b9e2-4c6394e23c1f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v8z92" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.695582 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/da6031ac-2dd7-43c9-b9e2-4c6394e23c1f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-v8z92\" (UID: \"da6031ac-2dd7-43c9-b9e2-4c6394e23c1f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v8z92" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.695640 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/da6031ac-2dd7-43c9-b9e2-4c6394e23c1f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-v8z92\" (UID: \"da6031ac-2dd7-43c9-b9e2-4c6394e23c1f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v8z92" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.695679 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7cbt\" (UniqueName: \"kubernetes.io/projected/da6031ac-2dd7-43c9-b9e2-4c6394e23c1f-kube-api-access-f7cbt\") pod \"ovnkube-control-plane-749d76644c-v8z92\" (UID: \"da6031ac-2dd7-43c9-b9e2-4c6394e23c1f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v8z92" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.696884 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/da6031ac-2dd7-43c9-b9e2-4c6394e23c1f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-v8z92\" (UID: \"da6031ac-2dd7-43c9-b9e2-4c6394e23c1f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v8z92" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.697621 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/da6031ac-2dd7-43c9-b9e2-4c6394e23c1f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-v8z92\" (UID: \"da6031ac-2dd7-43c9-b9e2-4c6394e23c1f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v8z92" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.705395 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.705442 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.705460 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.705480 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.705496 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:21Z","lastTransitionTime":"2025-11-22T02:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.706460 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/da6031ac-2dd7-43c9-b9e2-4c6394e23c1f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-v8z92\" (UID: \"da6031ac-2dd7-43c9-b9e2-4c6394e23c1f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v8z92" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.714773 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a581af85a69caa65abe5f7766482dd0b71eaf201d43e354b8316ff616c865193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2d349f9466eb418a64dabae95c139bf23e943909a37b6b651c90e85ff55bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:21Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.723051 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7cbt\" (UniqueName: \"kubernetes.io/projected/da6031ac-2dd7-43c9-b9e2-4c6394e23c1f-kube-api-access-f7cbt\") pod \"ovnkube-control-plane-749d76644c-v8z92\" (UID: \"da6031ac-2dd7-43c9-b9e2-4c6394e23c1f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v8z92" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.747139 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e0fc46c7e9bbed62a72d78126ea7ffcb4de50dc048c8c0fd90bd0b0d8c718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd701fe5abb8c240fb633df69fbda2e7fb1ac4c76195dae8dfd5b1db811c1d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://091e9164e566847ab0605abc3b8f32027bcccff041c48fbf251d28690dcc132d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5603a0776e71297b57ae6a438254d6d8abb317252617fbe3b28707e9f96424d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7854642894dce2837e59481c400522a24a2e66c3f9eb521a370e85b203288af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ceb46477edc2737cbdc850d6387ba175a07e85e69c988f91fc93def44fa1bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b96c28595639547cc0b90244e7855e63664694e
0074005a310637ce4ad5b72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b96c28595639547cc0b90244e7855e63664694e0074005a310637ce4ad5b72e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:53:19Z\\\",\\\"message\\\":\\\"I1122 02:53:19.655290 6237 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1122 02:53:19.655314 6237 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1122 02:53:19.655369 6237 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1122 02:53:19.655379 6237 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1122 02:53:19.655460 6237 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1122 02:53:19.655472 6237 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1122 02:53:19.656061 6237 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1122 02:53:19.656276 6237 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 02:53:19.656355 6237 handler.go:208] Removed *v1.Node event handler 7\\\\nI1122 02:53:19.656291 6237 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1122 02:53:19.656291 6237 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1122 02:53:19.656490 6237 factory.go:656] Stopping watch factory\\\\nI1122 02:53:19.656492 6237 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1122 02:53:19.656527 6237 ovnkube.go:599] Stopped ovnkube\\\\nI1122 02:53:19.656292 6237 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1122 02:53:19.656304 6237 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1122 
02:53:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://183f71d77b7a1768eee27187f0c1e4a893f5661fa42cf671332e081118ae7dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0
d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c7wvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:21Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.766383 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e5537a6-a41f-446d-ab40-ec8c7a3edc90\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaf643c85775379271a26c718e36d87319342e6797522f91726ce4084925f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205de8f1aeca28c7877b83da70915cd216631f5d7c28b0c9f3bb748a8f98547e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b
89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7d163b34cfb34723c07cfa4592a134291185548d4294cb6fcd1d5d8ab63617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9608708be558bf83d1156f620534bd4c98b8f4fb7f96c972be8dc67524f4546c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:21Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.780495 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"402683b1-a29f-4a79-a36c-daf6e8068d0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cf0de2e7bcabf2c1beae2a37caaddb2de2365480082bd95a7843816ec8e3ceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d254b4b131f36e4aae4019c29e8c53f3f8df991b85eb6f943fb6d2e9d79552f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9j6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:21Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.794809 4922 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:21Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.807794 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.807863 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.807875 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.807892 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.807919 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:21Z","lastTransitionTime":"2025-11-22T02:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.812293 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:21Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.826242 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8f0f91018d8ffcc35550e1e567a256d60f9cef7fb07a7c3c21d393e6ff5653f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:21Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.844215 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4gbc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"954bb7b8-d710-4e1a-973e-78c04e685f30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8705bc3652c9a3f838ffdc44bb26f87689d9eca09105427a3dc25b730967ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52z2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4gbc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:21Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.858363 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-df5ll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5672d56-8abd-4aa4-ac8d-1655896397f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a64d211a2e298509a2677a0d64bf8205f1a62e952ee786f34571d4c1e962b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klcgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-df5ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:21Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.859416 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v8z92" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.874046 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:21Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.902568 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8552c75b-ff86-4366-ba7d-31f454e4c4d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095da4ba1e99f9564ae3e7c95e0319b9b03a655b824f73a40f378292bd16a613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c36d97db0f554e895c9a66ab9d98a846dab9ea7fdc6fb1694a1e6ba6e8e03caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5806c02c9ab10a6ae71658c5c26a959a0c1af4dc116eebb10456e842e18bc265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e578ec32c48afdd4b1beffcba79284416d5d38
8027f807e195d552777eb3ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7d8908fd03ef5111feb365e6882679f026d46d0b404e585b0fb248b3ffccc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:21Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.911171 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.911213 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.911223 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.911241 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.911252 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:21Z","lastTransitionTime":"2025-11-22T02:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.921008 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"912136b8-9119-4585-8a8a-fffffe458b82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1364d5e8967ee57d3fe7c3ca85053dbeee65956324a6ff45f53339ee6a380e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1b62ea65c78bb4f19f5fdb2ebed6da4b5554981906058d4d33472e53eecebe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa36ff8bd81008d118d746d10130faac5df3ffdd243309f12b2444acf8e0ac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fe422173d9c9619e9ccbf14fa6b45cd384fb3b32973aa7c32b7597d23818ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1ba37f0d76415fadbf5fe5c3e7b0d0ac1e04923deb445e3ffebaa45f9a3284d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:52:51Z\\\",\\\"message\\\":\\\"W1122 02:52:50.548479 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 02:52:50.548907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763779970 cert, and key in /tmp/serving-cert-1581887030/serving-signer.crt, /tmp/serving-cert-1581887030/serving-signer.key\\\\nI1122 02:52:50.747646 1 observer_polling.go:159] Starting file observer\\\\nW1122 02:52:50.754705 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 02:52:50.754978 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:52:50.756448 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1581887030/tls.crt::/tmp/serving-cert-1581887030/tls.key\\\\\\\"\\\\nF1122 02:52:51.071644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://892025699a91526b7be50d611c0c47a78d080b10f5001683ba5b6614e1168759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:21Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.946483 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c46a871bcc893c29dd8f87750ec890ac870f3f59615c5951f473288a19b460c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:21Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.962557 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ntq2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"087141fd-fc9b-4685-bada-20260ee96369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb4eafebbd378af8579bc424a11b51eeda38398db994f81c408b9bead685713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4fck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ntq2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:21Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.979548 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n664h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"826eb92e-c839-4fba-9737-3b52101fec88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccfd5290a47556c0ac2088ad8d34ab89870aec1d03311431a50dd857a43e5dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0125b85a532a9445c19a61f7bdd47ccd48ee48f067df5c041c6bf98daae9630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0125b85a532a9445c19a61f7bdd47ccd48ee48f067df5c041c6bf98daae9630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n664h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:21Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:21 crc kubenswrapper[4922]: I1122 02:53:21.992185 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v8z92" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da6031ac-2dd7-43c9-b9e2-4c6394e23c1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7cbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7cbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v8z92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:21Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.012943 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a581af85a69caa65abe5f7766482dd0b71eaf201d43e354b8316ff616c865193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2d349f9466eb418a64dabae95c139bf23e943909a37b6b651c90e85ff55bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:22Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.014344 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.014446 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.014463 4922 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.015017 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.015041 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:22Z","lastTransitionTime":"2025-11-22T02:53:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.032099 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e5537a6-a41f-446d-ab40-ec8c7a3edc90\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaf643c85775379271a26c718e36d87319342e6797522f91726ce4084925f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205de8f1aeca28c7877b83da70915cd216631f5d7c28b0c9f3bb748a8f98547e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7d163b34cfb34723c07cfa4592a134291185548d4294cb6fcd1d5d8ab636
17\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9608708be558bf83d1156f620534bd4c98b8f4fb7f96c972be8dc67524f4546c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:22Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.052943 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"402683b1-a29f-4a79-a36c-daf6e8068d0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cf0de2e7bcabf2c1beae2a37caaddb2de2365480082bd95a7843816ec8e3ceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d254b4b131f36e4aae4019c29e8c53f3f8df991b85eb6f943fb6d2e9d79552f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9j6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:22Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.081196 4922 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e0fc46c7e9bbed62a72d78126ea7ffcb4de50dc048c8c0fd90bd0b0d8c718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd701fe5abb8c240fb633df69fbda2e7fb1ac4c76195dae8dfd5b1db811c1d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://091e9164e566847ab0605abc3b8f32027bcccff041c48fbf251d28690dcc132d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5603a0776e71297b57ae6a438254d6d8abb317252617fbe3b28707e9f96424d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7854642894dce2837e59481c400522a24a2e66c3f9eb521a370e85b203288af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ceb46477edc2737cbdc850d6387ba175a07e85e69c988f91fc93def44fa1bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ae17a6779a8d3de7ab3da7eb5399094b7fde374d0b866b515c9768de61ed89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b96c28595639547cc0b90244e7855e63664694e0074005a310637ce4ad5b72e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:53:19Z\\\",\\\"message\\\":\\\"I1122 02:53:19.655290 6237 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1122 02:53:19.655314 6237 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1122 02:53:19.655369 6237 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1122 02:53:19.655379 6237 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1122 02:53:19.655460 6237 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1122 02:53:19.655472 6237 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1122 02:53:19.656061 6237 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1122 02:53:19.656276 6237 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 02:53:19.656355 6237 handler.go:208] Removed *v1.Node event handler 7\\\\nI1122 02:53:19.656291 6237 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1122 02:53:19.656291 6237 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1122 02:53:19.656490 6237 factory.go:656] Stopping watch factory\\\\nI1122 02:53:19.656492 6237 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1122 02:53:19.656527 6237 ovnkube.go:599] Stopped ovnkube\\\\nI1122 02:53:19.656292 6237 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1122 02:53:19.656304 6237 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1122 
02:53:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://183f71d77b7a1768eee27187f0c1e4a893f5661fa42cf671332e081118ae7dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\
\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c7wvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:22Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.100073 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:22Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.118266 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.118312 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.118323 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.118342 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.118353 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:22Z","lastTransitionTime":"2025-11-22T02:53:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.119629 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:22Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.136604 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:22Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.151821 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8f0f91018d8ffcc35550e1e567a256d60f9cef7fb07a7c3c21d393e6ff5653f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:22Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.166549 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4gbc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"954bb7b8-d710-4e1a-973e-78c04e685f30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8705bc3652c9a3f838ffdc44bb26f87689d9eca09105427a3dc25b730967ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-52z2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4gbc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:22Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.182248 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-df5ll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5672d56-8abd-4aa4-ac8d-1655896397f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a64d211a2e298509a2677a0d64bf8205f1a62e952ee786f34571d4c1e962b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klcgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-df5ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:22Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.221250 4922 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.221300 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.221312 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.221330 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.221342 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:22Z","lastTransitionTime":"2025-11-22T02:53:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.299681 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:53:22 crc kubenswrapper[4922]: E1122 02:53:22.299805 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.324481 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.324520 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.324530 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.324545 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.324555 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:22Z","lastTransitionTime":"2025-11-22T02:53:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.430237 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.430298 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.430314 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.430341 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.430356 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:22Z","lastTransitionTime":"2025-11-22T02:53:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.533945 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.534396 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.534415 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.534436 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.534451 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:22Z","lastTransitionTime":"2025-11-22T02:53:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.638355 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.638424 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.638442 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.638469 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.638488 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:22Z","lastTransitionTime":"2025-11-22T02:53:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.701081 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v8z92" event={"ID":"da6031ac-2dd7-43c9-b9e2-4c6394e23c1f","Type":"ContainerStarted","Data":"0a6ba782e23e8c6a485ec4c29d9adefe33028022db04c1a66d20e49a73c40a79"} Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.701158 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v8z92" event={"ID":"da6031ac-2dd7-43c9-b9e2-4c6394e23c1f","Type":"ContainerStarted","Data":"f9c2863c3dce12e19530e9ec5d083788e8426a5f34a325c4a8b5dcbb29b454ef"} Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.701179 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v8z92" event={"ID":"da6031ac-2dd7-43c9-b9e2-4c6394e23c1f","Type":"ContainerStarted","Data":"d323274c8b65c17a517632cee7a6d40378facb9bdf88cbdb4157e280479a4439"} Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.704149 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c7wvg_c2a6bcd8-bb13-463b-b112-0df3cf90b5f7/ovnkube-controller/1.log" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.705631 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c7wvg_c2a6bcd8-bb13-463b-b112-0df3cf90b5f7/ovnkube-controller/0.log" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.717251 4922 generic.go:334] "Generic (PLEG): container finished" podID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerID="81ae17a6779a8d3de7ab3da7eb5399094b7fde374d0b866b515c9768de61ed89" exitCode=1 Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.717327 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" event={"ID":"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7","Type":"ContainerDied","Data":"81ae17a6779a8d3de7ab3da7eb5399094b7fde374d0b866b515c9768de61ed89"} Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.717440 4922 scope.go:117] "RemoveContainer" containerID="1b96c28595639547cc0b90244e7855e63664694e0074005a310637ce4ad5b72e" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.718278 4922 scope.go:117] "RemoveContainer" containerID="81ae17a6779a8d3de7ab3da7eb5399094b7fde374d0b866b515c9768de61ed89" Nov 22 02:53:22 crc kubenswrapper[4922]: E1122 02:53:22.718460 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-c7wvg_openshift-ovn-kubernetes(c2a6bcd8-bb13-463b-b112-0df3cf90b5f7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" podUID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.726515 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a581af85a69caa65abe5f7766482dd0b71eaf201d43e354b8316ff616c865193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2d349f9466eb418a64dabae95c139bf23e943909a37b6b651c90e85ff55bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:22Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.742397 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.742466 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.742488 4922 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.742519 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.742538 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:22Z","lastTransitionTime":"2025-11-22T02:53:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.746163 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e5537a6-a41f-446d-ab40-ec8c7a3edc90\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaf643c85775379271a26c718e36d87319342e6797522f91726ce4084925f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205de8f1aeca28c7877b83da70915cd216631f5d7c28b0c9f3bb748a8f98547e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7d163b34cfb34723c07cfa4592a134291185548d4294cb6fcd1d5d8ab63617\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9608708be558bf83d1156f620534bd4c98b8f4fb7f96c972be8dc67524f4546c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:22Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.764220 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"402683b1-a29f-4a79-a36c-daf6e8068d0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cf0de2e7bcabf2c1beae2a37caaddb2de2365480082bd95a7843816ec8e3ceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d254b4b131f36e4aae4019c29e8c53f3f8df991b85eb6f943fb6d2e9d79552f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9j6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:22Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.790377 4922 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e0fc46c7e9bbed62a72d78126ea7ffcb4de50dc048c8c0fd90bd0b0d8c718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd701fe5abb8c240fb633df69fbda2e7fb1ac4c76195dae8dfd5b1db811c1d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://091e9164e566847ab0605abc3b8f32027bcccff041c48fbf251d28690dcc132d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5603a0776e71297b57ae6a438254d6d8abb317252617fbe3b28707e9f96424d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7854642894dce2837e59481c400522a24a2e66c3f9eb521a370e85b203288af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ceb46477edc2737cbdc850d6387ba175a07e85e69c988f91fc93def44fa1bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ae17a6779a8d3de7ab3da7eb5399094b7fde374d0b866b515c9768de61ed89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b96c28595639547cc0b90244e7855e63664694e0074005a310637ce4ad5b72e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:53:19Z\\\",\\\"message\\\":\\\"I1122 02:53:19.655290 6237 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1122 02:53:19.655314 6237 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1122 02:53:19.655369 6237 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1122 02:53:19.655379 6237 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1122 02:53:19.655460 6237 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1122 02:53:19.655472 6237 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1122 02:53:19.656061 6237 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1122 02:53:19.656276 6237 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 02:53:19.656355 6237 handler.go:208] Removed *v1.Node event handler 7\\\\nI1122 02:53:19.656291 6237 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1122 02:53:19.656291 6237 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1122 02:53:19.656490 6237 factory.go:656] Stopping watch factory\\\\nI1122 02:53:19.656492 6237 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1122 02:53:19.656527 6237 ovnkube.go:599] Stopped ovnkube\\\\nI1122 02:53:19.656292 6237 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1122 02:53:19.656304 6237 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1122 
02:53:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://183f71d77b7a1768eee27187f0c1e4a893f5661fa42cf671332e081118ae7dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\
\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c7wvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:22Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.812338 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4gbc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"954bb7b8-d710-4e1a-973e-78c04e685f30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8705bc3652c9a3f838ffdc44bb26f87689d9eca09105427a3dc25b730967ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52z2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4gbc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:22Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.831343 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-df5ll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5672d56-8abd-4aa4-ac8d-1655896397f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a64d211a2e298509a2677a0d64bf8205f1a62e952ee786f34571d4c1e962b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klcgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-df5ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:22Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.846206 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.846253 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.846268 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:22 crc 
kubenswrapper[4922]: I1122 02:53:22.846288 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.846304 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:22Z","lastTransitionTime":"2025-11-22T02:53:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.850251 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:22Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.873727 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:22Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.892945 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:22Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.906970 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8f0f91018d8ffcc35550e1e567a256d60f9cef7fb07a7c3c21d393e6ff5653f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:22Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.922936 4922 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v8z92" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da6031ac-2dd7-43c9-b9e2-4c6394e23c1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9c2863c3dce12e19530e9ec5d083788e8426a5f34a325c4a8b5dcbb29b454ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7cbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a6ba782e23e8c6a485ec4c29d9adefe33028022db04c1a66d20e49a73c40a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7cbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v8z92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-22T02:53:22Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.950031 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.950087 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.950104 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.950129 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.950147 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:22Z","lastTransitionTime":"2025-11-22T02:53:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.959143 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8552c75b-ff86-4366-ba7d-31f454e4c4d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095da4ba1e99f9564ae3e7c95e0319b9b03a655b824f73a40f378292bd16a613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c36d97db0f554e895c9a66ab9d98a846dab9ea7fdc6fb1694a1e6ba6e8e03caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5806c02c9ab10a6ae71658c5c26a959a0c1af4dc116eebb10456e842e18bc265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e578ec32c48afdd4b1beffcba79284416d5d388027f807e195d552777eb3ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7d8908fd03ef5111feb365e6882679f026d46d0b404e585b0fb248b3ffccc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:22Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:22 crc kubenswrapper[4922]: I1122 02:53:22.984239 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"912136b8-9119-4585-8a8a-fffffe458b82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1364d5e8967ee57d3fe7c3ca85053dbeee65956324a6ff45f53339ee6a380e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1b62ea65c78bb4f19f5fdb2ebed6da4b5554981906058d4d33472e53eecebe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa36ff8bd81008d118d746d10130faac5df3ffdd243309f12b2444acf8e0ac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fe422173d9c9619e9ccbf14fa6b45cd384fb3b32973aa7c32b7597d23818ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1ba37f0d76415fadbf5fe5c3e7b0d0ac1e04923deb445e3ffebaa45f9a3284d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:52:51Z\\\",\\\"message\\\":\\\"W1122 02:52:50.548479 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 02:52:50.548907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763779970 cert, and key in /tmp/serving-cert-1581887030/serving-signer.crt, /tmp/serving-cert-1581887030/serving-signer.key\\\\nI1122 02:52:50.747646 1 observer_polling.go:159] Starting file observer\\\\nW1122 02:52:50.754705 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 02:52:50.754978 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:52:50.756448 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1581887030/tls.crt::/tmp/serving-cert-1581887030/tls.key\\\\\\\"\\\\nF1122 02:52:51.071644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://892025699a91526b7be50d611c0c47a78d080b10f5001683ba5b6614e1168759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:22Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.007184 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c46a871bcc893c29dd8f87750ec890ac870f3f59615c5951f473288a19b460c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.009699 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:53:23 crc kubenswrapper[4922]: E1122 02:53:23.010102 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:53:39.010037083 +0000 UTC m=+55.048559025 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.010206 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:53:23 crc kubenswrapper[4922]: E1122 02:53:23.010311 4922 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 02:53:23 crc kubenswrapper[4922]: E1122 02:53:23.010390 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 02:53:39.010368272 +0000 UTC m=+55.048890174 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.022015 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ntq2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"087141fd-fc9b-4685-bada-20260ee96369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb4eafebbd378af8579bc424a11b51eeda38398db994f81c408b9bead685713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4fck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ntq2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.041629 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n664h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"826eb92e-c839-4fba-9737-3b52101fec88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccfd5290a47556c0ac2088ad8d34ab89870aec1d03311431a50dd857a43e5dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0125b85a532a9445c19a61f7bdd47ccd48ee48f067df5c041c6bf98daae9630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0125b85a532a9445c19a61f7bdd47ccd48ee48f067df5c041c6bf98daae9630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n664h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.054067 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.054146 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:23 crc 
kubenswrapper[4922]: I1122 02:53:23.054160 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.054181 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.054195 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:23Z","lastTransitionTime":"2025-11-22T02:53:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.062856 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a581af85a69caa65abe5f7766482dd0b71eaf201d43e354b8316ff616c865193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2d349f9466eb418a64dabae95c139bf23e943909a37b6b651c90e85ff55bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.073640 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-2gmkj"] Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.074429 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2gmkj" Nov 22 02:53:23 crc kubenswrapper[4922]: E1122 02:53:23.074524 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2gmkj" podUID="d5c8000a-a783-474f-a73a-55814c257a02" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.082203 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e5537a6-a41f-446d-ab40-ec8c7a3edc90\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaf643c85775379271a26c718e36d87319342e6797522f91726ce4084925f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205de8f1aeca28c7877b83da70915cd216631f5d7c28b0c9f3bb748a8f98547e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7d163b34cfb34723c07cfa4592a134291185548d4294cb6fcd1d5d8ab63617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9608708be558bf83d1156f620534bd4c98b8f4fb7f96c972be8dc67524f4546c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.102819 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"402683b1-a29f-4a79-a36c-daf6e8068d0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cf0de2e7bcabf2c1beae2a37caaddb2de2365480082bd95a7843816ec8e3ceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d254b4b131f36e4aae4019c29e8c53f3f8df991b85eb6f943fb6d2e9d79552f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9j6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.111305 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d5c8000a-a783-474f-a73a-55814c257a02-metrics-certs\") pod \"network-metrics-daemon-2gmkj\" (UID: \"d5c8000a-a783-474f-a73a-55814c257a02\") " pod="openshift-multus/network-metrics-daemon-2gmkj" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.111356 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pk97\" (UniqueName: \"kubernetes.io/projected/d5c8000a-a783-474f-a73a-55814c257a02-kube-api-access-8pk97\") pod \"network-metrics-daemon-2gmkj\" (UID: \"d5c8000a-a783-474f-a73a-55814c257a02\") " pod="openshift-multus/network-metrics-daemon-2gmkj" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.111393 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.111424 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.111463 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:53:23 crc kubenswrapper[4922]: E1122 02:53:23.111598 4922 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 02:53:23 crc kubenswrapper[4922]: E1122 02:53:23.111651 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 02:53:39.111635696 +0000 UTC m=+55.150157598 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 02:53:23 crc kubenswrapper[4922]: E1122 02:53:23.111645 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 02:53:23 crc kubenswrapper[4922]: E1122 02:53:23.111713 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 02:53:23 crc kubenswrapper[4922]: E1122 02:53:23.111735 4922 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 02:53:23 crc kubenswrapper[4922]: E1122 02:53:23.111745 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 02:53:23 crc kubenswrapper[4922]: E1122 02:53:23.111770 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 02:53:23 crc kubenswrapper[4922]: E1122 02:53:23.111785 4922 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 02:53:23 crc kubenswrapper[4922]: E1122 02:53:23.111851 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-22 02:53:39.11180154 +0000 UTC m=+55.150323472 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 02:53:23 crc kubenswrapper[4922]: E1122 02:53:23.111947 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-22 02:53:39.111926143 +0000 UTC m=+55.150448195 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.128449 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e0fc46c7e9bbed62a72d78126ea7ffcb4de50dc048c8c0fd90bd0b0d8c718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd701fe5abb8c240fb633df69fbda2e7fb1ac4c76195dae8dfd5b1db811c1d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"
},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://091e9164e566847ab0605abc3b8f32027bcccff041c48fbf251d28690dcc132d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5603a0776e71297b57ae6a438254d6d8abb317252617fbe3b28707e9f96424d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7854642894dce2837e59481c400522a24a2e66c3f9eb521a370e85b203288af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ceb46477edc2737cbdc850d6387ba175a07e85e69c988f91fc93def44fa1bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ae17a6779a8d3de7ab3da7eb5399094b7fde374d0b866b515c9768de61ed89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b96c28595639547cc0b90244e7855e63664694e0074005a310637ce4ad5b72e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:53:19Z\\\",\\\"message\\\":\\\"I1122 02:53:19.655290 6237 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1122 02:53:19.655314 6237 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1122 02:53:19.655369 6237 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1122 02:53:19.655379 6237 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1122 02:53:19.655460 6237 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1122 02:53:19.655472 6237 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1122 02:53:19.656061 6237 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1122 02:53:19.656276 6237 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 02:53:19.656355 6237 handler.go:208] Removed *v1.Node event handler 7\\\\nI1122 02:53:19.656291 6237 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1122 02:53:19.656291 6237 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1122 02:53:19.656490 6237 factory.go:656] Stopping watch factory\\\\nI1122 02:53:19.656492 6237 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1122 02:53:19.656527 6237 ovnkube.go:599] Stopped ovnkube\\\\nI1122 02:53:19.656292 6237 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1122 02:53:19.656304 6237 handler.go:208] Removed *v1.EgressIP event handler 
8\\\\nI1122 02:53:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ae17a6779a8d3de7ab3da7eb5399094b7fde374d0b866b515c9768de61ed89\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"message\\\":\\\"where column _uuid == {6ea1fd71-2b40-4361-92ee-3f1ab4ec7414}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1122 02:53:22.196651 6374 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_TCP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53: 10.217.4.10:9154:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {be9dcc9e-c16a-4962-a6d2-4adeb0b929c4}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_UDP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[udp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4c1be812-05d3-4f45-91b5-a853a5c8de71}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e 
UUID\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://183f71d77b7a1768eee27187f0c1e4a893f5661fa42cf671332e081118ae7dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d
2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c7wvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.146034 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.157042 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.157122 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.157141 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.157171 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.157190 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:23Z","lastTransitionTime":"2025-11-22T02:53:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.166933 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.184153 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.202941 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8f0f91018d8ffcc35550e1e567a256d60f9cef7fb07a7c3c21d393e6ff5653f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.212815 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pk97\" (UniqueName: \"kubernetes.io/projected/d5c8000a-a783-474f-a73a-55814c257a02-kube-api-access-8pk97\") pod \"network-metrics-daemon-2gmkj\" (UID: \"d5c8000a-a783-474f-a73a-55814c257a02\") " pod="openshift-multus/network-metrics-daemon-2gmkj" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.212973 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d5c8000a-a783-474f-a73a-55814c257a02-metrics-certs\") pod \"network-metrics-daemon-2gmkj\" (UID: \"d5c8000a-a783-474f-a73a-55814c257a02\") " pod="openshift-multus/network-metrics-daemon-2gmkj" Nov 22 02:53:23 crc kubenswrapper[4922]: E1122 02:53:23.213137 4922 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 02:53:23 crc kubenswrapper[4922]: E1122 02:53:23.213207 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5c8000a-a783-474f-a73a-55814c257a02-metrics-certs podName:d5c8000a-a783-474f-a73a-55814c257a02 nodeName:}" failed. No retries permitted until 2025-11-22 02:53:23.713187148 +0000 UTC m=+39.751709050 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d5c8000a-a783-474f-a73a-55814c257a02-metrics-certs") pod "network-metrics-daemon-2gmkj" (UID: "d5c8000a-a783-474f-a73a-55814c257a02") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.219040 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4gbc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"954bb7b8-d710-4e1a-973e-78c04e685f30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8705bc3652c9a3f838ffdc44bb26f87689d9eca09105427a3dc25b730967ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52z2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4gbc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.232678 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-df5ll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5672d56-8abd-4aa4-ac8d-1655896397f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a64d211a2e298509a2677a0d64bf8205f1a62e952ee786f34571d4c1e962b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klcgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-df5ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.242473 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pk97\" (UniqueName: \"kubernetes.io/projected/d5c8000a-a783-474f-a73a-55814c257a02-kube-api-access-8pk97\") pod \"network-metrics-daemon-2gmkj\" (UID: \"d5c8000a-a783-474f-a73a-55814c257a02\") " pod="openshift-multus/network-metrics-daemon-2gmkj" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.257116 4922 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8552c75b-ff86-4366-ba7d-31f454e4c4d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095da4ba1e99f9564ae3e7c95e0319b9b03a655b824f73a40f378292bd16a613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c36d97db0f554e895c9a66ab9d98a846dab9ea7fdc6fb1694a1e6ba6e8e03caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5806c02c9ab10a6ae71658c5c26a959a0c1af4dc116eebb10456e842e18bc265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"container
ID\\\":\\\"cri-o://7e578ec32c48afdd4b1beffcba79284416d5d388027f807e195d552777eb3ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7d8908fd03ef5111feb365e6882679f026d46d0b404e585b0fb248b3ffccc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f89893174d
64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.260910 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.260981 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.261003 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.261029 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.261047 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:23Z","lastTransitionTime":"2025-11-22T02:53:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.273633 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"912136b8-9119-4585-8a8a-fffffe458b82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1364d5e8967ee57d3fe7c3ca85053dbeee65956324a6ff45f53339ee6a380e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1b62ea65c78bb4f19f5fdb2ebed6da4b5554981906058d4d33472e53eecebe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa36ff8bd81008d118d746d10130faac5df3ffdd243309f12b2444acf8e0ac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fe422173d9c9619e9ccbf14fa6b45cd384fb3b32973aa7c32b7597d23818ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1ba37f0d76415fadbf5fe5c3e7b0d0ac1e04923deb445e3ffebaa45f9a3284d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:52:51Z\\\",\\\"message\\\":\\\"W1122 02:52:50.548479 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 02:52:50.548907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763779970 cert, and key in /tmp/serving-cert-1581887030/serving-signer.crt, /tmp/serving-cert-1581887030/serving-signer.key\\\\nI1122 02:52:50.747646 1 observer_polling.go:159] Starting file observer\\\\nW1122 02:52:50.754705 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 02:52:50.754978 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:52:50.756448 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1581887030/tls.crt::/tmp/serving-cert-1581887030/tls.key\\\\\\\"\\\\nF1122 02:52:51.071644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://892025699a91526b7be50d611c0c47a78d080b10f5001683ba5b6614e1168759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.295899 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c46a871bcc893c29dd8f87750ec890ac870f3f59615c5951f473288a19b460c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.300287 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:53:23 crc kubenswrapper[4922]: E1122 02:53:23.300518 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.300542 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:53:23 crc kubenswrapper[4922]: E1122 02:53:23.301085 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.311558 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ntq2p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"087141fd-fc9b-4685-bada-20260ee96369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb4eafebbd378af8579bc424a11b51eeda38398db994f81c408b9bead685713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4fck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ntq2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.333199 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n664h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"826eb92e-c839-4fba-9737-3b52101fec88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccfd5290a47556c0ac2088ad8d34ab89870aec1d03311431a50dd857a43e5dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0125b85a532a9445c19a61f7bdd47ccd48ee48f067df5c041c6bf98daae9630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0125b85a532a9445c19a61f7bdd47ccd48ee48f067df5c041c6bf98daae9630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n664h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.348190 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v8z92" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da6031ac-2dd7-43c9-b9e2-4c6394e23c1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9c2863c3dce12e19530e9ec5d083788e8426a5f34a325c4a8b5dcbb29b454ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7cbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a6ba782e23e8c6a485ec4c29d9adefe33028022db04c1a66d20e49a73c40a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7cbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v8z92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:23Z is after 2025-08-24T17:21:41Z" Nov 22 
02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.363883 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.363914 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.363924 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.363941 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.363950 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:23Z","lastTransitionTime":"2025-11-22T02:53:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.372144 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e5537a6-a41f-446d-ab40-ec8c7a3edc90\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaf643c85775379271a26c718e36d87319342e6797522f91726ce4084925f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205de8f1aeca28c7877b83da70915cd216631f5d7c28b0c9f3bb748a8f98547e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7d163b34cfb34723c07cfa4592a134291185548d4294cb6fcd1d5d8ab63617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9608708be558bf83d1156f620534bd4c98b8f4fb7f96c972be8dc67524f4546c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.389006 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"402683b1-a29f-4a79-a36c-daf6e8068d0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cf0de2e7bcabf2c1beae2a37caaddb2de2365480082bd95a7843816ec8e3ceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d254b4b131f36e4aae4019c29e8c53f3f8df991b85eb6f943fb6d2e9d79552f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9j6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.412953 4922 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e0fc46c7e9bbed62a72d78126ea7ffcb4de50dc048c8c0fd90bd0b0d8c718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd701fe5abb8c240fb633df69fbda2e7fb1ac4c76195dae8dfd5b1db811c1d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://091e9164e566847ab0605abc3b8f32027bcccff041c48fbf251d28690dcc132d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5603a0776e71297b57ae6a438254d6d8abb317252617fbe3b28707e9f96424d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7854642894dce2837e59481c400522a24a2e66c3f9eb521a370e85b203288af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ceb46477edc2737cbdc850d6387ba175a07e85e69c988f91fc93def44fa1bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ae17a6779a8d3de7ab3da7eb5399094b7fde374d0b866b515c9768de61ed89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b96c28595639547cc0b90244e7855e63664694e0074005a310637ce4ad5b72e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:53:19Z\\\",\\\"message\\\":\\\"I1122 02:53:19.655290 6237 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1122 02:53:19.655314 6237 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1122 02:53:19.655369 6237 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1122 02:53:19.655379 6237 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1122 02:53:19.655460 6237 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1122 02:53:19.655472 6237 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1122 02:53:19.656061 6237 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1122 02:53:19.656276 6237 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 02:53:19.656355 6237 handler.go:208] Removed *v1.Node event handler 7\\\\nI1122 02:53:19.656291 6237 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1122 02:53:19.656291 6237 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1122 02:53:19.656490 6237 factory.go:656] Stopping watch factory\\\\nI1122 02:53:19.656492 6237 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1122 02:53:19.656527 6237 ovnkube.go:599] Stopped ovnkube\\\\nI1122 02:53:19.656292 6237 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1122 02:53:19.656304 6237 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1122 02:53:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ae17a6779a8d3de7ab3da7eb5399094b7fde374d0b866b515c9768de61ed89\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"message\\\":\\\"where column _uuid == {6ea1fd71-2b40-4361-92ee-3f1ab4ec7414}] Until: 
Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1122 02:53:22.196651 6374 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_TCP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53: 10.217.4.10:9154:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {be9dcc9e-c16a-4962-a6d2-4adeb0b929c4}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_UDP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[udp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4c1be812-05d3-4f45-91b5-a853a5c8de71}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]},{\\\"containerID\\\":\\\"cri-o://183f71d77b7a1768eee27187f0c1e4a893f5661fa42cf671332e081118ae7dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c7wvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.428082 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.447123 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8f0f91018d8ffcc35550e1e567a256d60f9cef7fb07a7c3c21d393e6ff5653f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.467959 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.468059 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.468075 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.468097 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.468114 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:23Z","lastTransitionTime":"2025-11-22T02:53:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.473363 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4gbc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"954bb7b8-d710-4e1a-973e-78c04e685f30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8705bc3652c9a3f838ffdc44bb26f87689d9eca09105427a3dc25b730967ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52z2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4gbc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.487464 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-df5ll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5672d56-8abd-4aa4-ac8d-1655896397f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a64d211a2e298509a2677a0d64bf8205f1a62e952ee786f34571d4c1e962b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klcgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-df5ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.504605 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2gmkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c8000a-a783-474f-a73a-55814c257a02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pk97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pk97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2gmkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.525618 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.551956 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.571265 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.571303 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.571316 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.571335 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.571350 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:23Z","lastTransitionTime":"2025-11-22T02:53:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.572148 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ntq2p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"087141fd-fc9b-4685-bada-20260ee96369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb4eafebbd378af8579bc424a11b51eeda38398db994f81c408b9bead685713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4fck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ntq2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.596702 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n664h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"826eb92e-c839-4fba-9737-3b52101fec88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccfd5290a47556c0ac2088ad8d34ab89870aec1d03311431a50dd857a43e5dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0125b85a532a9445c19a61f7bdd47ccd48ee48f067df5c041c6bf98daae9630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0125b85a532a9445c19a61f7bdd47ccd48ee48f067df5c041c6bf98daae9630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n664h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.616067 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v8z92" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da6031ac-2dd7-43c9-b9e2-4c6394e23c1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9c2863c3dce12e19530e9ec5d083788e8426a5f34a325c4a8b5dcbb29b454ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7cbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a6ba782e23e8c6a485ec4c29d9adefe33028022db04c1a66d20e49a73c40a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7cbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v8z92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:23Z is after 2025-08-24T17:21:41Z" Nov 22 
02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.651202 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8552c75b-ff86-4366-ba7d-31f454e4c4d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095da4ba1e99f9564ae3e7c95e0319b9b03a655b824f73a40f378292bd16a613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c36d97db0f554e895c9a66ab9d98a846dab9ea7fdc6fb1694a1e6ba6e8e03caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5806c02c9ab10a6ae71658c5c26a959a0c1af4dc116eebb10456e842e18bc265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"lo
g-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e578ec32c48afdd4b1beffcba79284416d5d388027f807e195d552777eb3ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7d8908fd03ef5111feb365e6882679f026d46d0b404e585b0fb248b3ffccc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"reas
on\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.671147 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"912136b8-9119-4585-8a8a-fffffe458b82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1364d5e8967ee57d3fe7c3ca85053dbeee65956324a6ff45f53339ee6a380e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1b62ea65c78bb4f19f5fdb2ebed6da4b5554981906058d4d33472e53eecebe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa36ff8bd81008d118d746d10130faac5df3ffdd243309f12b2444acf8e0ac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fe422173d9c9619e9ccbf14fa6b45cd384fb3b32973aa7c32b7597d23818ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1ba37f0d76415fadbf5fe5c3e7b0d0ac1e04923deb445e3ffebaa45f9a3284d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:52:51Z\\\",\\\"message\\\":\\\"W1122 02:52:50.548479 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 02:52:50.548907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763779970 cert, and key in /tmp/serving-cert-1581887030/serving-signer.crt, /tmp/serving-cert-1581887030/serving-signer.key\\\\nI1122 02:52:50.747646 1 observer_polling.go:159] Starting file observer\\\\nW1122 02:52:50.754705 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 02:52:50.754978 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:52:50.756448 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1581887030/tls.crt::/tmp/serving-cert-1581887030/tls.key\\\\\\\"\\\\nF1122 02:52:51.071644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://892025699a91526b7be50d611c0c47a78d080b10f5001683ba5b6614e1168759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.674556 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.674601 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.674614 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.674636 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.674654 4922 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:23Z","lastTransitionTime":"2025-11-22T02:53:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.694935 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c46a871bcc893c29dd8f87750ec890ac870f3f59615c5951f473288a19b460c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.718324 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d5c8000a-a783-474f-a73a-55814c257a02-metrics-certs\") pod \"network-metrics-daemon-2gmkj\" (UID: \"d5c8000a-a783-474f-a73a-55814c257a02\") " pod="openshift-multus/network-metrics-daemon-2gmkj" Nov 22 02:53:23 crc kubenswrapper[4922]: E1122 02:53:23.718466 4922 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 02:53:23 crc kubenswrapper[4922]: E1122 02:53:23.718532 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5c8000a-a783-474f-a73a-55814c257a02-metrics-certs 
podName:d5c8000a-a783-474f-a73a-55814c257a02 nodeName:}" failed. No retries permitted until 2025-11-22 02:53:24.718511576 +0000 UTC m=+40.757033478 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d5c8000a-a783-474f-a73a-55814c257a02-metrics-certs") pod "network-metrics-daemon-2gmkj" (UID: "d5c8000a-a783-474f-a73a-55814c257a02") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.720328 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a581af85a69caa65abe5f7766482dd0b71eaf201d43e354b8316ff616c865193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2d349f9466eb418a64dabae95c139bf23e943909a37b6b651c90e85ff55bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.723837 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c7wvg_c2a6bcd8-bb13-463b-b112-0df3cf90b5f7/ovnkube-controller/1.log" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.728239 4922 scope.go:117] "RemoveContainer" containerID="81ae17a6779a8d3de7ab3da7eb5399094b7fde374d0b866b515c9768de61ed89" Nov 22 02:53:23 crc kubenswrapper[4922]: E1122 02:53:23.728414 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-c7wvg_openshift-ovn-kubernetes(c2a6bcd8-bb13-463b-b112-0df3cf90b5f7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" podUID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.746740 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e5537a6-a41f-446d-ab40-ec8c7a3edc90\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaf643c85775379271a26c718e36d87319342e6797522f91726ce4084925f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205de8f1aeca28c7877b83da70915cd216631f5d7c28b0c9f3bb748a8f98547e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7d163b34cfb34723c07cfa4592a134291185548d4294cb6fcd1d5d8ab63617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9608708be558bf83d1156f620534bd4c98b8f4fb7f96c972be8dc67524f4546c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.773694 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"402683b1-a29f-4a79-a36c-daf6e8068d0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cf0de2e7bcabf2c1beae2a37caaddb2de2365480082bd95a7843816ec8e3ceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d254b4b131f36e4aae4019c29e8c53f3f8df991b85eb6f943fb6d2e9d79552f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9j6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.778067 4922 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.778178 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.778200 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.778234 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.778254 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:23Z","lastTransitionTime":"2025-11-22T02:53:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.810003 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e0fc46c7e9bbed62a72d78126ea7ffcb4de50dc048c8c0fd90bd0b0d8c718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd701fe5abb8c240fb633df69fbda2e7fb1ac4c76195dae8dfd5b1db811c1d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://091e9164e566847ab0605abc3b8f32027bcccff041c48fbf251d28690dcc132d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5603a0776e71297b57ae6a438254d6d8abb317252617fbe3b28707e9f96424d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7854642894dce2837e59481c400522a24a2e66c3f9eb521a370e85b203288af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ceb46477edc2737cbdc850d6387ba175a07e85e69c988f91fc93def44fa1bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ae17a6779a8d3de7ab3da7eb5399094b7fde37
4d0b866b515c9768de61ed89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ae17a6779a8d3de7ab3da7eb5399094b7fde374d0b866b515c9768de61ed89\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"message\\\":\\\"where column _uuid == {6ea1fd71-2b40-4361-92ee-3f1ab4ec7414}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1122 02:53:22.196651 6374 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_TCP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53: 10.217.4.10:9154:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {be9dcc9e-c16a-4962-a6d2-4adeb0b929c4}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_UDP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[udp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4c1be812-05d3-4f45-91b5-a853a5c8de71}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-c7wvg_openshift-ovn-kubernetes(c2a6bcd8-bb13-463b-b112-0df3cf90b5f7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://183f71d77b7a1768eee27187f0c1e4a893f5661fa42cf671332e081118ae7dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c7wvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.828551 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2gmkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c8000a-a783-474f-a73a-55814c257a02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pk97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pk97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2gmkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.847756 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.866177 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.881342 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.881441 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.881471 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.881515 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.881540 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:23Z","lastTransitionTime":"2025-11-22T02:53:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.882140 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.895805 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8f0f91018d8ffcc35550e1e567a256d60f9cef7fb07a7c3c21d393e6ff5653f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.910356 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4gbc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"954bb7b8-d710-4e1a-973e-78c04e685f30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8705bc3652c9a3f838ffdc44bb26f87689d9eca09105427a3dc25b730967ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52z2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4gbc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.923651 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-df5ll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5672d56-8abd-4aa4-ac8d-1655896397f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a64d211a2e298509a2677a0d64bf8205f1a62e952ee786f34571d4c1e962b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klcgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-df5ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.948046 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8552c75b-ff86-4366-ba7d-31f454e4c4d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095da4ba1e99f9564ae3e7c95e0319b9b03a655b824f73a40f378292bd16a613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c36d97db0f554e895c9a66ab9d98a846dab9ea7fdc6fb1694a1e6ba6e8e03caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5806c02c9ab10a6ae71658c5c26a959a0c1af4dc116eebb10456e842e18bc265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e578ec32c48afdd4b1beffcba79284416d5d38
8027f807e195d552777eb3ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7d8908fd03ef5111feb365e6882679f026d46d0b404e585b0fb248b3ffccc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.976542 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"912136b8-9119-4585-8a8a-fffffe458b82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1364d5e8967ee57d3fe7c3ca85053dbeee65956324a6ff45f53339ee6a380e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1b62ea65c78bb4f19f5fdb2ebed6da4
b5554981906058d4d33472e53eecebe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa36ff8bd81008d118d746d10130faac5df3ffdd243309f12b2444acf8e0ac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fe422173d9c9619e9ccbf14fa6b45cd384fb3b32973aa7c32b7597d23818ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1ba37f0d76415fadbf5fe5c3e7b0d0ac1e04923deb445e3ffebaa45f9a3284d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:52:51Z\\\",\\\"message\\\":\\\"W1122 02:52:50.548479 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 02:52:50.548907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763779970 cert, and key in /tmp/serving-cert-1581887030/serving-signer.crt, /tmp/serving-cert-1581887030/serving-signer.key\\\\nI1122 02:52:50.747646 1 observer_polling.go:159] Starting file observer\\\\nW1122 02:52:50.754705 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 02:52:50.754978 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:52:50.756448 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1581887030/tls.crt::/tmp/serving-cert-1581887030/tls.key\\\\\\\"\\\\nF1122 02:52:51.071644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://892025699a91526b7be50d611c0c47a78d080b10f5001683ba5b6614e1168759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.985088 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.985125 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.985137 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.985154 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.985166 4922 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:23Z","lastTransitionTime":"2025-11-22T02:53:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:23 crc kubenswrapper[4922]: I1122 02:53:23.998714 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c46a871bcc893c29dd8f87750ec890ac870f3f59615c5951f473288a19b460c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:23Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.014879 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ntq2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"087141fd-fc9b-4685-bada-20260ee96369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb4eafebbd378af8579bc424a11b51eeda38398db994f81c408b9bead685713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4fck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ntq2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:24Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.036079 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n664h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"826eb92e-c839-4fba-9737-3b52101fec88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccfd5290a47556c0ac2088ad8d34ab89870aec1d03311431a50dd857a43e5dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0125b85a532a9445c19a61f7bdd47ccd48ee48f067df5c041c6bf98daae9630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0125b85a532a9445c19a61f7bdd47ccd48ee48f067df5c041c6bf98daae9630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n664h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:24Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.054738 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v8z92" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da6031ac-2dd7-43c9-b9e2-4c6394e23c1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9c2863c3dce12e19530e9ec5d083788e8426a5f34a325c4a8b5dcbb29b454ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7cbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a6ba782e23e8c6a485ec4c29d9adefe33028022db04c1a66d20e49a73c40a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7cbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v8z92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:24Z is after 2025-08-24T17:21:41Z" Nov 22 
02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.073684 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a581af85a69caa65abe5f7766482dd0b71eaf201d43e354b8316ff616c865193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2d349f9466eb418a64dabae95c139bf23e943909a37b6b651c90e85ff55bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:24Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.088210 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.088274 4922 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.088285 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.088302 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.088313 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:24Z","lastTransitionTime":"2025-11-22T02:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.192347 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.192429 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.192461 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.192492 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.192515 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:24Z","lastTransitionTime":"2025-11-22T02:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.296131 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.296210 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.296229 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.296260 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.296296 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:24Z","lastTransitionTime":"2025-11-22T02:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.299583 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 22 02:53:24 crc kubenswrapper[4922]: E1122 02:53:24.299793 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.400583 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.400651 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.400671 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.400695 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.400720 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:24Z","lastTransitionTime":"2025-11-22T02:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.503272 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.503351 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.503373 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.503407 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.503435 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:24Z","lastTransitionTime":"2025-11-22T02:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:53:24 crc kubenswrapper[4922]: E1122 02:53:24.527620 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e949e6da-04e3-4086-acd2-53671701d8c1\\\",\\\"systemUUID\\\":\\\"7e87d562-50ca-4b7a-8b5f-35220f4abd2d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:24Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.534566 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.534637 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure"
Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.534663 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.534695 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.534718 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:24Z","lastTransitionTime":"2025-11-22T02:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:53:24 crc kubenswrapper[4922]: E1122 02:53:24.557020 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e949e6da-04e3-4086-acd2-53671701d8c1\\\",\\\"systemUUID\\\":\\\"7e87d562-50ca-4b7a-8b5f-35220f4abd2d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:24Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.562799 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.562893 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.562912 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.562936 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.562954 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:24Z","lastTransitionTime":"2025-11-22T02:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:24 crc kubenswrapper[4922]: E1122 02:53:24.582565 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e949e6da-04e3-4086-acd2-53671701d8c1\\\",\\\"systemUUID\\\":\\\"7e87d562-50ca-4b7a-8b5f-35220f4abd2d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:24Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.588716 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.588786 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.588799 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.588823 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.588836 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:24Z","lastTransitionTime":"2025-11-22T02:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:24 crc kubenswrapper[4922]: E1122 02:53:24.614063 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e949e6da-04e3-4086-acd2-53671701d8c1\\\",\\\"systemUUID\\\":\\\"7e87d562-50ca-4b7a-8b5f-35220f4abd2d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:24Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.619654 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.619745 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.619772 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.619807 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.619831 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:24Z","lastTransitionTime":"2025-11-22T02:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:24 crc kubenswrapper[4922]: E1122 02:53:24.641086 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e949e6da-04e3-4086-acd2-53671701d8c1\\\",\\\"systemUUID\\\":\\\"7e87d562-50ca-4b7a-8b5f-35220f4abd2d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:24Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:24 crc kubenswrapper[4922]: E1122 02:53:24.641230 4922 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.643362 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.643411 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.643426 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.643462 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.643477 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:24Z","lastTransitionTime":"2025-11-22T02:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.729937 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d5c8000a-a783-474f-a73a-55814c257a02-metrics-certs\") pod \"network-metrics-daemon-2gmkj\" (UID: \"d5c8000a-a783-474f-a73a-55814c257a02\") " pod="openshift-multus/network-metrics-daemon-2gmkj" Nov 22 02:53:24 crc kubenswrapper[4922]: E1122 02:53:24.730250 4922 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 02:53:24 crc kubenswrapper[4922]: E1122 02:53:24.730400 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5c8000a-a783-474f-a73a-55814c257a02-metrics-certs podName:d5c8000a-a783-474f-a73a-55814c257a02 nodeName:}" failed. No retries permitted until 2025-11-22 02:53:26.730360132 +0000 UTC m=+42.768882074 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d5c8000a-a783-474f-a73a-55814c257a02-metrics-certs") pod "network-metrics-daemon-2gmkj" (UID: "d5c8000a-a783-474f-a73a-55814c257a02") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.746411 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.746480 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.746502 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.746528 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.746548 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:24Z","lastTransitionTime":"2025-11-22T02:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.850133 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.850183 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.850199 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.850218 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.850231 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:24Z","lastTransitionTime":"2025-11-22T02:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.953524 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.953585 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.953598 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.953619 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:24 crc kubenswrapper[4922]: I1122 02:53:24.953634 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:24Z","lastTransitionTime":"2025-11-22T02:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.056780 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.056832 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.056879 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.056901 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.056916 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:25Z","lastTransitionTime":"2025-11-22T02:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.161260 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.161383 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.161407 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.161434 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.161456 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:25Z","lastTransitionTime":"2025-11-22T02:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.264939 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.265010 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.265036 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.265072 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.265094 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:25Z","lastTransitionTime":"2025-11-22T02:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.300411 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.300533 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:53:25 crc kubenswrapper[4922]: E1122 02:53:25.300635 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.300771 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2gmkj" Nov 22 02:53:25 crc kubenswrapper[4922]: E1122 02:53:25.300987 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:53:25 crc kubenswrapper[4922]: E1122 02:53:25.301185 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2gmkj" podUID="d5c8000a-a783-474f-a73a-55814c257a02" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.318684 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a581af85a69caa65abe5f7766482dd0b71eaf201d43e354b8316ff616c865193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2d349f9466eb418a64dabae95c139bf23e943909a37b6b651c90e85ff55bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert
\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:25Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.340316 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e5537a6-a41f-446d-ab40-ec8c7a3edc90\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaf643c85775379271a26c718e36d87319342e6797522f91726ce4084925f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205de8f1aeca28c7877b83da70915cd216631f5d7c28b0c9f3bb748a8f98547e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7d163b34cfb34723c07cfa4592a134291
185548d4294cb6fcd1d5d8ab63617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9608708be558bf83d1156f620534bd4c98b8f4fb7f96c972be8dc67524f4546c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:25Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.356837 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"402683b1-a29f-4a79-a36c-daf6e8068d0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cf0de2e7bcabf2c1beae2a37caaddb2de2365480082bd95a7843816ec8e3ceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d254b4b131f36e4aae4019c29e8c53f3f8df991b85eb6f943fb6d2e9d79552f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9j6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:25Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.368364 4922 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.368408 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.368418 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.368435 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.368448 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:25Z","lastTransitionTime":"2025-11-22T02:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.377834 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e0fc46c7e9bbed62a72d78126ea7ffcb4de50dc048c8c0fd90bd0b0d8c718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd701fe5abb8c240fb633df69fbda2e7fb1ac4c76195dae8dfd5b1db811c1d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://091e9164e566847ab0605abc3b8f32027bcccff041c48fbf251d28690dcc132d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5603a0776e71297b57ae6a438254d6d8abb317252617fbe3b28707e9f96424d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7854642894dce2837e59481c400522a24a2e66c3f9eb521a370e85b203288af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ceb46477edc2737cbdc850d6387ba175a07e85e69c988f91fc93def44fa1bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ae17a6779a8d3de7ab3da7eb5399094b7fde37
4d0b866b515c9768de61ed89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ae17a6779a8d3de7ab3da7eb5399094b7fde374d0b866b515c9768de61ed89\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"message\\\":\\\"where column _uuid == {6ea1fd71-2b40-4361-92ee-3f1ab4ec7414}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1122 02:53:22.196651 6374 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_TCP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53: 10.217.4.10:9154:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {be9dcc9e-c16a-4962-a6d2-4adeb0b929c4}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_UDP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[udp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4c1be812-05d3-4f45-91b5-a853a5c8de71}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-c7wvg_openshift-ovn-kubernetes(c2a6bcd8-bb13-463b-b112-0df3cf90b5f7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://183f71d77b7a1768eee27187f0c1e4a893f5661fa42cf671332e081118ae7dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c7wvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:25Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.400356 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4gbc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"954bb7b8-d710-4e1a-973e-78c04e685f30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8705bc3652c9a3f838ffdc44bb26f87689d9eca09105427a3dc25b730967ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-
cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52z2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4gbc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:25Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.412465 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-df5ll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5672d56-8abd-4aa4-ac8d-1655896397f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a64d211a2e298509a2677a0d64bf8205f1a62e952ee786f34571d4c1e962b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klcgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-df5ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:25Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.425524 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2gmkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c8000a-a783-474f-a73a-55814c257a02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pk97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pk97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2gmkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:25Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.444177 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:25Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.467930 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:25Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.470699 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.470742 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.470754 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.470775 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.470791 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:25Z","lastTransitionTime":"2025-11-22T02:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.485296 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:25Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.496837 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8f0f91018d8ffcc35550e1e567a256d60f9cef7fb07a7c3c21d393e6ff5653f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:25Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.508353 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v8z92" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da6031ac-2dd7-43c9-b9e2-4c6394e23c1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9c2863c3dce12e19530e9ec5d083788e8426a5f34a325c4a8b5dcbb29b454ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7cbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a6ba782e23e8c6a485ec4c29d9adefe33028022db04c1a66d20e49a73c40a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7cbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v8z92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:25Z is after 2025-08-24T17:21:41Z" Nov 22 
02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.541229 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8552c75b-ff86-4366-ba7d-31f454e4c4d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095da4ba1e99f9564ae3e7c95e0319b9b03a655b824f73a40f378292bd16a613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c36d97db0f554e895c9a66ab9d98a846dab9ea7fdc6fb1694a1e6ba6e8e03caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5806c02c9ab10a6ae71658c5c26a959a0c1af4dc116eebb10456e842e18bc265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"lo
g-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e578ec32c48afdd4b1beffcba79284416d5d388027f807e195d552777eb3ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7d8908fd03ef5111feb365e6882679f026d46d0b404e585b0fb248b3ffccc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"reas
on\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:25Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.556026 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"912136b8-9119-4585-8a8a-fffffe458b82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1364d5e8967ee57d3fe7c3ca85053dbeee65956324a6ff45f53339ee6a380e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1b62ea65c78bb4f19f5fdb2ebed6da4b5554981906058d4d33472e53eecebe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa36ff8bd81008d118d746d10130faac5df3ffdd243309f12b2444acf8e0ac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fe422173d9c9619e9ccbf14fa6b45cd384fb3b32973aa7c32b7597d23818ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1ba37f0d76415fadbf5fe5c3e7b0d0ac1e04923deb445e3ffebaa45f9a3284d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:52:51Z\\\",\\\"message\\\":\\\"W1122 02:52:50.548479 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 02:52:50.548907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763779970 cert, and key in /tmp/serving-cert-1581887030/serving-signer.crt, /tmp/serving-cert-1581887030/serving-signer.key\\\\nI1122 02:52:50.747646 1 observer_polling.go:159] Starting file observer\\\\nW1122 02:52:50.754705 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 02:52:50.754978 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:52:50.756448 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1581887030/tls.crt::/tmp/serving-cert-1581887030/tls.key\\\\\\\"\\\\nF1122 02:52:51.071644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://892025699a91526b7be50d611c0c47a78d080b10f5001683ba5b6614e1168759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:25Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.573902 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.573962 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.573979 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.574005 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.574024 4922 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:25Z","lastTransitionTime":"2025-11-22T02:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.575001 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c46a871bcc893c29dd8f87750ec890ac870f3f59615c5951f473288a19b460c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:25Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.588575 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ntq2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"087141fd-fc9b-4685-bada-20260ee96369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb4eafebbd378af8579bc424a11b51eeda38398db994f81c408b9bead685713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4fck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ntq2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:25Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.607744 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n664h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"826eb92e-c839-4fba-9737-3b52101fec88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccfd5290a47556c0ac2088ad8d34ab89870aec1d03311431a50dd857a43e5dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0125b85a532a9445c19a61f7bdd47ccd48ee48f067df5c041c6bf98daae9630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0125b85a532a9445c19a61f7bdd47ccd48ee48f067df5c041c6bf98daae9630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n664h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:25Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.678168 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.678243 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:25 crc 
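The patch payloads in those entries look heavily escaped because the kubelet quotes the whole JSON document as a string when logging it; undoing one level of quoting yields ordinary JSON. A minimal sketch using a stand-in payload (the real patches above are far larger; the UID is the one logged for kube-apiserver-crc, and "phase":"Running" likewise comes from the log):

package main

import (
	"encoding/json"
	"fmt"
	"log"
	"strconv"
)

func main() {
	// Tiny stand-in for the quoted patch payloads in the log above;
	// a raw string literal keeps the \" escapes intact.
	quoted := `"{\"metadata\":{\"uid\":\"912136b8-9119-4585-8a8a-fffffe458b82\"},\"status\":{\"phase\":\"Running\"}}"`

	// First undo the string quoting the logger applied...
	unquoted, err := strconv.Unquote(quoted)
	if err != nil {
		log.Fatalf("unquote: %v", err)
	}

	// ...then parse the result as ordinary JSON.
	var patch struct {
		Metadata struct {
			UID string `json:"uid"`
		} `json:"metadata"`
		Status struct {
			Phase string `json:"phase"`
		} `json:"status"`
	}
	if err := json.Unmarshal([]byte(unquoted), &patch); err != nil {
		log.Fatalf("unmarshal: %v", err)
	}
	fmt.Printf("pod uid=%s phase=%s\n", patch.Metadata.UID, patch.Status.Phase)
}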
kubenswrapper[4922]: I1122 02:53:25.678267 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.678303 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.678329 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:25Z","lastTransitionTime":"2025-11-22T02:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.782374 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.783064 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.783105 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.783134 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.783154 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:25Z","lastTransitionTime":"2025-11-22T02:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.886777 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.886888 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.886918 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.886951 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.886974 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:25Z","lastTransitionTime":"2025-11-22T02:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.990721 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.990784 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.990807 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.990835 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:25 crc kubenswrapper[4922]: I1122 02:53:25.990894 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:25Z","lastTransitionTime":"2025-11-22T02:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:26 crc kubenswrapper[4922]: I1122 02:53:26.094074 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:26 crc kubenswrapper[4922]: I1122 02:53:26.094143 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:26 crc kubenswrapper[4922]: I1122 02:53:26.094156 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:26 crc kubenswrapper[4922]: I1122 02:53:26.094177 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:26 crc kubenswrapper[4922]: I1122 02:53:26.094192 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:26Z","lastTransitionTime":"2025-11-22T02:53:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:26 crc kubenswrapper[4922]: I1122 02:53:26.198315 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:26 crc kubenswrapper[4922]: I1122 02:53:26.198746 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:26 crc kubenswrapper[4922]: I1122 02:53:26.198878 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:26 crc kubenswrapper[4922]: I1122 02:53:26.199021 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:26 crc kubenswrapper[4922]: I1122 02:53:26.199108 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:26Z","lastTransitionTime":"2025-11-22T02:53:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
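The Ready=False condition repeating above comes from the container runtime's network-readiness probe: it stays NetworkReady=false until a CNI configuration file shows up in /etc/kubernetes/cni/net.d/, which on OpenShift is normally written by the network operator once the network plugin pods come up. A rough sketch of that directory check (libcni accepts .conf, .conflist and .json files; the path is the one from the log):

package main

import (
	"fmt"
	"log"
	"os"
	"path/filepath"
)

func main() {
	// Network configuration directory reported in the kubelet messages above.
	dir := "/etc/kubernetes/cni/net.d"
	entries, err := os.ReadDir(dir)
	if err != nil {
		log.Fatalf("read %s: %v", dir, err)
	}

	// Look for any file extension the CNI config loader recognizes.
	found := false
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			fmt.Println("found CNI config:", e.Name())
			found = true
		}
	}
	if !found {
		fmt.Println("no CNI configuration file; the node stays NotReady until the network provider writes one")
	}
}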
Has your network provider started?"} Nov 22 02:53:26 crc kubenswrapper[4922]: I1122 02:53:26.300465 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:53:26 crc kubenswrapper[4922]: E1122 02:53:26.300749 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:53:26 crc kubenswrapper[4922]: I1122 02:53:26.302436 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:26 crc kubenswrapper[4922]: I1122 02:53:26.302513 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:26 crc kubenswrapper[4922]: I1122 02:53:26.302541 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:26 crc kubenswrapper[4922]: I1122 02:53:26.302576 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:26 crc kubenswrapper[4922]: I1122 02:53:26.302593 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:26Z","lastTransitionTime":"2025-11-22T02:53:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:26 crc kubenswrapper[4922]: I1122 02:53:26.406583 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:26 crc kubenswrapper[4922]: I1122 02:53:26.406684 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:26 crc kubenswrapper[4922]: I1122 02:53:26.406707 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:26 crc kubenswrapper[4922]: I1122 02:53:26.406738 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:26 crc kubenswrapper[4922]: I1122 02:53:26.406768 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:26Z","lastTransitionTime":"2025-11-22T02:53:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:26 crc kubenswrapper[4922]: I1122 02:53:26.510548 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:26 crc kubenswrapper[4922]: I1122 02:53:26.510635 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:26 crc kubenswrapper[4922]: I1122 02:53:26.510659 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:26 crc kubenswrapper[4922]: I1122 02:53:26.510697 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:26 crc kubenswrapper[4922]: I1122 02:53:26.510724 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:26Z","lastTransitionTime":"2025-11-22T02:53:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:26 crc kubenswrapper[4922]: I1122 02:53:26.614044 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:26 crc kubenswrapper[4922]: I1122 02:53:26.614122 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:26 crc kubenswrapper[4922]: I1122 02:53:26.614141 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:26 crc kubenswrapper[4922]: I1122 02:53:26.614169 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:26 crc kubenswrapper[4922]: I1122 02:53:26.614187 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:26Z","lastTransitionTime":"2025-11-22T02:53:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:26 crc kubenswrapper[4922]: I1122 02:53:26.717839 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:26 crc kubenswrapper[4922]: I1122 02:53:26.717951 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:26 crc kubenswrapper[4922]: I1122 02:53:26.717978 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:26 crc kubenswrapper[4922]: I1122 02:53:26.718015 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:26 crc kubenswrapper[4922]: I1122 02:53:26.718041 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:26Z","lastTransitionTime":"2025-11-22T02:53:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:26 crc kubenswrapper[4922]: I1122 02:53:26.755486 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d5c8000a-a783-474f-a73a-55814c257a02-metrics-certs\") pod \"network-metrics-daemon-2gmkj\" (UID: \"d5c8000a-a783-474f-a73a-55814c257a02\") " pod="openshift-multus/network-metrics-daemon-2gmkj" Nov 22 02:53:26 crc kubenswrapper[4922]: E1122 02:53:26.755791 4922 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 02:53:26 crc kubenswrapper[4922]: E1122 02:53:26.756030 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5c8000a-a783-474f-a73a-55814c257a02-metrics-certs podName:d5c8000a-a783-474f-a73a-55814c257a02 nodeName:}" failed. No retries permitted until 2025-11-22 02:53:30.755979928 +0000 UTC m=+46.794501980 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d5c8000a-a783-474f-a73a-55814c257a02-metrics-certs") pod "network-metrics-daemon-2gmkj" (UID: "d5c8000a-a783-474f-a73a-55814c257a02") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 02:53:26 crc kubenswrapper[4922]: I1122 02:53:26.821937 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:26 crc kubenswrapper[4922]: I1122 02:53:26.822008 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:26 crc kubenswrapper[4922]: I1122 02:53:26.822028 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:26 crc kubenswrapper[4922]: I1122 02:53:26.822057 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:26 crc kubenswrapper[4922]: I1122 02:53:26.822075 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:26Z","lastTransitionTime":"2025-11-22T02:53:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
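The metrics-certs mount failure above is requeued with "durationBeforeRetry 4s": the kubelet backs off exponentially on failed volume operations, roughly doubling the wait after each consecutive failure up to a cap. The initial duration and cap in this sketch are assumptions for illustration, not the kubelet's exact constants:

package main

import (
	"fmt"
	"time"
)

func main() {
	// Assumed starting point and ceiling; only the doubling behaviour
	// is the point, matching the "No retries permitted until ..." entries.
	backoff := time.Second
	maxBackoff := 2 * time.Minute

	for attempt := 1; attempt <= 5; attempt++ {
		fmt.Printf("attempt %d failed; no retries permitted for %s\n", attempt, backoff)
		backoff *= 2
		if backoff > maxBackoff {
			backoff = maxBackoff
		}
	}
}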
Has your network provider started?"} Nov 22 02:53:26 crc kubenswrapper[4922]: I1122 02:53:26.925462 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:26 crc kubenswrapper[4922]: I1122 02:53:26.925547 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:26 crc kubenswrapper[4922]: I1122 02:53:26.925566 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:26 crc kubenswrapper[4922]: I1122 02:53:26.925598 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:26 crc kubenswrapper[4922]: I1122 02:53:26.925619 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:26Z","lastTransitionTime":"2025-11-22T02:53:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:27 crc kubenswrapper[4922]: I1122 02:53:27.028892 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:27 crc kubenswrapper[4922]: I1122 02:53:27.028965 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:27 crc kubenswrapper[4922]: I1122 02:53:27.028984 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:27 crc kubenswrapper[4922]: I1122 02:53:27.029018 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:27 crc kubenswrapper[4922]: I1122 02:53:27.029040 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:27Z","lastTransitionTime":"2025-11-22T02:53:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:27 crc kubenswrapper[4922]: I1122 02:53:27.132980 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:27 crc kubenswrapper[4922]: I1122 02:53:27.133059 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:27 crc kubenswrapper[4922]: I1122 02:53:27.133078 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:27 crc kubenswrapper[4922]: I1122 02:53:27.133145 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:27 crc kubenswrapper[4922]: I1122 02:53:27.133172 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:27Z","lastTransitionTime":"2025-11-22T02:53:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:27 crc kubenswrapper[4922]: I1122 02:53:27.236720 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:27 crc kubenswrapper[4922]: I1122 02:53:27.236787 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:27 crc kubenswrapper[4922]: I1122 02:53:27.236805 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:27 crc kubenswrapper[4922]: I1122 02:53:27.236835 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:27 crc kubenswrapper[4922]: I1122 02:53:27.236900 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:27Z","lastTransitionTime":"2025-11-22T02:53:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:27 crc kubenswrapper[4922]: I1122 02:53:27.300055 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2gmkj" Nov 22 02:53:27 crc kubenswrapper[4922]: I1122 02:53:27.300055 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:53:27 crc kubenswrapper[4922]: I1122 02:53:27.300075 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:53:27 crc kubenswrapper[4922]: E1122 02:53:27.300338 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2gmkj" podUID="d5c8000a-a783-474f-a73a-55814c257a02" Nov 22 02:53:27 crc kubenswrapper[4922]: E1122 02:53:27.300518 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:53:27 crc kubenswrapper[4922]: E1122 02:53:27.300705 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:53:27 crc kubenswrapper[4922]: I1122 02:53:27.340180 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:27 crc kubenswrapper[4922]: I1122 02:53:27.340259 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:27 crc kubenswrapper[4922]: I1122 02:53:27.340278 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:27 crc kubenswrapper[4922]: I1122 02:53:27.340309 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:27 crc kubenswrapper[4922]: I1122 02:53:27.340331 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:27Z","lastTransitionTime":"2025-11-22T02:53:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:27 crc kubenswrapper[4922]: I1122 02:53:27.443376 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:27 crc kubenswrapper[4922]: I1122 02:53:27.443455 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:27 crc kubenswrapper[4922]: I1122 02:53:27.443479 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:27 crc kubenswrapper[4922]: I1122 02:53:27.443510 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:27 crc kubenswrapper[4922]: I1122 02:53:27.443535 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:27Z","lastTransitionTime":"2025-11-22T02:53:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:27 crc kubenswrapper[4922]: I1122 02:53:27.547670 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:27 crc kubenswrapper[4922]: I1122 02:53:27.547733 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:27 crc kubenswrapper[4922]: I1122 02:53:27.547748 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:27 crc kubenswrapper[4922]: I1122 02:53:27.547774 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:27 crc kubenswrapper[4922]: I1122 02:53:27.547789 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:27Z","lastTransitionTime":"2025-11-22T02:53:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:27 crc kubenswrapper[4922]: I1122 02:53:27.652037 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:27 crc kubenswrapper[4922]: I1122 02:53:27.652175 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:27 crc kubenswrapper[4922]: I1122 02:53:27.652196 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:27 crc kubenswrapper[4922]: I1122 02:53:27.652235 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:27 crc kubenswrapper[4922]: I1122 02:53:27.652262 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:27Z","lastTransitionTime":"2025-11-22T02:53:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:27 crc kubenswrapper[4922]: I1122 02:53:27.755822 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:27 crc kubenswrapper[4922]: I1122 02:53:27.756164 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:27 crc kubenswrapper[4922]: I1122 02:53:27.756186 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:27 crc kubenswrapper[4922]: I1122 02:53:27.756218 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:27 crc kubenswrapper[4922]: I1122 02:53:27.756243 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:27Z","lastTransitionTime":"2025-11-22T02:53:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:27 crc kubenswrapper[4922]: I1122 02:53:27.860244 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:27 crc kubenswrapper[4922]: I1122 02:53:27.860365 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:27 crc kubenswrapper[4922]: I1122 02:53:27.860392 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:27 crc kubenswrapper[4922]: I1122 02:53:27.860419 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:27 crc kubenswrapper[4922]: I1122 02:53:27.860437 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:27Z","lastTransitionTime":"2025-11-22T02:53:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:27 crc kubenswrapper[4922]: I1122 02:53:27.963458 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:27 crc kubenswrapper[4922]: I1122 02:53:27.963531 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:27 crc kubenswrapper[4922]: I1122 02:53:27.963551 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:27 crc kubenswrapper[4922]: I1122 02:53:27.963576 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:27 crc kubenswrapper[4922]: I1122 02:53:27.963595 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:27Z","lastTransitionTime":"2025-11-22T02:53:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:28 crc kubenswrapper[4922]: I1122 02:53:28.067706 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:28 crc kubenswrapper[4922]: I1122 02:53:28.067822 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:28 crc kubenswrapper[4922]: I1122 02:53:28.067875 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:28 crc kubenswrapper[4922]: I1122 02:53:28.067915 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:28 crc kubenswrapper[4922]: I1122 02:53:28.067977 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:28Z","lastTransitionTime":"2025-11-22T02:53:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
[... status cycle repeats at 02:53:28.170 and 02:53:28.272 ...]
Nov 22 02:53:28 crc kubenswrapper[4922]: I1122 02:53:28.300190 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 22 02:53:28 crc kubenswrapper[4922]: E1122 02:53:28.300322 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
[... status cycle repeats at 02:53:28.375 and 02:53:28.478 ...]
[... status cycle repeats at 02:53:28.587, 02:53:28.689, 02:53:28.794, 02:53:28.898, 02:53:29.001 and 02:53:29.104 ...]
[... status cycle repeats at 02:53:29.208 ...]
Nov 22 02:53:29 crc kubenswrapper[4922]: I1122 02:53:29.300717 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 22 02:53:29 crc kubenswrapper[4922]: I1122 02:53:29.300736 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2gmkj"
Nov 22 02:53:29 crc kubenswrapper[4922]: E1122 02:53:29.300987 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 22 02:53:29 crc kubenswrapper[4922]: I1122 02:53:29.300734 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 22 02:53:29 crc kubenswrapper[4922]: E1122 02:53:29.301079 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2gmkj" podUID="d5c8000a-a783-474f-a73a-55814c257a02"
Nov 22 02:53:29 crc kubenswrapper[4922]: E1122 02:53:29.301172 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
[... status cycle repeats at 02:53:29.310 and 02:53:29.414 ...]
[... status cycle repeats at 02:53:29.517, 02:53:29.621, 02:53:29.725, 02:53:29.828, 02:53:29.932 and 02:53:30.036 ...]
[... status cycle repeats at 02:53:30.139 and 02:53:30.243 ...]
Nov 22 02:53:30 crc kubenswrapper[4922]: I1122 02:53:30.299572 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 22 02:53:30 crc kubenswrapper[4922]: E1122 02:53:30.299801 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
[... status cycle repeats at 02:53:30.347 and 02:53:30.450 ...]
[... status cycle repeats at 02:53:30.553, 02:53:30.656 and 02:53:30.759 ...]
Nov 22 02:53:30 crc kubenswrapper[4922]: I1122 02:53:30.804576 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d5c8000a-a783-474f-a73a-55814c257a02-metrics-certs\") pod \"network-metrics-daemon-2gmkj\" (UID: \"d5c8000a-a783-474f-a73a-55814c257a02\") " pod="openshift-multus/network-metrics-daemon-2gmkj"
Nov 22 02:53:30 crc kubenswrapper[4922]: E1122 02:53:30.804718 4922 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Nov 22 02:53:30 crc kubenswrapper[4922]: E1122 02:53:30.804767 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5c8000a-a783-474f-a73a-55814c257a02-metrics-certs podName:d5c8000a-a783-474f-a73a-55814c257a02 nodeName:}" failed. No retries permitted until 2025-11-22 02:53:38.804753473 +0000 UTC m=+54.843275365 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d5c8000a-a783-474f-a73a-55814c257a02-metrics-certs") pod "network-metrics-daemon-2gmkj" (UID: "d5c8000a-a783-474f-a73a-55814c257a02") : object "openshift-multus"/"metrics-daemon-secret" not registered
[... status cycle repeats at 02:53:30.864 ...]
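Two distinct failures surface here: the metrics-certs secret cannot be fetched because the object is not yet registered in kubelet's watch-based object cache, and the resulting MountVolume.SetUp failure is rescheduled with exponential backoff, already at durationBeforeRetry 8s (the retry is pushed out to 02:53:38.804, eight seconds after the failure). A plain-Go sketch of such a doubling schedule; the 500 ms starting point and the two-minute cap are assumptions, since only the 8 s step is visible in the log:

package main

import (
	"fmt"
	"time"
)

func main() {
	// Assumed schedule: start at 500 ms, double per failed attempt, cap at 2 min.
	// Only the 8 s step (durationBeforeRetry 8s) appears in the log above.
	delay := 500 * time.Millisecond
	const maxDelay = 2 * time.Minute
	for attempt := 1; attempt <= 8; attempt++ {
		fmt.Printf("attempt %d: retry in %s\n", attempt, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}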
[... status cycle repeats at 02:53:30.967, 02:53:31.071, 02:53:31.174 and 02:53:31.277 ...]
Nov 22 02:53:31 crc kubenswrapper[4922]: I1122 02:53:31.299733 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2gmkj"
Nov 22 02:53:31 crc kubenswrapper[4922]: I1122 02:53:31.299832 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 22 02:53:31 crc kubenswrapper[4922]: I1122 02:53:31.299832 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 22 02:53:31 crc kubenswrapper[4922]: E1122 02:53:31.299971 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2gmkj" podUID="d5c8000a-a783-474f-a73a-55814c257a02"
Nov 22 02:53:31 crc kubenswrapper[4922]: E1122 02:53:31.300319 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 22 02:53:31 crc kubenswrapper[4922]: E1122 02:53:31.300521 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:53:31 crc kubenswrapper[4922]: I1122 02:53:31.380180 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:31 crc kubenswrapper[4922]: I1122 02:53:31.380240 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:31 crc kubenswrapper[4922]: I1122 02:53:31.380254 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:31 crc kubenswrapper[4922]: I1122 02:53:31.380282 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:31 crc kubenswrapper[4922]: I1122 02:53:31.380305 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:31Z","lastTransitionTime":"2025-11-22T02:53:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:31 crc kubenswrapper[4922]: I1122 02:53:31.483622 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:31 crc kubenswrapper[4922]: I1122 02:53:31.483702 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:31 crc kubenswrapper[4922]: I1122 02:53:31.483721 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:31 crc kubenswrapper[4922]: I1122 02:53:31.483754 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:31 crc kubenswrapper[4922]: I1122 02:53:31.483773 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:31Z","lastTransitionTime":"2025-11-22T02:53:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:31 crc kubenswrapper[4922]: I1122 02:53:31.587091 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:31 crc kubenswrapper[4922]: I1122 02:53:31.587173 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:31 crc kubenswrapper[4922]: I1122 02:53:31.587194 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:31 crc kubenswrapper[4922]: I1122 02:53:31.587218 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:31 crc kubenswrapper[4922]: I1122 02:53:31.587237 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:31Z","lastTransitionTime":"2025-11-22T02:53:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:31 crc kubenswrapper[4922]: I1122 02:53:31.690058 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:31 crc kubenswrapper[4922]: I1122 02:53:31.690099 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:31 crc kubenswrapper[4922]: I1122 02:53:31.690113 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:31 crc kubenswrapper[4922]: I1122 02:53:31.690133 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:31 crc kubenswrapper[4922]: I1122 02:53:31.690147 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:31Z","lastTransitionTime":"2025-11-22T02:53:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:31 crc kubenswrapper[4922]: I1122 02:53:31.793001 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:31 crc kubenswrapper[4922]: I1122 02:53:31.793066 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:31 crc kubenswrapper[4922]: I1122 02:53:31.793077 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:31 crc kubenswrapper[4922]: I1122 02:53:31.793095 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:31 crc kubenswrapper[4922]: I1122 02:53:31.793109 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:31Z","lastTransitionTime":"2025-11-22T02:53:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:31 crc kubenswrapper[4922]: I1122 02:53:31.896084 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:31 crc kubenswrapper[4922]: I1122 02:53:31.896154 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:31 crc kubenswrapper[4922]: I1122 02:53:31.896175 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:31 crc kubenswrapper[4922]: I1122 02:53:31.896204 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:31 crc kubenswrapper[4922]: I1122 02:53:31.896225 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:31Z","lastTransitionTime":"2025-11-22T02:53:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:31 crc kubenswrapper[4922]: I1122 02:53:31.999518 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:31 crc kubenswrapper[4922]: I1122 02:53:31.999565 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:31 crc kubenswrapper[4922]: I1122 02:53:31.999573 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:31 crc kubenswrapper[4922]: I1122 02:53:31.999588 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:31 crc kubenswrapper[4922]: I1122 02:53:31.999599 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:31Z","lastTransitionTime":"2025-11-22T02:53:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:32 crc kubenswrapper[4922]: I1122 02:53:32.103307 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:32 crc kubenswrapper[4922]: I1122 02:53:32.103371 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:32 crc kubenswrapper[4922]: I1122 02:53:32.103379 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:32 crc kubenswrapper[4922]: I1122 02:53:32.103395 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:32 crc kubenswrapper[4922]: I1122 02:53:32.103410 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:32Z","lastTransitionTime":"2025-11-22T02:53:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
[... status cycle repeats at 02:53:32.207 ...]
Nov 22 02:53:32 crc kubenswrapper[4922]: I1122 02:53:32.299552 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 22 02:53:32 crc kubenswrapper[4922]: E1122 02:53:32.299807 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
[... status cycle repeats at 02:53:32.310, 02:53:32.414, 02:53:32.518, 02:53:32.634, 02:53:32.737, 02:53:32.841, 02:53:32.944, 02:53:33.047 and 02:53:33.149 ...]
Nov 22 02:53:33 crc kubenswrapper[4922]: I1122 02:53:33.252942 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:53:33 crc kubenswrapper[4922]: I1122 02:53:33.253008 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:53:33 crc kubenswrapper[4922]: I1122 02:53:33.253024 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:53:33 crc kubenswrapper[4922]: I1122 02:53:33.253045 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:53:33 crc kubenswrapper[4922]: I1122 02:53:33.253061 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:33Z","lastTransitionTime":"2025-11-22T02:53:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Nov 22 02:53:33 crc kubenswrapper[4922]: I1122 02:53:33.300562 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:53:33 crc kubenswrapper[4922]: I1122 02:53:33.300619 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:53:33 crc kubenswrapper[4922]: E1122 02:53:33.300804 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:53:33 crc kubenswrapper[4922]: I1122 02:53:33.300925 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2gmkj" Nov 22 02:53:33 crc kubenswrapper[4922]: E1122 02:53:33.301023 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:53:33 crc kubenswrapper[4922]: E1122 02:53:33.301149 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2gmkj" podUID="d5c8000a-a783-474f-a73a-55814c257a02" Nov 22 02:53:33 crc kubenswrapper[4922]: I1122 02:53:33.355800 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:33 crc kubenswrapper[4922]: I1122 02:53:33.355874 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:33 crc kubenswrapper[4922]: I1122 02:53:33.355888 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:33 crc kubenswrapper[4922]: I1122 02:53:33.355913 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:33 crc kubenswrapper[4922]: I1122 02:53:33.355928 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:33Z","lastTransitionTime":"2025-11-22T02:53:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
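The "No sandbox for pod can be found" / "Error syncing pod, skipping" pairs above are the kubelet's pod workers declining to create new sandboxes while the runtime reports NetworkReady=false; only pods that need cluster networking are blocked, which is why hostNetwork static pods keep running while CNI is down. A minimal Go sketch of that gate, using simplified stand-in types (PodSpec, RuntimeStatus) rather than the real kubelet structures:

```go
// Sketch of the gate behind "network is not ready: container runtime
// network not ready": pods needing a new sandbox are skipped until the
// runtime reports NetworkReady=true; hostNetwork pods are exempt.
// PodSpec and RuntimeStatus are simplified stand-ins, not kubelet types.
package main

import (
	"errors"
	"fmt"
)

type PodSpec struct {
	Name        string
	HostNetwork bool
}

type RuntimeStatus struct {
	NetworkReady bool
	Message      string
}

func canStartSandbox(pod PodSpec, rs RuntimeStatus) error {
	if pod.HostNetwork || rs.NetworkReady {
		return nil
	}
	return errors.New("network is not ready: " + rs.Message)
}

func main() {
	rs := RuntimeStatus{
		NetworkReady: false,
		Message:      "no CNI configuration file in /etc/kubernetes/cni/net.d/",
	}
	for _, p := range []PodSpec{
		{Name: "openshift-multus/network-metrics-daemon-2gmkj"},
		{Name: "etcd-crc", HostNetwork: true},
	} {
		if err := canStartSandbox(p, rs); err != nil {
			fmt.Printf("Error syncing pod, skipping: %v pod=%q\n", err, p.Name)
		}
	}
}
```

In this sketch network-metrics-daemon-2gmkj is skipped exactly as in the log entries above, while the hostNetwork pod passes the gate.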
Has your network provider started?"} Nov 22 02:53:33 crc kubenswrapper[4922]: I1122 02:53:33.458888 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:33 crc kubenswrapper[4922]: I1122 02:53:33.458932 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:33 crc kubenswrapper[4922]: I1122 02:53:33.458942 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:33 crc kubenswrapper[4922]: I1122 02:53:33.458962 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:33 crc kubenswrapper[4922]: I1122 02:53:33.458973 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:33Z","lastTransitionTime":"2025-11-22T02:53:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:33 crc kubenswrapper[4922]: I1122 02:53:33.561555 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:33 crc kubenswrapper[4922]: I1122 02:53:33.561601 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:33 crc kubenswrapper[4922]: I1122 02:53:33.561612 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:33 crc kubenswrapper[4922]: I1122 02:53:33.561628 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:33 crc kubenswrapper[4922]: I1122 02:53:33.561640 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:33Z","lastTransitionTime":"2025-11-22T02:53:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:33 crc kubenswrapper[4922]: I1122 02:53:33.664356 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:33 crc kubenswrapper[4922]: I1122 02:53:33.664397 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:33 crc kubenswrapper[4922]: I1122 02:53:33.664405 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:33 crc kubenswrapper[4922]: I1122 02:53:33.664418 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:33 crc kubenswrapper[4922]: I1122 02:53:33.664428 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:33Z","lastTransitionTime":"2025-11-22T02:53:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:33 crc kubenswrapper[4922]: I1122 02:53:33.766341 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:33 crc kubenswrapper[4922]: I1122 02:53:33.766423 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:33 crc kubenswrapper[4922]: I1122 02:53:33.766447 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:33 crc kubenswrapper[4922]: I1122 02:53:33.766480 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:33 crc kubenswrapper[4922]: I1122 02:53:33.766515 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:33Z","lastTransitionTime":"2025-11-22T02:53:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:33 crc kubenswrapper[4922]: I1122 02:53:33.869895 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:33 crc kubenswrapper[4922]: I1122 02:53:33.869957 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:33 crc kubenswrapper[4922]: I1122 02:53:33.869973 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:33 crc kubenswrapper[4922]: I1122 02:53:33.869995 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:33 crc kubenswrapper[4922]: I1122 02:53:33.870013 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:33Z","lastTransitionTime":"2025-11-22T02:53:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:33 crc kubenswrapper[4922]: I1122 02:53:33.973467 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:33 crc kubenswrapper[4922]: I1122 02:53:33.973563 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:33 crc kubenswrapper[4922]: I1122 02:53:33.973579 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:33 crc kubenswrapper[4922]: I1122 02:53:33.973598 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:33 crc kubenswrapper[4922]: I1122 02:53:33.973613 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:33Z","lastTransitionTime":"2025-11-22T02:53:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.076633 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.076677 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.076688 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.076706 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.076718 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:34Z","lastTransitionTime":"2025-11-22T02:53:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.179531 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.179603 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.179618 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.179643 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.179659 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:34Z","lastTransitionTime":"2025-11-22T02:53:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.283758 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.283821 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.283833 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.283887 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.283903 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:34Z","lastTransitionTime":"2025-11-22T02:53:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.300275 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:53:34 crc kubenswrapper[4922]: E1122 02:53:34.300445 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.386004 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.386044 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.386052 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.386065 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.386072 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:34Z","lastTransitionTime":"2025-11-22T02:53:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.489363 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.489418 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.489433 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.489456 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.489471 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:34Z","lastTransitionTime":"2025-11-22T02:53:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.593153 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.593218 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.593237 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.593259 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.593277 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:34Z","lastTransitionTime":"2025-11-22T02:53:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.696361 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.696427 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.696450 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.696477 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.696497 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:34Z","lastTransitionTime":"2025-11-22T02:53:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.707567 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.707663 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.707690 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.707724 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.707749 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:34Z","lastTransitionTime":"2025-11-22T02:53:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
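Each roughly 100 ms cycle above re-records the four node conditions and ends with Ready=False (reason KubeletNotReady) because the container runtime still finds no CNI network configuration. A hedged sketch of that readiness check — the real one lives in the runtime's CNI manager (ocicni for CRI-O), which watches the conf directory for *.conf, *.conflist, or *.json files; this is an illustrative reimplementation, not the actual code path:

```go
// Hypothetical sketch of the check behind "no CNI configuration file in
// /etc/kubernetes/cni/net.d/": scan the conf dir for a recognized CNI
// config file and report NetworkReady=false until one appears.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func hasCNIConfig(dir string) bool {
	entries, err := os.ReadDir(dir)
	if err != nil {
		return false // a missing dir counts as "no configuration"
	}
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			return true
		}
	}
	return false
}

func main() {
	dir := "/etc/kubernetes/cni/net.d"
	if !hasCNIConfig(dir) {
		fmt.Printf("no CNI configuration file in %s. Has your network provider started?\n", dir)
	}
}
```

Until the network operator (OVN-Kubernetes on CRC) writes its config into that directory, this check keeps failing and the kubelet keeps re-publishing NodeNotReady, which is the loop visible throughout this section.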
Has your network provider started?"} Nov 22 02:53:34 crc kubenswrapper[4922]: E1122 02:53:34.727653 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e949e6da-04e3-4086-acd2-53671701d8c1\\\",\\\"systemUUID\\\":\\\"7e87d562-50ca-4b7a-8b5f-35220f4abd2d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:34Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.740897 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.741012 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.741045 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.741089 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.741119 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:34Z","lastTransitionTime":"2025-11-22T02:53:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:34 crc kubenswrapper[4922]: E1122 02:53:34.761245 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e949e6da-04e3-4086-acd2-53671701d8c1\\\",\\\"systemUUID\\\":\\\"7e87d562-50ca-4b7a-8b5f-35220f4abd2d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:34Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.768039 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.768121 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.768136 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.768156 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.768169 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:34Z","lastTransitionTime":"2025-11-22T02:53:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:34 crc kubenswrapper[4922]: E1122 02:53:34.786968 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e949e6da-04e3-4086-acd2-53671701d8c1\\\",\\\"systemUUID\\\":\\\"7e87d562-50ca-4b7a-8b5f-35220f4abd2d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:34Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.792685 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.792762 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.792780 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.792808 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.792829 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:34Z","lastTransitionTime":"2025-11-22T02:53:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:34 crc kubenswrapper[4922]: E1122 02:53:34.810680 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e949e6da-04e3-4086-acd2-53671701d8c1\\\",\\\"systemUUID\\\":\\\"7e87d562-50ca-4b7a-8b5f-35220f4abd2d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:34Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.816915 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.817007 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.817027 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.817053 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.817074 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:34Z","lastTransitionTime":"2025-11-22T02:53:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:34 crc kubenswrapper[4922]: E1122 02:53:34.834375 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e949e6da-04e3-4086-acd2-53671701d8c1\\\",\\\"systemUUID\\\":\\\"7e87d562-50ca-4b7a-8b5f-35220f4abd2d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:34Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:34 crc kubenswrapper[4922]: E1122 02:53:34.834553 4922 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.837149 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.837218 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.837243 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.837274 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.837296 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:34Z","lastTransitionTime":"2025-11-22T02:53:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.940944 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.941107 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.941130 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.941202 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:34 crc kubenswrapper[4922]: I1122 02:53:34.941224 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:34Z","lastTransitionTime":"2025-11-22T02:53:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.046083 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.046198 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.046218 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.046284 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.046316 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:35Z","lastTransitionTime":"2025-11-22T02:53:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.150677 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.150752 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.150771 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.150797 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.150816 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:35Z","lastTransitionTime":"2025-11-22T02:53:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.254511 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.254570 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.254581 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.254598 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.254609 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:35Z","lastTransitionTime":"2025-11-22T02:53:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.300222 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.300324 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.300396 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2gmkj" Nov 22 02:53:35 crc kubenswrapper[4922]: E1122 02:53:35.300468 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:53:35 crc kubenswrapper[4922]: E1122 02:53:35.300630 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2gmkj" podUID="d5c8000a-a783-474f-a73a-55814c257a02" Nov 22 02:53:35 crc kubenswrapper[4922]: E1122 02:53:35.301242 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.301964 4922 scope.go:117] "RemoveContainer" containerID="81ae17a6779a8d3de7ab3da7eb5399094b7fde374d0b866b515c9768de61ed89" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.321901 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-df5ll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5672d56-8abd-4aa4-ac8d-1655896397f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a64d211a2e298509a2677a0d64bf8205f1a62e952ee786f34571d4c1e962b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klcgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\
"startTime\\\":\\\"2025-11-22T02:53:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-df5ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:35Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.337993 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2gmkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c8000a-a783-474f-a73a-55814c257a02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pk97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pk97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2gmkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:35Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.355315 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:35Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.359927 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.359996 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.360018 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.360047 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.360069 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:35Z","lastTransitionTime":"2025-11-22T02:53:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.376398 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:35Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.400730 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:35Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.423612 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8f0f91018d8ffcc35550e1e567a256d60f9cef7fb07a7c3c21d393e6ff5653f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:35Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.449166 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4gbc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"954bb7b8-d710-4e1a-973e-78c04e685f30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8705bc3652c9a3f838ffdc44bb26f87689d9eca09105427a3dc25b730967ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-52z2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4gbc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:35Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.463641 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.463703 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.463717 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.463740 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.463757 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:35Z","lastTransitionTime":"2025-11-22T02:53:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.491147 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8552c75b-ff86-4366-ba7d-31f454e4c4d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095da4ba1e99f9564ae3e7c95e0319b9b03a655b824f73a40f378292bd16a613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c36d97db0f554e895c9a66ab9d98a846dab9ea7fdc6fb1694a1e6ba6e8e03caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5806c02c9ab10a6ae71658c5c26a959a0c1af4dc116eebb10456e842e18bc265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e578ec32c48afdd4b1beffcba79284416d5d388027f807e195d552777eb3ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7d8908fd03ef5111feb365e6882679f026d46d0b404e585b0fb248b3ffccc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:35Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.520102 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"912136b8-9119-4585-8a8a-fffffe458b82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1364d5e8967ee57d3fe7c3ca85053dbeee65956324a6ff45f53339ee6a380e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1b62ea65c78bb4f19f5fdb2ebed6da4b5554981906058d4d33472e53eecebe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa36ff8bd81008d118d746d10130faac5df3ffdd243309f12b2444acf8e0ac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fe422173d9c9619e9ccbf14fa6b45cd384fb3b32973aa7c32b7597d23818ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1ba37f0d76415fadbf5fe5c3e7b0d0ac1e04923deb445e3ffebaa45f9a3284d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:52:51Z\\\",\\\"message\\\":\\\"W1122 02:52:50.548479 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 02:52:50.548907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763779970 cert, and key in /tmp/serving-cert-1581887030/serving-signer.crt, /tmp/serving-cert-1581887030/serving-signer.key\\\\nI1122 02:52:50.747646 1 observer_polling.go:159] Starting file observer\\\\nW1122 02:52:50.754705 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 02:52:50.754978 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:52:50.756448 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1581887030/tls.crt::/tmp/serving-cert-1581887030/tls.key\\\\\\\"\\\\nF1122 02:52:51.071644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://892025699a91526b7be50d611c0c47a78d080b10f5001683ba5b6614e1168759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:35Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.550888 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c46a871bcc893c29dd8f87750ec890ac870f3f59615c5951f473288a19b460c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:35Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.568412 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.568457 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.568468 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.568489 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.568502 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:35Z","lastTransitionTime":"2025-11-22T02:53:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.571928 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ntq2p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"087141fd-fc9b-4685-bada-20260ee96369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb4eafebbd378af8579bc424a11b51eeda38398db994f81c408b9bead685713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4fck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ntq2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:35Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.602500 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n664h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"826eb92e-c839-4fba-9737-3b52101fec88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccfd5290a47556c0ac2088ad8d34ab89870aec1d03311431a50dd857a43e5dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0125b85a532a9445c19a61f7bdd47ccd48ee48f067df5c041c6bf98daae9630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0125b85a532a9445c19a61f7bdd47ccd48ee48f067df5c041c6bf98daae9630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n664h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:35Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.627405 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v8z92" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da6031ac-2dd7-43c9-b9e2-4c6394e23c1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9c2863c3dce12e19530e9ec5d083788e8426a5f34a325c4a8b5dcbb29b454ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7cbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a6ba782e23e8c6a485ec4c29d9adefe33028022db04c1a66d20e49a73c40a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7cbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v8z92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:35Z is after 2025-08-24T17:21:41Z" Nov 22 
02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.650375 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a581af85a69caa65abe5f7766482dd0b71eaf201d43e354b8316ff616c865193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2d349f9466eb418a64dabae95c139bf23e943909a37b6b651c90e85ff55bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:35Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.671740 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e5537a6-a41f-446d-ab40-ec8c7a3edc90\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaf643c85775379271a26c718e36d87319342e6797522f91726ce4084925f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205de8f1aeca28c7877b83da70915cd216631f5d7c28b0c9f3bb748a8f98547e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7d163b34cfb34723c07cfa4592a134291185548d4294cb6fcd1d5d8ab63617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9608708be558bf83d1156f620534bd4c98b8f4fb7f96c972be8dc67524f4546c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:35Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.672390 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.672436 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.672455 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.672481 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.672500 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:35Z","lastTransitionTime":"2025-11-22T02:53:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.690282 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"402683b1-a29f-4a79-a36c-daf6e8068d0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cf0de2e7bcabf2c1beae2a37caaddb2de2365480082bd95a7843816ec8e3ceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d254b4b131f36e4aae4019c29e8c53f3f8df991b85eb6f943fb6d2e9d79552f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9j6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:35Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.715969 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e0fc46c7e9bbed62a72d78126ea7ffcb4de50dc048c8c0fd90bd0b0d8c718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd701fe5abb8c240fb633df69fbda2e7fb1ac4c76195dae8dfd5b1db811c1d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://091e9164e566847ab0605abc3b8f32027bcccff041c48fbf251d28690dcc132d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5603a0776e71297b57ae6a438254d6d8abb317252617fbe3b28707e9f96424d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7854642894dce2837e59481c400522a24a2e66c3f9eb521a370e85b203288af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ceb46477edc2737cbdc850d6387ba175a07e85e69c988f91fc93def44fa1bf\\\",\\\"image
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81ae17a6779a8d3de7ab3da7eb5399094b7fde374d0b866b515c9768de61ed89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ae17a6779a8d3de7ab3da7eb5399094b7fde374d0b866b515c9768de61ed89\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"message\\\":\\\"where column _uuid == {6ea1fd71-2b40-4361-92ee-3f1ab4ec7414}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1122 02:53:22.196651 6374 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_TCP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53: 10.217.4.10:9154:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {be9dcc9e-c16a-4962-a6d2-4adeb0b929c4}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_UDP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[udp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4c1be812-05d3-4f45-91b5-a853a5c8de71}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e 
UUID\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-c7wvg_openshift-ovn-kubernetes(c2a6bcd8-bb13-463b-b112-0df3cf90b5f7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://183f71d77b7a1768eee27187f0c1e4a893f5661fa42cf671332e081118ae7dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiv
eReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c7wvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:35Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.776928 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.777032 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.777090 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.777122 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.777184 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:35Z","lastTransitionTime":"2025-11-22T02:53:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.779361 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c7wvg_c2a6bcd8-bb13-463b-b112-0df3cf90b5f7/ovnkube-controller/1.log" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.784402 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" event={"ID":"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7","Type":"ContainerStarted","Data":"34d1b1289ea6a27a227e433d072bfe6e350bbb997ddbc88ceb665eaa37794581"} Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.784623 4922 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.805701 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v8z92" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da6031ac-2dd7-43c9-b9e2-4c6394e23c1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9c2863c3dce12e19530e9ec5d083788e8426a5f34a325c4a8b5dcbb29b454ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7cbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a6ba782e23e8c6a485ec4c29d9adefe33028022db04c1a66d20e49a73c40a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\
"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7cbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v8z92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:35Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.846598 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8552c75b-ff86-4366-ba7d-31f454e4c4d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095da4ba1e99f9564ae3e7c95e0319b9b03a655b824f73a40f378292bd16a613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c36d97db0f554e895c9a66ab9d98a846dab9ea7fdc6fb1694a1e6ba6e8e03caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},
\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5806c02c9ab10a6ae71658c5c26a959a0c1af4dc116eebb10456e842e18bc265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e578ec32c48afdd4b1beffcba79284416d5d388027f807e195d552777eb3ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7d8908fd03ef5111feb365e6882679f026d46d0b404e585b0fb248b3ffccc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"exitCode\\
\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:35Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.877805 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"912136b8-9119-4585-8a8a-fffffe458b82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1364d5e8967ee57d3fe7c3ca85053dbeee65956324a6ff45f53339ee6a380e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1b62ea65c78bb4f19f5fdb2ebed6da4b5554981906058d4d33472e53eecebe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa36ff8bd81008d118d746d10130faac5df3ffdd243309f12b2444acf8e0ac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fe422173d9c9619e9ccbf14fa6b45cd384fb3b32973aa7c32b7597d23818ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1ba37f0d76415fadbf5fe5c3e7b0d0ac1e04923deb445e3ffebaa45f9a3284d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:52:51Z\\\",\\\"message\\\":\\\"W1122 02:52:50.548479 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 02:52:50.548907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763779970 cert, and key in /tmp/serving-cert-1581887030/serving-signer.crt, /tmp/serving-cert-1581887030/serving-signer.key\\\\nI1122 02:52:50.747646 1 observer_polling.go:159] Starting file observer\\\\nW1122 02:52:50.754705 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 02:52:50.754978 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:52:50.756448 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1581887030/tls.crt::/tmp/serving-cert-1581887030/tls.key\\\\\\\"\\\\nF1122 02:52:51.071644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://892025699a91526b7be50d611c0c47a78d080b10f5001683ba5b6614e1168759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:35Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.883523 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.883591 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.883609 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.883630 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.883658 4922 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:35Z","lastTransitionTime":"2025-11-22T02:53:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.906997 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c46a871bcc893c29dd8f87750ec890ac870f3f59615c5951f473288a19b460c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:35Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.925088 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ntq2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"087141fd-fc9b-4685-bada-20260ee96369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb4eafebbd378af8579bc424a11b51eeda38398db994f81c408b9bead685713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4fck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ntq2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:35Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.949107 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n664h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"826eb92e-c839-4fba-9737-3b52101fec88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccfd5290a47556c0ac2088ad8d34ab89870aec1d03311431a50dd857a43e5dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0125b85a532a9445c19a61f7bdd47ccd48ee48f067df5c041c6bf98daae9630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0125b85a532a9445c19a61f7bdd47ccd48ee48f067df5c041c6bf98daae9630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n664h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:35Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.972925 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a581af85a69caa65abe5f7766482dd0b71eaf201d43e354b8316ff616c865193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2d349f9466eb418a64dabae95c139bf23e943909a37b6b651c90e85ff55bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:35Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.986467 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.986517 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.986531 4922 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.986551 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.986565 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:35Z","lastTransitionTime":"2025-11-22T02:53:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:35 crc kubenswrapper[4922]: I1122 02:53:35.991941 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e5537a6-a41f-446d-ab40-ec8c7a3edc90\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaf643c85775379271a26c718e36d87319342e6797522f91726ce4084925f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205de8f1aeca28c7877b83da70915cd216631f5d7c28b0c9f3bb748a8f98547e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7d163b34cfb34723c07cfa4592a134291185548d4294cb6fcd1d5d8ab63617\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9608708be558bf83d1156f620534bd4c98b8f4fb7f96c972be8dc67524f4546c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:35Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.008919 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"402683b1-a29f-4a79-a36c-daf6e8068d0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cf0de2e7bcabf2c1beae2a37caaddb2de2365480082bd95a7843816ec8e3ceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d254b4b131f36e4aae4019c29e8c53f3f8df991b85eb6f943fb6d2e9d79552f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9j6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:36Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.030165 4922 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e0fc46c7e9bbed62a72d78126ea7ffcb4de50dc048c8c0fd90bd0b0d8c718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd701fe5abb8c240fb633df69fbda2e7fb1ac4c76195dae8dfd5b1db811c1d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://091e9164e566847ab0605abc3b8f32027bcccff041c48fbf251d28690dcc132d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5603a0776e71297b57ae6a438254d6d8abb317252617fbe3b28707e9f96424d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7854642894dce2837e59481c400522a24a2e66c3f9eb521a370e85b203288af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ceb46477edc2737cbdc850d6387ba175a07e85e69c988f91fc93def44fa1bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34d1b1289ea6a27a227e433d072bfe6e350bbb997ddbc88ceb665eaa37794581\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ae17a6779a8d3de7ab3da7eb5399094b7fde374d0b866b515c9768de61ed89\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"message\\\":\\\"where column _uuid == {6ea1fd71-2b40-4361-92ee-3f1ab4ec7414}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1122 02:53:22.196651 6374 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_TCP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53: 10.217.4.10:9154:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {be9dcc9e-c16a-4962-a6d2-4adeb0b929c4}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_UDP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[udp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4c1be812-05d3-4f45-91b5-a853a5c8de71}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e 
UUID\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://183f71d77b7a1768eee27187f0c1e4a893f5661fa42cf671332e081118ae7dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"c
ontainerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c7wvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:36Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.046538 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4gbc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"954bb7b8-d710-4e1a-973e-78c04e685f30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8705bc3652c9a3f838ffdc44bb26f87689d9eca09105427a3dc25b730967ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52z2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4gbc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:36Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.058049 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-df5ll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5672d56-8abd-4aa4-ac8d-1655896397f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a64d211a2e298509a2677a0d64bf8205f1a62e952ee786f34571d4c1e962b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klcgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-df5ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:36Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.079248 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2gmkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c8000a-a783-474f-a73a-55814c257a02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pk97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pk97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2gmkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:36Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.090085 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.090130 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.090144 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.090166 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.090182 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:36Z","lastTransitionTime":"2025-11-22T02:53:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.096238 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:36Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.116630 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:36Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.131681 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:36Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.146853 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8f0f91018d8ffcc35550e1e567a256d60f9cef7fb07a7c3c21d393e6ff5653f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:36Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.193318 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:36 
crc kubenswrapper[4922]: I1122 02:53:36.193364 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.193376 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.193391 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.193402 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:36Z","lastTransitionTime":"2025-11-22T02:53:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.296316 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.296366 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.296379 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.296402 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.296419 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:36Z","lastTransitionTime":"2025-11-22T02:53:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.299871 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:53:36 crc kubenswrapper[4922]: E1122 02:53:36.300077 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.399594 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.399692 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.399715 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.399749 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.399771 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:36Z","lastTransitionTime":"2025-11-22T02:53:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.503558 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.503626 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.503646 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.503675 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.503693 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:36Z","lastTransitionTime":"2025-11-22T02:53:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.607739 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.607815 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.607832 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.607886 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.607905 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:36Z","lastTransitionTime":"2025-11-22T02:53:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.710767 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.710815 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.710829 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.710875 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.710894 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:36Z","lastTransitionTime":"2025-11-22T02:53:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.792404 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c7wvg_c2a6bcd8-bb13-463b-b112-0df3cf90b5f7/ovnkube-controller/2.log" Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.793607 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c7wvg_c2a6bcd8-bb13-463b-b112-0df3cf90b5f7/ovnkube-controller/1.log" Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.801251 4922 generic.go:334] "Generic (PLEG): container finished" podID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerID="34d1b1289ea6a27a227e433d072bfe6e350bbb997ddbc88ceb665eaa37794581" exitCode=1 Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.801300 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" event={"ID":"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7","Type":"ContainerDied","Data":"34d1b1289ea6a27a227e433d072bfe6e350bbb997ddbc88ceb665eaa37794581"} Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.801342 4922 scope.go:117] "RemoveContainer" containerID="81ae17a6779a8d3de7ab3da7eb5399094b7fde374d0b866b515c9768de61ed89" Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.802361 4922 scope.go:117] "RemoveContainer" containerID="34d1b1289ea6a27a227e433d072bfe6e350bbb997ddbc88ceb665eaa37794581" Nov 22 02:53:36 crc kubenswrapper[4922]: E1122 02:53:36.802618 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-c7wvg_openshift-ovn-kubernetes(c2a6bcd8-bb13-463b-b112-0df3cf90b5f7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" podUID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.813164 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.813269 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.813290 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.813324 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.813345 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:36Z","lastTransitionTime":"2025-11-22T02:53:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.832353 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n664h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"826eb92e-c839-4fba-9737-3b52101fec88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccfd5290a47556c0ac2088ad8d34ab89870aec1d03311431a50dd857a43e5dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0125b85a532a9445c19a61f7bdd47ccd48ee48f067df5c041c6bf98daae9630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0125b85a532a9445c19a61f7bdd47ccd48ee48f067df5c041c6bf98daae9630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n664h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:36Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.854073 4922 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v8z92" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da6031ac-2dd7-43c9-b9e2-4c6394e23c1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9c2863c3dce12e19530e9ec5d083788e8426a5f34a325c4a8b5dcbb29b454ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7cbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a6ba782e23e8c6a485ec4c29d9adefe33028022db04c1a66d20e49a73c40a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7cbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v8z92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-22T02:53:36Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.890571 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8552c75b-ff86-4366-ba7d-31f454e4c4d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095da4ba1e99f9564ae3e7c95e0319b9b03a655b824f73a40f378292bd16a613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c36d97db0f554e895c9a66ab9d98a846dab9ea7fdc6fb1694a1e6ba6e8e03caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5806c02c9ab10a6ae71658c5c26a959a0c1af4dc116eebb10456e842e18bc265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-2
2T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e578ec32c48afdd4b1beffcba79284416d5d388027f807e195d552777eb3ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7d8908fd03ef5111feb365e6882679f026d46d0b404e585b0fb248b3ffccc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386
af495a39a388f89842ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:36Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.911636 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"912136b8-9119-4585-8a8a-fffffe458b82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1364d5e8967ee57d3fe7c3ca85053dbeee65956324a6ff45f53339ee6a380e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1b62ea65c78bb4f19f5fdb2ebed6da4b5554981906058d4d33472e53eecebe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa36ff8bd81008d118d746d10130faac5df3ffdd243309f12b2444acf8e0ac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fe422173d9c9619e9ccbf14fa6b45cd384fb3b32973aa7c32b7597d23818ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1ba37f0d76415fadbf5fe5c3e7b0d0ac1e04923deb445e3ffebaa45f9a3284d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:52:51Z\\\",\\\"message\\\":\\\"W1122 02:52:50.548479 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 02:52:50.548907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763779970 cert, and key in /tmp/serving-cert-1581887030/serving-signer.crt, /tmp/serving-cert-1581887030/serving-signer.key\\\\nI1122 02:52:50.747646 1 observer_polling.go:159] Starting file observer\\\\nW1122 02:52:50.754705 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 02:52:50.754978 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:52:50.756448 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1581887030/tls.crt::/tmp/serving-cert-1581887030/tls.key\\\\\\\"\\\\nF1122 02:52:51.071644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://892025699a91526b7be50d611c0c47a78d080b10f5001683ba5b6614e1168759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:36Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.917685 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.917737 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.917750 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.917776 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.917793 4922 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:36Z","lastTransitionTime":"2025-11-22T02:53:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.940181 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c46a871bcc893c29dd8f87750ec890ac870f3f59615c5951f473288a19b460c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:36Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.959457 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ntq2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"087141fd-fc9b-4685-bada-20260ee96369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb4eafebbd378af8579bc424a11b51eeda38398db994f81c408b9bead685713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4fck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ntq2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:36Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:36 crc kubenswrapper[4922]: I1122 02:53:36.981303 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a581af85a69caa65abe5f7766482dd0b71eaf201d43e354b8316ff616c865193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2d349f9466eb418a64dabae95c139bf23e943909a37b6b651c90e85ff55bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:36Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.001509 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e5537a6-a41f-446d-ab40-ec8c7a3edc90\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaf643c85775379271a26c718e36d87319342e6797522f91726ce4084925f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205de8f1aeca28c7877b83da70915cd216631f5d7c28b0c9f3bb748a8f98547e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7d163b34cfb34723c07cfa4592a134291185548d4294cb6fcd1d5d8ab63617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9608708be558bf83d1156f620534bd4c98b8f4fb7f96c972be8dc67524f4546c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:36Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.016989 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"402683b1-a29f-4a79-a36c-daf6e8068d0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cf0de2e7bcabf2c1beae2a37caaddb2de2365480082bd95a7843816ec8e3ceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d254b4b131f36e4aae
4019c29e8c53f3f8df991b85eb6f943fb6d2e9d79552f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9j6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:37Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.021896 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.021962 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.021980 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.022017 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.022037 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:37Z","lastTransitionTime":"2025-11-22T02:53:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.055802 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e0fc46c7e9bbed62a72d78126ea7ffcb4de50dc048c8c0fd90bd0b0d8c718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd701fe5abb8c240fb633df69fbda2e7fb1ac4c76195dae8dfd5b1db811c1d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://091e9164e566847ab0605abc3b8f32027bcccff041c48fbf251d28690dcc132d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5603a0776e71297b57ae6a438254d6d8abb317252617fbe3b28707e9f96424d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7854642894dce2837e59481c400522a24a2e66c3f9eb521a370e85b203288af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ceb46477edc2737cbdc850d6387ba175a07e85e69c988f91fc93def44fa1bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34d1b1289ea6a27a227e433d072bfe6e350bbb997ddbc88ceb665eaa37794581\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ae17a6779a8d3de7ab3da7eb5399094b7fde374d0b866b515c9768de61ed89\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"message\\\":\\\"where column _uuid == {6ea1fd71-2b40-4361-92ee-3f1ab4ec7414}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1122 02:53:22.196651 6374 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_TCP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53: 10.217.4.10:9154:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {be9dcc9e-c16a-4962-a6d2-4adeb0b929c4}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_UDP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[udp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4c1be812-05d3-4f45-91b5-a853a5c8de71}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e 
UUID\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34d1b1289ea6a27a227e433d072bfe6e350bbb997ddbc88ceb665eaa37794581\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:53:36Z\\\",\\\"message\\\":\\\"lbConfig(nil)\\\\nI1122 02:53:36.370505 6577 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1122 02:53:36.370516 6577 services_controller.go:451] Built service openshift-ingress-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-ingress-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-ingress-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.244\\\\\\\", Port:9393, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1122 02:53:36.370543 6577 services_controller.go:452] Built service openshift-ingress-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1122 02:53:36.370578 6577 services_controller.go:453] Built service openshift-ingress-operator/metrics template LB for network=default: []services.LB{}\\\\nF1122 02:53:36.370585 6577 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://183f71d77b7a1768eee27187f0c1e4a893f5661fa42cf671332e081118ae7dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c7wvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:37Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.077050 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8f0f91018d8ffcc35550e1e567a256d60f9cef7fb07a7c3c21d393e6ff5653f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:37Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.102197 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4gbc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"954bb7b8-d710-4e1a-973e-78c04e685f30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8705bc3652c9a3f838ffdc44bb26f87689d9eca09105427a3dc25b730967ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52z2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4gbc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:37Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.119760 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-df5ll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5672d56-8abd-4aa4-ac8d-1655896397f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a64d211a2e298509a2677a0d64bf8205f1a62e952ee786f34571d4c1e962b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klcgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-df5ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:37Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.125692 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 
02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.125756 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.125775 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.125807 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.125829 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:37Z","lastTransitionTime":"2025-11-22T02:53:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.137604 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2gmkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c8000a-a783-474f-a73a-55814c257a02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pk97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pk97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2gmkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:37Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.157527 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:37Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.173269 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:37Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.194907 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:37Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.229013 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.229101 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.229123 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.229155 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.229178 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:37Z","lastTransitionTime":"2025-11-22T02:53:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.300125 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.300175 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2gmkj" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.300180 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:53:37 crc kubenswrapper[4922]: E1122 02:53:37.300363 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:53:37 crc kubenswrapper[4922]: E1122 02:53:37.301270 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2gmkj" podUID="d5c8000a-a783-474f-a73a-55814c257a02" Nov 22 02:53:37 crc kubenswrapper[4922]: E1122 02:53:37.302025 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.332466 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.332532 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.332546 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.332570 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.332585 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:37Z","lastTransitionTime":"2025-11-22T02:53:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.435957 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.436027 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.436042 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.436064 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.436081 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:37Z","lastTransitionTime":"2025-11-22T02:53:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.472419 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.540406 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.540474 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.540492 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.540520 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.540546 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:37Z","lastTransitionTime":"2025-11-22T02:53:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.643724 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.643828 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.643894 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.643935 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.643965 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:37Z","lastTransitionTime":"2025-11-22T02:53:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.746947 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.746987 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.747067 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.747103 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.747112 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:37Z","lastTransitionTime":"2025-11-22T02:53:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.808624 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c7wvg_c2a6bcd8-bb13-463b-b112-0df3cf90b5f7/ovnkube-controller/2.log" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.814756 4922 scope.go:117] "RemoveContainer" containerID="34d1b1289ea6a27a227e433d072bfe6e350bbb997ddbc88ceb665eaa37794581" Nov 22 02:53:37 crc kubenswrapper[4922]: E1122 02:53:37.815490 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-c7wvg_openshift-ovn-kubernetes(c2a6bcd8-bb13-463b-b112-0df3cf90b5f7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" podUID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.837769 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e5537a6-a41f-446d-ab40-ec8c7a3edc90\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaf643c85775379271a26c718e36d87319342e6797522f91726ce4084925f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205de8f1aeca28c7877b83da70915cd216631f5d7c28b0c9f3bb748a8f98547e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7d163b34cfb34723c07cfa4592a134291185548d4294cb6fcd1d5d8ab63617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9608708be558bf83d1156f620534bd4c98b8f4fb7f96c972be8dc67524f4546c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:37Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.849818 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.849868 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.849878 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.849896 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.849933 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:37Z","lastTransitionTime":"2025-11-22T02:53:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.858258 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"402683b1-a29f-4a79-a36c-daf6e8068d0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cf0de2e7bcabf2c1beae2a37caaddb2de2365480082bd95a7843816ec8e3ceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d254b4b131f36e4aae4019c29e8c53f3f8df991b85eb6f943fb6d2e9d79552f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9j6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:37Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.890622 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e0fc46c7e9bbed62a72d78126ea7ffcb4de50dc048c8c0fd90bd0b0d8c718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd701fe5abb8c240fb633df69fbda2e7fb1ac4c76195dae8dfd5b1db811c1d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://091e9164e566847ab0605abc3b8f32027bcccff041c48fbf251d28690dcc132d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5603a0776e71297b57ae6a438254d6d8abb317252617fbe3b28707e9f96424d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7854642894dce2837e59481c400522a24a2e66c3f9eb521a370e85b203288af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ceb46477edc2737cbdc850d6387ba175a07e85e69c988f91fc93def44fa1bf\\\",\\\"image
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34d1b1289ea6a27a227e433d072bfe6e350bbb997ddbc88ceb665eaa37794581\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34d1b1289ea6a27a227e433d072bfe6e350bbb997ddbc88ceb665eaa37794581\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:53:36Z\\\",\\\"message\\\":\\\"lbConfig(nil)\\\\nI1122 02:53:36.370505 6577 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1122 02:53:36.370516 6577 services_controller.go:451] Built service openshift-ingress-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-ingress-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-ingress-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.244\\\\\\\", Port:9393, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1122 02:53:36.370543 6577 services_controller.go:452] Built service openshift-ingress-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1122 02:53:36.370578 6577 services_controller.go:453] Built service openshift-ingress-operator/metrics template LB for network=default: []services.LB{}\\\\nF1122 02:53:36.370585 6577 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-c7wvg_openshift-ovn-kubernetes(c2a6bcd8-bb13-463b-b112-0df3cf90b5f7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://183f71d77b7a1768eee27187f0c1e4a893f5661fa42cf671332e081118ae7dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recurs
iveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c7wvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:37Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.914482 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:37Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.935836 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:37Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.953358 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.953431 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.953450 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.953484 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.953508 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:37Z","lastTransitionTime":"2025-11-22T02:53:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.955731 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:37Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.976872 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8f0f91018d8ffcc35550e1e567a256d60f9cef7fb07a7c3c21d393e6ff5653f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:37Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:37 crc kubenswrapper[4922]: I1122 02:53:37.998237 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4gbc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"954bb7b8-d710-4e1a-973e-78c04e685f30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8705bc3652c9a3f838ffdc44bb26f87689d9eca09105427a3dc25b730967ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52z2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4gbc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:37Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.013091 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-df5ll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5672d56-8abd-4aa4-ac8d-1655896397f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a64d211a2e298509a2677a0d64bf8205f1a62e952ee786f34571d4c1e962b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klcgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-df5ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:38Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.028664 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2gmkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c8000a-a783-474f-a73a-55814c257a02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pk97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pk97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2gmkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:38Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.054392 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8552c75b-ff86-4366-ba7d-31f454e4c4d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095da4ba1e99f9564ae3e7c95e0319b9b03a655b824f73a40f378292bd16a613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c36d97db0f554e895c9a66ab9d98a846dab9ea7fdc6fb1694a1e6ba6e8e03caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5806c02c9ab10a6ae71658c5c26a959a0c1af4dc116eebb10456e842e18bc265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e578ec32c48afdd4b1beffcba79284416d5d38
8027f807e195d552777eb3ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7d8908fd03ef5111feb365e6882679f026d46d0b404e585b0fb248b3ffccc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:38Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.057181 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.057219 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.057230 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.057250 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.057263 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:38Z","lastTransitionTime":"2025-11-22T02:53:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.073706 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"912136b8-9119-4585-8a8a-fffffe458b82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1364d5e8967ee57d3fe7c3ca85053dbeee65956324a6ff45f53339ee6a380e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1b62ea65c78bb4f19f5fdb2ebed6da4b5554981906058d4d33472e53eecebe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa36ff8bd81008d118d746d10130faac5df3ffdd243309f12b2444acf8e0ac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fe422173d9c9619e9ccbf14fa6b45cd384fb3b32973aa7c32b7597d23818ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1ba37f0d76415fadbf5fe5c3e7b0d0ac1e04923deb445e3ffebaa45f9a3284d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:52:51Z\\\",\\\"message\\\":\\\"W1122 02:52:50.548479 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 02:52:50.548907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763779970 cert, and key in /tmp/serving-cert-1581887030/serving-signer.crt, /tmp/serving-cert-1581887030/serving-signer.key\\\\nI1122 02:52:50.747646 1 observer_polling.go:159] Starting file observer\\\\nW1122 02:52:50.754705 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 02:52:50.754978 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:52:50.756448 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1581887030/tls.crt::/tmp/serving-cert-1581887030/tls.key\\\\\\\"\\\\nF1122 02:52:51.071644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://892025699a91526b7be50d611c0c47a78d080b10f5001683ba5b6614e1168759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:38Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.096944 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c46a871bcc893c29dd8f87750ec890ac870f3f59615c5951f473288a19b460c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:38Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.112450 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ntq2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"087141fd-fc9b-4685-bada-20260ee96369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb4eafebbd378af8579bc424a11b51eeda38398db994f81c408b9bead685713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4fck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ntq2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:38Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.132424 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n664h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"826eb92e-c839-4fba-9737-3b52101fec88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccfd5290a47556c0ac2088ad8d34ab89870aec1d03311431a50dd857a43e5dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0125b85a532a9445c19a61f7bdd47ccd48ee48f067df5c041c6bf98daae9630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0125b85a532a9445c19a61f7bdd47ccd48ee48f067df5c041c6bf98daae9630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n664h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:38Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.153587 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v8z92" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da6031ac-2dd7-43c9-b9e2-4c6394e23c1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9c2863c3dce12e19530e9ec5d083788e8426a5f34a325c4a8b5dcbb29b454ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7cbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a6ba782e23e8c6a485ec4c29d9adefe33028022db04c1a66d20e49a73c40a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7cbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v8z92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:38Z is after 2025-08-24T17:21:41Z" Nov 22 
02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.160899 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.160966 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.160983 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.161013 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.161032 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:38Z","lastTransitionTime":"2025-11-22T02:53:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.177607 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a581af85a69caa65abe5f7766482dd0b71eaf201d43e354b8316ff616c865193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2d349f9466eb418a64dabae95c139bf23e943909a37b6b651c90e85ff55bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:38Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.264656 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.264746 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.264770 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.264804 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.264831 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:38Z","lastTransitionTime":"2025-11-22T02:53:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.300270 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:53:38 crc kubenswrapper[4922]: E1122 02:53:38.300484 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.368571 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.368651 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.368675 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.368703 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.368725 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:38Z","lastTransitionTime":"2025-11-22T02:53:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.472456 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.472513 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.472531 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.472557 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.472576 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:38Z","lastTransitionTime":"2025-11-22T02:53:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.583654 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.583725 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.583746 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.583776 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.583797 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:38Z","lastTransitionTime":"2025-11-22T02:53:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.687272 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.687356 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.687378 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.687461 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.687484 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:38Z","lastTransitionTime":"2025-11-22T02:53:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.791191 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.791265 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.791278 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.791299 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.791314 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:38Z","lastTransitionTime":"2025-11-22T02:53:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
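The status-patch failure recorded above at 02:53:38.177607 is the symptom worth isolating here: the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 serves a certificate that expired on 2025-08-24T17:21:41Z while the node clock reads 2025-11-22, so every kubelet status patch is rejected at the TLS layer. A minimal Go sketch for confirming a certificate's validity window against the local clock; the input path is a placeholder, not a file named in this log:

// certcheck.go: print a PEM certificate's NotBefore/NotAfter and compare
// against the local clock, mirroring the x509 "certificate has expired"
// failure above. The certificate path is an assumed placeholder.
package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"log"
	"os"
	"time"
)

func main() {
	data, err := os.ReadFile("/path/to/webhook-serving-cert.pem") // placeholder path
	if err != nil {
		log.Fatal(err)
	}
	block, _ := pem.Decode(data)
	if block == nil {
		log.Fatal("no PEM block found")
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Printf("NotBefore: %s\nNotAfter:  %s\n", cert.NotBefore, cert.NotAfter)
	if time.Now().After(cert.NotAfter) {
		fmt.Println("certificate has expired")
	}
}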
Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.810869 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d5c8000a-a783-474f-a73a-55814c257a02-metrics-certs\") pod \"network-metrics-daemon-2gmkj\" (UID: \"d5c8000a-a783-474f-a73a-55814c257a02\") " pod="openshift-multus/network-metrics-daemon-2gmkj"
Nov 22 02:53:38 crc kubenswrapper[4922]: E1122 02:53:38.811134 4922 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Nov 22 02:53:38 crc kubenswrapper[4922]: E1122 02:53:38.811265 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5c8000a-a783-474f-a73a-55814c257a02-metrics-certs podName:d5c8000a-a783-474f-a73a-55814c257a02 nodeName:}" failed. No retries permitted until 2025-11-22 02:53:54.81123412 +0000 UTC m=+70.849756052 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d5c8000a-a783-474f-a73a-55814c257a02-metrics-certs") pod "network-metrics-daemon-2gmkj" (UID: "d5c8000a-a783-474f-a73a-55814c257a02") : object "openshift-multus"/"metrics-daemon-secret" not registered
Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.894924 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.894995 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.895008 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.895028 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.895043 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:38Z","lastTransitionTime":"2025-11-22T02:53:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
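Note the durationBeforeRetry values: 16s for the metrics-certs mount above, 32s for the operations that follow. The kubelet retries failed volume operations with per-volume exponential backoff, doubling the wait after each failure up to a ceiling. A small Go sketch of that doubling schedule; the 500ms starting point and 2m cap are illustrative assumptions, not values taken from this log:

// backoff.go: illustrate the doubling retry delays (16s, then 32s) seen
// in the nestedpendingoperations entries. Start and cap are assumed.
package main

import (
	"fmt"
	"time"
)

func main() {
	delay := 500 * time.Millisecond // assumed initial delay
	maxDelay := 2 * time.Minute     // assumed ceiling
	for attempt := 1; attempt <= 10; attempt++ {
		fmt.Printf("attempt %2d: wait %v before retrying\n", attempt, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}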
Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.998620 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.998693 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.998712 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.998741 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:53:38 crc kubenswrapper[4922]: I1122 02:53:38.998761 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:38Z","lastTransitionTime":"2025-11-22T02:53:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:53:39 crc kubenswrapper[4922]: I1122 02:53:39.013525 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 02:53:39 crc kubenswrapper[4922]: E1122 02:53:39.013790 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:54:11.013741679 +0000 UTC m=+87.052263681 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 02:53:39 crc kubenswrapper[4922]: I1122 02:53:39.014081 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 22 02:53:39 crc kubenswrapper[4922]: E1122 02:53:39.014266 4922 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Nov 22 02:53:39 crc kubenswrapper[4922]: E1122 02:53:39.014374 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 02:54:11.014347593 +0000 UTC m=+87.052869515 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 02:53:39 crc kubenswrapper[4922]: I1122 02:53:39.102909 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:39 crc kubenswrapper[4922]: I1122 02:53:39.103009 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:39 crc kubenswrapper[4922]: I1122 02:53:39.103020 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:39 crc kubenswrapper[4922]: I1122 02:53:39.103036 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:39 crc kubenswrapper[4922]: I1122 02:53:39.103046 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:39Z","lastTransitionTime":"2025-11-22T02:53:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:39 crc kubenswrapper[4922]: I1122 02:53:39.116167 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:53:39 crc kubenswrapper[4922]: I1122 02:53:39.116242 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:53:39 crc kubenswrapper[4922]: I1122 02:53:39.116313 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:53:39 crc kubenswrapper[4922]: E1122 02:53:39.116416 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 02:53:39 crc kubenswrapper[4922]: E1122 02:53:39.116445 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 02:53:39 crc kubenswrapper[4922]: E1122 02:53:39.116459 4922 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 02:53:39 crc kubenswrapper[4922]: E1122 02:53:39.116500 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 02:53:39 crc kubenswrapper[4922]: E1122 02:53:39.116528 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 02:53:39 crc kubenswrapper[4922]: E1122 02:53:39.116530 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-22 02:54:11.116507888 +0000 UTC m=+87.155029900 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 02:53:39 crc kubenswrapper[4922]: E1122 02:53:39.116545 4922 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 02:53:39 crc kubenswrapper[4922]: E1122 02:53:39.116562 4922 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 02:53:39 crc kubenswrapper[4922]: E1122 02:53:39.116627 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-22 02:54:11.11659609 +0000 UTC m=+87.155118022 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 02:53:39 crc kubenswrapper[4922]: E1122 02:53:39.116715 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 02:54:11.116672282 +0000 UTC m=+87.155194354 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Nov 22 02:53:39 crc kubenswrapper[4922]: I1122 02:53:39.207104 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:53:39 crc kubenswrapper[4922]: I1122 02:53:39.207190 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:53:39 crc kubenswrapper[4922]: I1122 02:53:39.207203 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:53:39 crc kubenswrapper[4922]: I1122 02:53:39.207221 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:53:39 crc kubenswrapper[4922]: I1122 02:53:39.207233 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:39Z","lastTransitionTime":"2025-11-22T02:53:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:53:39 crc kubenswrapper[4922]: I1122 02:53:39.299739 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2gmkj"
Nov 22 02:53:39 crc kubenswrapper[4922]: I1122 02:53:39.299804 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 22 02:53:39 crc kubenswrapper[4922]: I1122 02:53:39.299739 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 22 02:53:39 crc kubenswrapper[4922]: E1122 02:53:39.299929 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2gmkj" podUID="d5c8000a-a783-474f-a73a-55814c257a02"
Nov 22 02:53:39 crc kubenswrapper[4922]: E1122 02:53:39.300190 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 22 02:53:39 crc kubenswrapper[4922]: E1122 02:53:39.300453 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
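The "object ... not registered" errors above describe the kubelet's own view: its volume manager has no registered source for those Secrets and ConfigMaps yet while the node is not ready, which is not the same as the objects being absent from the API server. A hedged client-go sketch for checking the API side, using a namespace and name from the entries above; the kubeconfig path is a placeholder and k8s.io/client-go is assumed to be available:

// cmcheck.go: verify that a ConfigMap the kubelet reports as
// "not registered" actually exists in the API server.
// Requires k8s.io/client-go; the kubeconfig path is a placeholder.
package main

import (
	"context"
	"fmt"
	"log"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig") // placeholder path
	if err != nil {
		log.Fatal(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		log.Fatal(err)
	}
	cm, err := cs.CoreV1().ConfigMaps("openshift-network-diagnostics").
		Get(context.TODO(), "kube-root-ca.crt", metav1.GetOptions{})
	if err != nil {
		log.Fatal(err) // a NotFound here would mean the object is truly missing
	}
	fmt.Printf("found %s/%s with %d keys\n", cm.Namespace, cm.Name, len(cm.Data))
}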
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:53:39 crc kubenswrapper[4922]: I1122 02:53:39.310247 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:39 crc kubenswrapper[4922]: I1122 02:53:39.310336 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:39 crc kubenswrapper[4922]: I1122 02:53:39.310657 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:39 crc kubenswrapper[4922]: I1122 02:53:39.310704 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:39 crc kubenswrapper[4922]: I1122 02:53:39.310727 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:39Z","lastTransitionTime":"2025-11-22T02:53:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:39 crc kubenswrapper[4922]: I1122 02:53:39.415029 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:39 crc kubenswrapper[4922]: I1122 02:53:39.415076 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:39 crc kubenswrapper[4922]: I1122 02:53:39.415085 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:39 crc kubenswrapper[4922]: I1122 02:53:39.415103 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:39 crc kubenswrapper[4922]: I1122 02:53:39.415115 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:39Z","lastTransitionTime":"2025-11-22T02:53:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:39 crc kubenswrapper[4922]: I1122 02:53:39.517766 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:39 crc kubenswrapper[4922]: I1122 02:53:39.517830 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:39 crc kubenswrapper[4922]: I1122 02:53:39.517873 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:39 crc kubenswrapper[4922]: I1122 02:53:39.517896 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:39 crc kubenswrapper[4922]: I1122 02:53:39.517914 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:39Z","lastTransitionTime":"2025-11-22T02:53:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:39 crc kubenswrapper[4922]: I1122 02:53:39.620994 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:39 crc kubenswrapper[4922]: I1122 02:53:39.621080 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:39 crc kubenswrapper[4922]: I1122 02:53:39.621098 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:39 crc kubenswrapper[4922]: I1122 02:53:39.621122 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:39 crc kubenswrapper[4922]: I1122 02:53:39.621140 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:39Z","lastTransitionTime":"2025-11-22T02:53:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:39 crc kubenswrapper[4922]: I1122 02:53:39.726044 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:39 crc kubenswrapper[4922]: I1122 02:53:39.726555 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:39 crc kubenswrapper[4922]: I1122 02:53:39.726807 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:39 crc kubenswrapper[4922]: I1122 02:53:39.727079 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:39 crc kubenswrapper[4922]: I1122 02:53:39.727311 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:39Z","lastTransitionTime":"2025-11-22T02:53:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Nov 22 02:53:39 crc kubenswrapper[4922]: I1122 02:53:39.830913 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:53:39 crc kubenswrapper[4922]: I1122 02:53:39.830973 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:53:39 crc kubenswrapper[4922]: I1122 02:53:39.830984 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:53:39 crc kubenswrapper[4922]: I1122 02:53:39.831007 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:53:39 crc kubenswrapper[4922]: I1122 02:53:39.831021 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:39Z","lastTransitionTime":"2025-11-22T02:53:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:53:39 crc kubenswrapper[4922]: I1122 02:53:39.934967 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:53:39 crc kubenswrapper[4922]: I1122 02:53:39.935018 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:53:39 crc kubenswrapper[4922]: I1122 02:53:39.935036 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:53:39 crc kubenswrapper[4922]: I1122 02:53:39.935064 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:53:39 crc kubenswrapper[4922]: I1122 02:53:39.935081 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:39Z","lastTransitionTime":"2025-11-22T02:53:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:53:40 crc kubenswrapper[4922]: I1122 02:53:40.038467 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:53:40 crc kubenswrapper[4922]: I1122 02:53:40.038550 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:53:40 crc kubenswrapper[4922]: I1122 02:53:40.038562 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:53:40 crc kubenswrapper[4922]: I1122 02:53:40.038577 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:53:40 crc kubenswrapper[4922]: I1122 02:53:40.038587 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:40Z","lastTransitionTime":"2025-11-22T02:53:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
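These NodeNotReady heartbeats keep repeating because the condition they carry never changes: /etc/kubernetes/cni/net.d/ still holds no CNI network configuration, consistent with the ovnkube-node pod reporting ovnkube-controller unready further down. A minimal Go sketch that checks the directory the way CNI config loaders conventionally do; the .conf/.conflist/.json extension set follows the usual libcni convention and is an assumption, not quoted from this log:

// cnicheck.go: list CNI network configs in the directory named by the
// NetworkPluginNotReady message above. Extension set is the customary
// libcni convention (.conf, .conflist, .json), assumed here.
package main

import (
	"fmt"
	"log"
	"os"
	"path/filepath"
)

func main() {
	dir := "/etc/kubernetes/cni/net.d" // taken from the log message
	entries, err := os.ReadDir(dir)
	if err != nil {
		log.Fatal(err)
	}
	found := 0
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			fmt.Println("config:", filepath.Join(dir, e.Name()))
			found++
		}
	}
	if found == 0 {
		fmt.Println("no CNI configuration file found; network plugin not ready")
	}
}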
Nov 22 02:53:40 crc kubenswrapper[4922]: I1122 02:53:40.141765 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:53:40 crc kubenswrapper[4922]: I1122 02:53:40.141816 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:53:40 crc kubenswrapper[4922]: I1122 02:53:40.141833 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:53:40 crc kubenswrapper[4922]: I1122 02:53:40.141888 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:53:40 crc kubenswrapper[4922]: I1122 02:53:40.141905 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:40Z","lastTransitionTime":"2025-11-22T02:53:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:53:40 crc kubenswrapper[4922]: I1122 02:53:40.244259 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:53:40 crc kubenswrapper[4922]: I1122 02:53:40.244562 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:53:40 crc kubenswrapper[4922]: I1122 02:53:40.244698 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:53:40 crc kubenswrapper[4922]: I1122 02:53:40.244792 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:53:40 crc kubenswrapper[4922]: I1122 02:53:40.244916 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:40Z","lastTransitionTime":"2025-11-22T02:53:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:53:40 crc kubenswrapper[4922]: I1122 02:53:40.299948 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 22 02:53:40 crc kubenswrapper[4922]: E1122 02:53:40.300549 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:53:40 crc kubenswrapper[4922]: I1122 02:53:40.350148 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:40 crc kubenswrapper[4922]: I1122 02:53:40.350203 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:40 crc kubenswrapper[4922]: I1122 02:53:40.350217 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:40 crc kubenswrapper[4922]: I1122 02:53:40.350235 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:40 crc kubenswrapper[4922]: I1122 02:53:40.350251 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:40Z","lastTransitionTime":"2025-11-22T02:53:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:40 crc kubenswrapper[4922]: I1122 02:53:40.452623 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:40 crc kubenswrapper[4922]: I1122 02:53:40.452668 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:40 crc kubenswrapper[4922]: I1122 02:53:40.452679 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:40 crc kubenswrapper[4922]: I1122 02:53:40.452695 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:40 crc kubenswrapper[4922]: I1122 02:53:40.452707 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:40Z","lastTransitionTime":"2025-11-22T02:53:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:40 crc kubenswrapper[4922]: I1122 02:53:40.556167 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:40 crc kubenswrapper[4922]: I1122 02:53:40.556238 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:40 crc kubenswrapper[4922]: I1122 02:53:40.556262 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:40 crc kubenswrapper[4922]: I1122 02:53:40.556290 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:40 crc kubenswrapper[4922]: I1122 02:53:40.556310 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:40Z","lastTransitionTime":"2025-11-22T02:53:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:40 crc kubenswrapper[4922]: I1122 02:53:40.660652 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:40 crc kubenswrapper[4922]: I1122 02:53:40.660764 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:40 crc kubenswrapper[4922]: I1122 02:53:40.660790 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:40 crc kubenswrapper[4922]: I1122 02:53:40.660830 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:40 crc kubenswrapper[4922]: I1122 02:53:40.660891 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:40Z","lastTransitionTime":"2025-11-22T02:53:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:40 crc kubenswrapper[4922]: I1122 02:53:40.763839 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:40 crc kubenswrapper[4922]: I1122 02:53:40.763931 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:40 crc kubenswrapper[4922]: I1122 02:53:40.763947 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:40 crc kubenswrapper[4922]: I1122 02:53:40.763970 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:40 crc kubenswrapper[4922]: I1122 02:53:40.763988 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:40Z","lastTransitionTime":"2025-11-22T02:53:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:40 crc kubenswrapper[4922]: I1122 02:53:40.866264 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:40 crc kubenswrapper[4922]: I1122 02:53:40.866346 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:40 crc kubenswrapper[4922]: I1122 02:53:40.866363 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:40 crc kubenswrapper[4922]: I1122 02:53:40.866388 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:40 crc kubenswrapper[4922]: I1122 02:53:40.866407 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:40Z","lastTransitionTime":"2025-11-22T02:53:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:40 crc kubenswrapper[4922]: I1122 02:53:40.969736 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:40 crc kubenswrapper[4922]: I1122 02:53:40.969800 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:40 crc kubenswrapper[4922]: I1122 02:53:40.969811 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:40 crc kubenswrapper[4922]: I1122 02:53:40.969827 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:40 crc kubenswrapper[4922]: I1122 02:53:40.969839 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:40Z","lastTransitionTime":"2025-11-22T02:53:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.062516 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.073883 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.073953 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.073972 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.074004 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.074024 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:41Z","lastTransitionTime":"2025-11-22T02:53:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.079792 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.087687 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e5537a6-a41f-446d-ab40-ec8c7a3edc90\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaf643c85775379271a26c718e36d87319342e6797522f91726ce4084925f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205de8f1aeca28c7877b83da70915cd216631f5d7c28b0c9f3bb748a8f98547e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7d163b34cfb34723c07cfa4592a134291185548d4294cb6fcd1d5d8ab63617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9608708be558bf83d1156f620534bd4c98b8f4fb7f96c972be8dc67524f4546c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:41Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.106841 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"402683b1-a29f-4a79-a36c-daf6e8068d0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cf0de2e7bcabf2c1beae2a37caaddb2de2365480082bd95a7843816ec8e3ceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d254b4b131f36e4aae
4019c29e8c53f3f8df991b85eb6f943fb6d2e9d79552f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9j6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:41Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.139810 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e0fc46c7e9bbed62a72d78126ea7ffcb4de50dc048c8c0fd90bd0b0d8c718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd701fe5abb8c240fb633df69fbda2e7fb1ac4c76195dae8dfd5b1db811c1d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://091e9164e566847ab0605abc3b8f32027bcccff041c48fbf251d28690dcc132d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5603a0776e71297b57ae6a438254d6d8abb317252617fbe3b28707e9f96424d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7854642894dce2837e59481c400522a24a2e66c3f9eb521a370e85b203288af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ceb46477edc2737cbdc850d6387ba175a07e85e69c988f91fc93def44fa1bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34d1b1289ea6a27a227e433d072bfe6e350bbb99
7ddbc88ceb665eaa37794581\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34d1b1289ea6a27a227e433d072bfe6e350bbb997ddbc88ceb665eaa37794581\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:53:36Z\\\",\\\"message\\\":\\\"lbConfig(nil)\\\\nI1122 02:53:36.370505 6577 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1122 02:53:36.370516 6577 services_controller.go:451] Built service openshift-ingress-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-ingress-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-ingress-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.244\\\\\\\", Port:9393, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1122 02:53:36.370543 6577 services_controller.go:452] Built service openshift-ingress-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1122 02:53:36.370578 6577 services_controller.go:453] Built service openshift-ingress-operator/metrics template LB for network=default: []services.LB{}\\\\nF1122 02:53:36.370585 6577 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-c7wvg_openshift-ovn-kubernetes(c2a6bcd8-bb13-463b-b112-0df3cf90b5f7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://183f71d77b7a1768eee27187f0c1e4a893f5661fa42cf671332e081118ae7dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c7wvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:41Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.158421 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8f0f91018d8ffcc35550e1e567a256d60f9cef7fb07a7c3c21d393e6ff5653f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:41Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.178290 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.178360 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.178386 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.178419 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.178442 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:41Z","lastTransitionTime":"2025-11-22T02:53:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.180908 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4gbc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"954bb7b8-d710-4e1a-973e-78c04e685f30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8705bc3652c9a3f838ffdc44bb26f87689d9eca09105427a3dc25b730967ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\
"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52z2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4gbc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:41Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.198303 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-df5ll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5672d56-8abd-4aa4-ac8d-1655896397f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a64d211a2e298509a2677a0d64bf8205f1a62e952ee786f34571d4c1e962b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klcgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-df5ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:41Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.215430 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2gmkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c8000a-a783-474f-a73a-55814c257a02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pk97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pk97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2gmkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:41Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.235489 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:41Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.252236 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:41Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.270531 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:41Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.281459 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.281621 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.281701 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.281795 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.281904 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:41Z","lastTransitionTime":"2025-11-22T02:53:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
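
The NodeNotReady condition recorded here has a different proximate cause than the webhook failures: the container runtime reports NetworkReady=false because no CNI network configuration exists yet under /etc/kubernetes/cni/net.d/ (that configuration is normally written once ovnkube-controller is up, and that container is crash-looping above). A small Go sketch of the readiness test the message implies, assuming the common libcni convention that .conf, .conflist, and .json files count as network configurations (an assumption, not the kubelet's exact implementation):

    // cnicheck.go - sketch: does any CNI network configuration exist in the
    // directory the kubelet names in its NotReady message?
    package main

    import (
        "fmt"
        "os"
        "path/filepath"
    )

    func main() {
        confDir := "/etc/kubernetes/cni/net.d" // directory named in the log
        entries, err := os.ReadDir(confDir)
        if err != nil {
            fmt.Println("read dir:", err)
            return
        }
        found := false
        for _, e := range entries {
            switch filepath.Ext(e.Name()) {
            case ".conf", ".conflist", ".json": // assumed libcni-style extensions
                fmt.Println("found CNI config:", filepath.Join(confDir, e.Name()))
                found = true
            }
        }
        if !found {
            fmt.Println("no CNI configuration file in", confDir) // the log's condition
        }
    }
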
Has your network provider started?"} Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.288319 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n664h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"826eb92e-c839-4fba-9737-3b52101fec88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccfd5290a47556c0ac2088ad8d34ab89870aec1d03311431a50dd857a43e5dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0125b85a532a9445c19a61f7bdd47ccd48ee48f067df5c041c6bf98daae9630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0125b85a532a9445c19a61f7bdd47ccd48ee48f067df5c041c6bf98daae9630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n664h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:41Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.299782 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.299923 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2gmkj" Nov 22 02:53:41 crc kubenswrapper[4922]: E1122 02:53:41.299966 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.299922 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:53:41 crc kubenswrapper[4922]: E1122 02:53:41.300082 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2gmkj" podUID="d5c8000a-a783-474f-a73a-55814c257a02" Nov 22 02:53:41 crc kubenswrapper[4922]: E1122 02:53:41.300275 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.304105 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v8z92" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da6031ac-2dd7-43c9-b9e2-4c6394e23c1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9c2863c3dce12e19530e9ec5d083788e8426a5f34a325c4a8b5dcbb29b454ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7cbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a6ba782e23e8c6a485ec4c29d9adefe33028022db04c1a66d20e49a73c40a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7cbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:21Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v8z92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:41Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.328759 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8552c75b-ff86-4366-ba7d-31f454e4c4d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095da4ba1e99f9564ae3e7c95e0319b9b03a655b824f73a40f378292bd16a613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c36d97db0f554e895c9a66ab9d98a846dab9ea7fdc6fb1694a1e6ba6e8e03caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5806c02c9ab10a6ae71658c5c26a959a0c1af4dc116eebb10456e842e18bc265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e578ec32c48afdd4b1beffcba79284416d5d388027f807e195d552777eb3ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7d8908fd03ef5111feb365e6882679f026d46d0b404e585b0fb248b3ffccc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c687744
1ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:41Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.345677 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"912136b8-9119-4585-8a8a-fffffe458b82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1364d5e8967ee57d3fe7c3ca85053dbeee65956324a6ff45f53339ee6a380e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1b62ea65c78bb4f19f5fdb2ebed6da4b5554981906058d4d33472e53eecebe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa36ff8bd81008d118d746d10130faac5df3ffdd243309f12b2444acf8e0ac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fe422173d9c9619e9ccbf14fa6b45cd384fb3b32973aa7c32b7597d23818ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1ba37f0d76415fadbf5fe5c3e7b0d0ac1e04923deb445e3ffebaa45f9a3284d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:52:51Z\\\",\\\"message\\\":\\\"W1122 02:52:50.548479 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 02:52:50.548907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763779970 cert, and key in /tmp/serving-cert-1581887030/serving-signer.crt, /tmp/serving-cert-1581887030/serving-signer.key\\\\nI1122 02:52:50.747646 1 observer_polling.go:159] Starting file observer\\\\nW1122 02:52:50.754705 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 02:52:50.754978 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:52:50.756448 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1581887030/tls.crt::/tmp/serving-cert-1581887030/tls.key\\\\\\\"\\\\nF1122 02:52:51.071644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://892025699a91526b7be50d611c0c47a78d080b10f5001683ba5b6614e1168759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:41Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.363303 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c46a871bcc893c29dd8f87750ec890ac870f3f59615c5951f473288a19b460c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:41Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.376428 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ntq2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"087141fd-fc9b-4685-bada-20260ee96369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb4eafebbd378af8579bc424a11b51eeda38398db994f81c408b9bead685713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4fck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ntq2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:41Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.385093 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.385152 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.385163 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.385183 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.385197 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:41Z","lastTransitionTime":"2025-11-22T02:53:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.392169 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a581af85a69caa65abe5f7766482dd0b71eaf201d43e354b8316ff616c865193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2d349f9466eb418a64dabae95c139bf23e943909a37b6b651c90e85ff55bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:41Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.488840 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.488921 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.488936 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.488961 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.488978 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:41Z","lastTransitionTime":"2025-11-22T02:53:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.591687 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.591798 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.591822 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.591885 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.591906 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:41Z","lastTransitionTime":"2025-11-22T02:53:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.695051 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.695117 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.695128 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.695146 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.695158 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:41Z","lastTransitionTime":"2025-11-22T02:53:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.799493 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.799584 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.799604 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.799636 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.799660 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:41Z","lastTransitionTime":"2025-11-22T02:53:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.903052 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.903136 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.903156 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.903187 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:41 crc kubenswrapper[4922]: I1122 02:53:41.903209 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:41Z","lastTransitionTime":"2025-11-22T02:53:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:42 crc kubenswrapper[4922]: I1122 02:53:42.006781 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:42 crc kubenswrapper[4922]: I1122 02:53:42.006905 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:42 crc kubenswrapper[4922]: I1122 02:53:42.006924 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:42 crc kubenswrapper[4922]: I1122 02:53:42.006953 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:42 crc kubenswrapper[4922]: I1122 02:53:42.006974 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:42Z","lastTransitionTime":"2025-11-22T02:53:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:42 crc kubenswrapper[4922]: I1122 02:53:42.109651 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:42 crc kubenswrapper[4922]: I1122 02:53:42.109707 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:42 crc kubenswrapper[4922]: I1122 02:53:42.109724 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:42 crc kubenswrapper[4922]: I1122 02:53:42.109749 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:42 crc kubenswrapper[4922]: I1122 02:53:42.109766 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:42Z","lastTransitionTime":"2025-11-22T02:53:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:42 crc kubenswrapper[4922]: I1122 02:53:42.213338 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:42 crc kubenswrapper[4922]: I1122 02:53:42.213426 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:42 crc kubenswrapper[4922]: I1122 02:53:42.213450 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:42 crc kubenswrapper[4922]: I1122 02:53:42.213480 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:42 crc kubenswrapper[4922]: I1122 02:53:42.213499 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:42Z","lastTransitionTime":"2025-11-22T02:53:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:42 crc kubenswrapper[4922]: I1122 02:53:42.299932 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:53:42 crc kubenswrapper[4922]: E1122 02:53:42.300173 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:53:42 crc kubenswrapper[4922]: I1122 02:53:42.316330 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:42 crc kubenswrapper[4922]: I1122 02:53:42.316398 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:42 crc kubenswrapper[4922]: I1122 02:53:42.316416 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:42 crc kubenswrapper[4922]: I1122 02:53:42.316439 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:42 crc kubenswrapper[4922]: I1122 02:53:42.316456 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:42Z","lastTransitionTime":"2025-11-22T02:53:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:42 crc kubenswrapper[4922]: I1122 02:53:42.420546 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:42 crc kubenswrapper[4922]: I1122 02:53:42.420601 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:42 crc kubenswrapper[4922]: I1122 02:53:42.420618 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:42 crc kubenswrapper[4922]: I1122 02:53:42.420640 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:42 crc kubenswrapper[4922]: I1122 02:53:42.420684 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:42Z","lastTransitionTime":"2025-11-22T02:53:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:42 crc kubenswrapper[4922]: I1122 02:53:42.524519 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:42 crc kubenswrapper[4922]: I1122 02:53:42.524600 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:42 crc kubenswrapper[4922]: I1122 02:53:42.524620 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:42 crc kubenswrapper[4922]: I1122 02:53:42.524651 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:42 crc kubenswrapper[4922]: I1122 02:53:42.524671 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:42Z","lastTransitionTime":"2025-11-22T02:53:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:42 crc kubenswrapper[4922]: I1122 02:53:42.628018 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:42 crc kubenswrapper[4922]: I1122 02:53:42.628097 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:42 crc kubenswrapper[4922]: I1122 02:53:42.628118 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:42 crc kubenswrapper[4922]: I1122 02:53:42.628148 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:42 crc kubenswrapper[4922]: I1122 02:53:42.628171 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:42Z","lastTransitionTime":"2025-11-22T02:53:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:42 crc kubenswrapper[4922]: I1122 02:53:42.731136 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:42 crc kubenswrapper[4922]: I1122 02:53:42.731201 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:42 crc kubenswrapper[4922]: I1122 02:53:42.731215 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:42 crc kubenswrapper[4922]: I1122 02:53:42.731237 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:42 crc kubenswrapper[4922]: I1122 02:53:42.731256 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:42Z","lastTransitionTime":"2025-11-22T02:53:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:42 crc kubenswrapper[4922]: I1122 02:53:42.833453 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:42 crc kubenswrapper[4922]: I1122 02:53:42.833518 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:42 crc kubenswrapper[4922]: I1122 02:53:42.833536 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:42 crc kubenswrapper[4922]: I1122 02:53:42.833560 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:42 crc kubenswrapper[4922]: I1122 02:53:42.833580 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:42Z","lastTransitionTime":"2025-11-22T02:53:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:42 crc kubenswrapper[4922]: I1122 02:53:42.936686 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:42 crc kubenswrapper[4922]: I1122 02:53:42.936743 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:42 crc kubenswrapper[4922]: I1122 02:53:42.936753 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:42 crc kubenswrapper[4922]: I1122 02:53:42.936770 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:42 crc kubenswrapper[4922]: I1122 02:53:42.936783 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:42Z","lastTransitionTime":"2025-11-22T02:53:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:43 crc kubenswrapper[4922]: I1122 02:53:43.040160 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:43 crc kubenswrapper[4922]: I1122 02:53:43.040214 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:43 crc kubenswrapper[4922]: I1122 02:53:43.040225 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:43 crc kubenswrapper[4922]: I1122 02:53:43.040243 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:43 crc kubenswrapper[4922]: I1122 02:53:43.040254 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:43Z","lastTransitionTime":"2025-11-22T02:53:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:43 crc kubenswrapper[4922]: I1122 02:53:43.144092 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:43 crc kubenswrapper[4922]: I1122 02:53:43.144166 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:43 crc kubenswrapper[4922]: I1122 02:53:43.144187 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:43 crc kubenswrapper[4922]: I1122 02:53:43.144216 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:43 crc kubenswrapper[4922]: I1122 02:53:43.144236 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:43Z","lastTransitionTime":"2025-11-22T02:53:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:43 crc kubenswrapper[4922]: I1122 02:53:43.248077 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:43 crc kubenswrapper[4922]: I1122 02:53:43.248154 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:43 crc kubenswrapper[4922]: I1122 02:53:43.248173 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:43 crc kubenswrapper[4922]: I1122 02:53:43.248202 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:43 crc kubenswrapper[4922]: I1122 02:53:43.248223 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:43Z","lastTransitionTime":"2025-11-22T02:53:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:43 crc kubenswrapper[4922]: I1122 02:53:43.300456 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:53:43 crc kubenswrapper[4922]: I1122 02:53:43.300544 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:53:43 crc kubenswrapper[4922]: E1122 02:53:43.300663 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:53:43 crc kubenswrapper[4922]: I1122 02:53:43.300682 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2gmkj" Nov 22 02:53:43 crc kubenswrapper[4922]: E1122 02:53:43.300932 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:53:43 crc kubenswrapper[4922]: E1122 02:53:43.301109 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2gmkj" podUID="d5c8000a-a783-474f-a73a-55814c257a02" Nov 22 02:53:43 crc kubenswrapper[4922]: I1122 02:53:43.351073 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:43 crc kubenswrapper[4922]: I1122 02:53:43.351186 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:43 crc kubenswrapper[4922]: I1122 02:53:43.351207 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:43 crc kubenswrapper[4922]: I1122 02:53:43.351235 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:43 crc kubenswrapper[4922]: I1122 02:53:43.351259 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:43Z","lastTransitionTime":"2025-11-22T02:53:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:43 crc kubenswrapper[4922]: I1122 02:53:43.454908 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:43 crc kubenswrapper[4922]: I1122 02:53:43.455002 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:43 crc kubenswrapper[4922]: I1122 02:53:43.455022 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:43 crc kubenswrapper[4922]: I1122 02:53:43.455053 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:43 crc kubenswrapper[4922]: I1122 02:53:43.455073 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:43Z","lastTransitionTime":"2025-11-22T02:53:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:43 crc kubenswrapper[4922]: I1122 02:53:43.559064 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:43 crc kubenswrapper[4922]: I1122 02:53:43.559128 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:43 crc kubenswrapper[4922]: I1122 02:53:43.559147 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:43 crc kubenswrapper[4922]: I1122 02:53:43.559191 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:43 crc kubenswrapper[4922]: I1122 02:53:43.559211 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:43Z","lastTransitionTime":"2025-11-22T02:53:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:43 crc kubenswrapper[4922]: I1122 02:53:43.662605 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:43 crc kubenswrapper[4922]: I1122 02:53:43.662693 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:43 crc kubenswrapper[4922]: I1122 02:53:43.662719 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:43 crc kubenswrapper[4922]: I1122 02:53:43.662758 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:43 crc kubenswrapper[4922]: I1122 02:53:43.662780 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:43Z","lastTransitionTime":"2025-11-22T02:53:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:43 crc kubenswrapper[4922]: I1122 02:53:43.767407 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:43 crc kubenswrapper[4922]: I1122 02:53:43.767494 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:43 crc kubenswrapper[4922]: I1122 02:53:43.767692 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:43 crc kubenswrapper[4922]: I1122 02:53:43.767725 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:43 crc kubenswrapper[4922]: I1122 02:53:43.767753 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:43Z","lastTransitionTime":"2025-11-22T02:53:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:43 crc kubenswrapper[4922]: I1122 02:53:43.873491 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:43 crc kubenswrapper[4922]: I1122 02:53:43.873592 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:43 crc kubenswrapper[4922]: I1122 02:53:43.873618 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:43 crc kubenswrapper[4922]: I1122 02:53:43.873650 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:43 crc kubenswrapper[4922]: I1122 02:53:43.873670 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:43Z","lastTransitionTime":"2025-11-22T02:53:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:43 crc kubenswrapper[4922]: I1122 02:53:43.978303 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:43 crc kubenswrapper[4922]: I1122 02:53:43.978423 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:43 crc kubenswrapper[4922]: I1122 02:53:43.978452 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:43 crc kubenswrapper[4922]: I1122 02:53:43.978486 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:43 crc kubenswrapper[4922]: I1122 02:53:43.978510 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:43Z","lastTransitionTime":"2025-11-22T02:53:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.082591 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.082657 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.082677 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.082709 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.082733 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:44Z","lastTransitionTime":"2025-11-22T02:53:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.186115 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.186180 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.186199 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.186227 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.186247 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:44Z","lastTransitionTime":"2025-11-22T02:53:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.289360 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.289426 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.289445 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.289469 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.289708 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:44Z","lastTransitionTime":"2025-11-22T02:53:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.299918 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:53:44 crc kubenswrapper[4922]: E1122 02:53:44.300151 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.393190 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.393267 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.393286 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.393317 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.393336 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:44Z","lastTransitionTime":"2025-11-22T02:53:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.497710 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.497783 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.497800 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.497827 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.497917 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:44Z","lastTransitionTime":"2025-11-22T02:53:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.601376 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.601432 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.601444 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.601465 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.601479 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:44Z","lastTransitionTime":"2025-11-22T02:53:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.705170 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.705233 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.705257 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.705291 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.705313 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:44Z","lastTransitionTime":"2025-11-22T02:53:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.809794 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.810062 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.810088 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.810122 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.810149 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:44Z","lastTransitionTime":"2025-11-22T02:53:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.867708 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.867788 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.867809 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.867903 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.867943 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:44Z","lastTransitionTime":"2025-11-22T02:53:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:44 crc kubenswrapper[4922]: E1122 02:53:44.885768 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e949e6da-04e3-4086-acd2-53671701d8c1\\\",\\\"systemUUID\\\":\\\"7e87d562-50ca-4b7a-8b5f-35220f4abd2d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:44Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.892141 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.892220 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.892239 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.892271 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.892292 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:44Z","lastTransitionTime":"2025-11-22T02:53:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:44 crc kubenswrapper[4922]: E1122 02:53:44.914132 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e949e6da-04e3-4086-acd2-53671701d8c1\\\",\\\"systemUUID\\\":\\\"7e87d562-50ca-4b7a-8b5f-35220f4abd2d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:44Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.921408 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.921490 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.921511 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.921542 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.921564 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:44Z","lastTransitionTime":"2025-11-22T02:53:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:44 crc kubenswrapper[4922]: E1122 02:53:44.944113 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e949e6da-04e3-4086-acd2-53671701d8c1\\\",\\\"systemUUID\\\":\\\"7e87d562-50ca-4b7a-8b5f-35220f4abd2d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:44Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.950238 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.950312 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.950331 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.950361 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.950383 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:44Z","lastTransitionTime":"2025-11-22T02:53:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:44 crc kubenswrapper[4922]: E1122 02:53:44.973319 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e949e6da-04e3-4086-acd2-53671701d8c1\\\",\\\"systemUUID\\\":\\\"7e87d562-50ca-4b7a-8b5f-35220f4abd2d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:44Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.980407 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.980505 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.980526 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.980553 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:44 crc kubenswrapper[4922]: I1122 02:53:44.980571 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:44Z","lastTransitionTime":"2025-11-22T02:53:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:45 crc kubenswrapper[4922]: E1122 02:53:45.003214 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e949e6da-04e3-4086-acd2-53671701d8c1\\\",\\\"systemUUID\\\":\\\"7e87d562-50ca-4b7a-8b5f-35220f4abd2d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:44Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:45 crc kubenswrapper[4922]: E1122 02:53:45.003448 4922 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.006431 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.006485 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.006503 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.006532 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.006554 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:45Z","lastTransitionTime":"2025-11-22T02:53:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.109971 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.110036 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.110053 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.110075 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.110090 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:45Z","lastTransitionTime":"2025-11-22T02:53:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.214387 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.214451 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.214468 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.214494 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.214514 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:45Z","lastTransitionTime":"2025-11-22T02:53:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.300140 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:53:45 crc kubenswrapper[4922]: E1122 02:53:45.300370 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.300471 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2gmkj" Nov 22 02:53:45 crc kubenswrapper[4922]: E1122 02:53:45.300736 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2gmkj" podUID="d5c8000a-a783-474f-a73a-55814c257a02" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.300966 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:53:45 crc kubenswrapper[4922]: E1122 02:53:45.301160 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.317652 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.317712 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.317735 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.317766 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.317789 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:45Z","lastTransitionTime":"2025-11-22T02:53:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.329319 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"912136b8-9119-4585-8a8a-fffffe458b82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1364d5e8967ee57d3fe7c3ca85053dbeee65956324a6ff45f53339ee6a380e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1b62ea65c78bb4f19f5fdb2ebed6da4b5554981906058d4d33472e53eecebe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa36ff8bd81008d118d746d10130faac5df3ffdd243309f12b2444acf8e0ac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fe422173d9c9619e9ccbf14fa6b45cd384fb3b32973aa7c32b7597d23818ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1ba37f0d76415fadbf5fe5c3e7b0d0ac1e04923deb445e3ffebaa45f9a3284d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:52:51Z\\\",\\\"message\\\":\\\"W1122 02:52:50.548479 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 02:52:50.548907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763779970 cert, and key in /tmp/serving-cert-1581887030/serving-signer.crt, /tmp/serving-cert-1581887030/serving-signer.key\\\\nI1122 02:52:50.747646 1 observer_polling.go:159] Starting file observer\\\\nW1122 02:52:50.754705 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 02:52:50.754978 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:52:50.756448 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1581887030/tls.crt::/tmp/serving-cert-1581887030/tls.key\\\\\\\"\\\\nF1122 02:52:51.071644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://892025699a91526b7be50d611c0c47a78d080b10f5001683ba5b6614e1168759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:45Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.355549 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c46a871bcc893c29dd8f87750ec890ac870f3f59615c5951f473288a19b460c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:45Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.372020 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ntq2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"087141fd-fc9b-4685-bada-20260ee96369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb4eafebbd378af8579bc424a11b51eeda38398db994f81c408b9bead685713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4fck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ntq2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:45Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.396954 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n664h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"826eb92e-c839-4fba-9737-3b52101fec88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccfd5290a47556c0ac2088ad8d34ab89870aec1d03311431a50dd857a43e5dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0125b85a532a9445c19a61f7bdd47ccd48ee48f067df5c041c6bf98daae9630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0125b85a532a9445c19a61f7bdd47ccd48ee48f067df5c041c6bf98daae9630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n664h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:45Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.414645 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v8z92" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da6031ac-2dd7-43c9-b9e2-4c6394e23c1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9c2863c3dce12e19530e9ec5d083788e8426a5f34a325c4a8b5dcbb29b454ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7cbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a6ba782e23e8c6a485ec4c29d9adefe33028022db04c1a66d20e49a73c40a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7cbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v8z92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:45Z is after 2025-08-24T17:21:41Z" Nov 22 
02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.421229 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.421282 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.421297 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.421319 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.421336 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:45Z","lastTransitionTime":"2025-11-22T02:53:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.446046 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8552c75b-ff86-4366-ba7d-31f454e4c4d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095da4ba1e99f9564ae3e7c95e0319b9b03a655b824f73a40f378292bd16a613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c36d97db0f554e895c9a66ab9d98a846dab9ea7fdc6fb1694a1e6ba6e8e03caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9009
2272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5806c02c9ab10a6ae71658c5c26a959a0c1af4dc116eebb10456e842e18bc265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e578ec32c48afdd4b1beffcba79284416d5d388027f807e195d552777eb3ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7d8908fd03ef5111feb365e6882679f026d46d0b404e585b0fb248b3ffccc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:45Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.470065 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a581af85a69caa65abe5f7766482dd0b71eaf201d43e354b8316ff616c865193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2d349f9466eb418a64dabae95c139bf23e943909a37b6b651c90e85ff55bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:45Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.485381 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"402683b1-a29f-4a79-a36c-daf6e8068d0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cf0de2e7bcabf2c1beae2a37caaddb2de2365480082bd95a7843816ec8e3ceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d254b4b131f36e4aae4019c29e8c53f3f8df991b85eb6f943fb6d2e9d79552f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9j6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:45Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.509275 4922 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e0fc46c7e9bbed62a72d78126ea7ffcb4de50dc048c8c0fd90bd0b0d8c718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd701fe5abb8c240fb633df69fbda2e7fb1ac4c76195dae8dfd5b1db811c1d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://091e9164e566847ab0605abc3b8f32027bcccff041c48fbf251d28690dcc132d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5603a0776e71297b57ae6a438254d6d8abb317252617fbe3b28707e9f96424d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7854642894dce2837e59481c400522a24a2e66c3f9eb521a370e85b203288af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ceb46477edc2737cbdc850d6387ba175a07e85e69c988f91fc93def44fa1bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34d1b1289ea6a27a227e433d072bfe6e350bbb997ddbc88ceb665eaa37794581\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34d1b1289ea6a27a227e433d072bfe6e350bbb997ddbc88ceb665eaa37794581\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:53:36Z\\\",\\\"message\\\":\\\"lbConfig(nil)\\\\nI1122 02:53:36.370505 6577 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1122 02:53:36.370516 6577 services_controller.go:451] Built service openshift-ingress-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-ingress-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-ingress-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.244\\\\\\\", Port:9393, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1122 02:53:36.370543 6577 services_controller.go:452] Built service openshift-ingress-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1122 02:53:36.370578 6577 services_controller.go:453] Built service openshift-ingress-operator/metrics template LB for network=default: []services.LB{}\\\\nF1122 02:53:36.370585 6577 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-c7wvg_openshift-ovn-kubernetes(c2a6bcd8-bb13-463b-b112-0df3cf90b5f7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://183f71d77b7a1768eee27187f0c1e4a893f5661fa42cf671332e081118ae7dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c7wvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:45Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.524984 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.525035 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.525047 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.525066 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.525078 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:45Z","lastTransitionTime":"2025-11-22T02:53:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.531114 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e5537a6-a41f-446d-ab40-ec8c7a3edc90\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaf643c85775379271a26c718e36d87319342e6797522f91726ce4084925f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205de8f1aeca28c7877b83da70915cd216631f5d7c28b0c9f3bb748a8f98547e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7d163b34cfb34723c07cfa4592a134291185548d4294cb6fcd1d5d8ab63617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9608708be558bf83d1156f620534bd4c98b8f4fb7f96c972be8dc67524f4546c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:45Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.551261 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:45Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.566252 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:45Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.581018 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:45Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.597543 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8f0f91018d8ffcc35550e1e567a256d60f9cef7fb07a7c3c21d393e6ff5653f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:45Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.614246 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4gbc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"954bb7b8-d710-4e1a-973e-78c04e685f30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8705bc3652c9a3f838ffdc44bb26f87689d9eca09105427a3dc25b730967ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52z2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4gbc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:45Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.627563 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-df5ll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5672d56-8abd-4aa4-ac8d-1655896397f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a64d211a2e298509a2677a0d64bf8205f1a62e952ee786f34571d4c1e962b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klcgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-df5ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:45Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.627829 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.627900 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.627913 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:45 crc 
kubenswrapper[4922]: I1122 02:53:45.627933 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.627945 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:45Z","lastTransitionTime":"2025-11-22T02:53:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.642824 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2gmkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c8000a-a783-474f-a73a-55814c257a02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pk97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pk97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:23Z\\\"}}\" for pod 
\"openshift-multus\"/\"network-metrics-daemon-2gmkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:45Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.662198 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0925b79-c7d7-4e10-a883-978c4c4f4aca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9273d5ec12281398617de471c700390678e37a0f25a8f419af589821bbbf82cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b7c6dc11edfeb5cd97d044eba3210a471efebefaf19e244347e856860544e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa47eebd3d997c52252492bb21d0357a5bcda89c3a47713b4d55c5a4e2117ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a43e447ea1ef6069640894c9f4a841797e8ea7a34d6df8d76f9958755e54d5b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a43e447ea1ef6069640894c9f4a841797e8ea7a34d6df8d76f9958755e54d5b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:45Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.731212 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.731264 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.731292 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.731311 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.731323 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:45Z","lastTransitionTime":"2025-11-22T02:53:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.834962 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.835030 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.835050 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.835079 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.835099 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:45Z","lastTransitionTime":"2025-11-22T02:53:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.938824 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.938934 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.938952 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.938980 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:45 crc kubenswrapper[4922]: I1122 02:53:45.939000 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:45Z","lastTransitionTime":"2025-11-22T02:53:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:46 crc kubenswrapper[4922]: I1122 02:53:46.042399 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:46 crc kubenswrapper[4922]: I1122 02:53:46.042476 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:46 crc kubenswrapper[4922]: I1122 02:53:46.042496 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:46 crc kubenswrapper[4922]: I1122 02:53:46.042528 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:46 crc kubenswrapper[4922]: I1122 02:53:46.042549 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:46Z","lastTransitionTime":"2025-11-22T02:53:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:46 crc kubenswrapper[4922]: I1122 02:53:46.145984 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:46 crc kubenswrapper[4922]: I1122 02:53:46.146052 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:46 crc kubenswrapper[4922]: I1122 02:53:46.146067 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:46 crc kubenswrapper[4922]: I1122 02:53:46.146089 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:46 crc kubenswrapper[4922]: I1122 02:53:46.146106 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:46Z","lastTransitionTime":"2025-11-22T02:53:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:46 crc kubenswrapper[4922]: I1122 02:53:46.248966 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:46 crc kubenswrapper[4922]: I1122 02:53:46.249028 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:46 crc kubenswrapper[4922]: I1122 02:53:46.249042 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:46 crc kubenswrapper[4922]: I1122 02:53:46.249066 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:46 crc kubenswrapper[4922]: I1122 02:53:46.249082 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:46Z","lastTransitionTime":"2025-11-22T02:53:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:46 crc kubenswrapper[4922]: I1122 02:53:46.299501 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:53:46 crc kubenswrapper[4922]: E1122 02:53:46.299696 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:53:46 crc kubenswrapper[4922]: I1122 02:53:46.351600 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:46 crc kubenswrapper[4922]: I1122 02:53:46.351672 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:46 crc kubenswrapper[4922]: I1122 02:53:46.351691 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:46 crc kubenswrapper[4922]: I1122 02:53:46.351716 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:46 crc kubenswrapper[4922]: I1122 02:53:46.351738 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:46Z","lastTransitionTime":"2025-11-22T02:53:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:46 crc kubenswrapper[4922]: I1122 02:53:46.455398 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:46 crc kubenswrapper[4922]: I1122 02:53:46.455473 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:46 crc kubenswrapper[4922]: I1122 02:53:46.455494 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:46 crc kubenswrapper[4922]: I1122 02:53:46.455522 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:46 crc kubenswrapper[4922]: I1122 02:53:46.455542 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:46Z","lastTransitionTime":"2025-11-22T02:53:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:46 crc kubenswrapper[4922]: I1122 02:53:46.558075 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:46 crc kubenswrapper[4922]: I1122 02:53:46.558148 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:46 crc kubenswrapper[4922]: I1122 02:53:46.558167 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:46 crc kubenswrapper[4922]: I1122 02:53:46.558194 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:46 crc kubenswrapper[4922]: I1122 02:53:46.558214 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:46Z","lastTransitionTime":"2025-11-22T02:53:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:46 crc kubenswrapper[4922]: I1122 02:53:46.661219 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:46 crc kubenswrapper[4922]: I1122 02:53:46.661303 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:46 crc kubenswrapper[4922]: I1122 02:53:46.661322 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:46 crc kubenswrapper[4922]: I1122 02:53:46.661354 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:46 crc kubenswrapper[4922]: I1122 02:53:46.661374 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:46Z","lastTransitionTime":"2025-11-22T02:53:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:46 crc kubenswrapper[4922]: I1122 02:53:46.764469 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:46 crc kubenswrapper[4922]: I1122 02:53:46.764526 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:46 crc kubenswrapper[4922]: I1122 02:53:46.764542 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:46 crc kubenswrapper[4922]: I1122 02:53:46.764568 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:46 crc kubenswrapper[4922]: I1122 02:53:46.764585 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:46Z","lastTransitionTime":"2025-11-22T02:53:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:46 crc kubenswrapper[4922]: I1122 02:53:46.868419 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:46 crc kubenswrapper[4922]: I1122 02:53:46.868493 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:46 crc kubenswrapper[4922]: I1122 02:53:46.868511 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:46 crc kubenswrapper[4922]: I1122 02:53:46.868536 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:46 crc kubenswrapper[4922]: I1122 02:53:46.868556 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:46Z","lastTransitionTime":"2025-11-22T02:53:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:46 crc kubenswrapper[4922]: I1122 02:53:46.972397 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:46 crc kubenswrapper[4922]: I1122 02:53:46.972996 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:46 crc kubenswrapper[4922]: I1122 02:53:46.973233 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:46 crc kubenswrapper[4922]: I1122 02:53:46.973434 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:46 crc kubenswrapper[4922]: I1122 02:53:46.973655 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:46Z","lastTransitionTime":"2025-11-22T02:53:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:47 crc kubenswrapper[4922]: I1122 02:53:47.083441 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:47 crc kubenswrapper[4922]: I1122 02:53:47.083529 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:47 crc kubenswrapper[4922]: I1122 02:53:47.083551 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:47 crc kubenswrapper[4922]: I1122 02:53:47.083585 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:47 crc kubenswrapper[4922]: I1122 02:53:47.083608 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:47Z","lastTransitionTime":"2025-11-22T02:53:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:47 crc kubenswrapper[4922]: I1122 02:53:47.187838 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:47 crc kubenswrapper[4922]: I1122 02:53:47.187973 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:47 crc kubenswrapper[4922]: I1122 02:53:47.188001 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:47 crc kubenswrapper[4922]: I1122 02:53:47.188037 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:47 crc kubenswrapper[4922]: I1122 02:53:47.188103 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:47Z","lastTransitionTime":"2025-11-22T02:53:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:47 crc kubenswrapper[4922]: I1122 02:53:47.291462 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:47 crc kubenswrapper[4922]: I1122 02:53:47.291503 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:47 crc kubenswrapper[4922]: I1122 02:53:47.291511 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:47 crc kubenswrapper[4922]: I1122 02:53:47.291531 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:47 crc kubenswrapper[4922]: I1122 02:53:47.291540 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:47Z","lastTransitionTime":"2025-11-22T02:53:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:47 crc kubenswrapper[4922]: I1122 02:53:47.299881 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2gmkj" Nov 22 02:53:47 crc kubenswrapper[4922]: E1122 02:53:47.299984 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2gmkj" podUID="d5c8000a-a783-474f-a73a-55814c257a02" Nov 22 02:53:47 crc kubenswrapper[4922]: I1122 02:53:47.300119 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:53:47 crc kubenswrapper[4922]: I1122 02:53:47.300221 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:53:47 crc kubenswrapper[4922]: E1122 02:53:47.300423 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:53:47 crc kubenswrapper[4922]: E1122 02:53:47.300567 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:53:47 crc kubenswrapper[4922]: I1122 02:53:47.394195 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:47 crc kubenswrapper[4922]: I1122 02:53:47.394274 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:47 crc kubenswrapper[4922]: I1122 02:53:47.394293 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:47 crc kubenswrapper[4922]: I1122 02:53:47.394324 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:47 crc kubenswrapper[4922]: I1122 02:53:47.394348 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:47Z","lastTransitionTime":"2025-11-22T02:53:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:47 crc kubenswrapper[4922]: I1122 02:53:47.497657 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:47 crc kubenswrapper[4922]: I1122 02:53:47.497733 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:47 crc kubenswrapper[4922]: I1122 02:53:47.497753 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:47 crc kubenswrapper[4922]: I1122 02:53:47.497783 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:47 crc kubenswrapper[4922]: I1122 02:53:47.497805 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:47Z","lastTransitionTime":"2025-11-22T02:53:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:47 crc kubenswrapper[4922]: I1122 02:53:47.601413 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:47 crc kubenswrapper[4922]: I1122 02:53:47.601516 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:47 crc kubenswrapper[4922]: I1122 02:53:47.601545 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:47 crc kubenswrapper[4922]: I1122 02:53:47.601574 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:47 crc kubenswrapper[4922]: I1122 02:53:47.601598 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:47Z","lastTransitionTime":"2025-11-22T02:53:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:47 crc kubenswrapper[4922]: I1122 02:53:47.704672 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:47 crc kubenswrapper[4922]: I1122 02:53:47.704715 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:47 crc kubenswrapper[4922]: I1122 02:53:47.704727 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:47 crc kubenswrapper[4922]: I1122 02:53:47.704744 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:47 crc kubenswrapper[4922]: I1122 02:53:47.704759 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:47Z","lastTransitionTime":"2025-11-22T02:53:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:47 crc kubenswrapper[4922]: I1122 02:53:47.807734 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:47 crc kubenswrapper[4922]: I1122 02:53:47.807781 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:47 crc kubenswrapper[4922]: I1122 02:53:47.807800 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:47 crc kubenswrapper[4922]: I1122 02:53:47.807823 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:47 crc kubenswrapper[4922]: I1122 02:53:47.807841 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:47Z","lastTransitionTime":"2025-11-22T02:53:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:47 crc kubenswrapper[4922]: I1122 02:53:47.911487 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:47 crc kubenswrapper[4922]: I1122 02:53:47.911538 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:47 crc kubenswrapper[4922]: I1122 02:53:47.911555 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:47 crc kubenswrapper[4922]: I1122 02:53:47.911578 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:47 crc kubenswrapper[4922]: I1122 02:53:47.911599 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:47Z","lastTransitionTime":"2025-11-22T02:53:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:48 crc kubenswrapper[4922]: I1122 02:53:48.015557 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:48 crc kubenswrapper[4922]: I1122 02:53:48.015647 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:48 crc kubenswrapper[4922]: I1122 02:53:48.015674 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:48 crc kubenswrapper[4922]: I1122 02:53:48.015709 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:48 crc kubenswrapper[4922]: I1122 02:53:48.015738 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:48Z","lastTransitionTime":"2025-11-22T02:53:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:48 crc kubenswrapper[4922]: I1122 02:53:48.119042 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:48 crc kubenswrapper[4922]: I1122 02:53:48.119145 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:48 crc kubenswrapper[4922]: I1122 02:53:48.119170 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:48 crc kubenswrapper[4922]: I1122 02:53:48.119205 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:48 crc kubenswrapper[4922]: I1122 02:53:48.119228 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:48Z","lastTransitionTime":"2025-11-22T02:53:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:48 crc kubenswrapper[4922]: I1122 02:53:48.223009 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:48 crc kubenswrapper[4922]: I1122 02:53:48.223052 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:48 crc kubenswrapper[4922]: I1122 02:53:48.223063 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:48 crc kubenswrapper[4922]: I1122 02:53:48.223081 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:48 crc kubenswrapper[4922]: I1122 02:53:48.223093 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:48Z","lastTransitionTime":"2025-11-22T02:53:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:48 crc kubenswrapper[4922]: I1122 02:53:48.300154 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:53:48 crc kubenswrapper[4922]: E1122 02:53:48.300383 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:53:48 crc kubenswrapper[4922]: I1122 02:53:48.326015 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:48 crc kubenswrapper[4922]: I1122 02:53:48.326086 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:48 crc kubenswrapper[4922]: I1122 02:53:48.326106 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:48 crc kubenswrapper[4922]: I1122 02:53:48.326134 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:48 crc kubenswrapper[4922]: I1122 02:53:48.326156 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:48Z","lastTransitionTime":"2025-11-22T02:53:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:48 crc kubenswrapper[4922]: I1122 02:53:48.428724 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:48 crc kubenswrapper[4922]: I1122 02:53:48.428864 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:48 crc kubenswrapper[4922]: I1122 02:53:48.428878 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:48 crc kubenswrapper[4922]: I1122 02:53:48.428894 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:48 crc kubenswrapper[4922]: I1122 02:53:48.428906 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:48Z","lastTransitionTime":"2025-11-22T02:53:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:48 crc kubenswrapper[4922]: I1122 02:53:48.532301 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:48 crc kubenswrapper[4922]: I1122 02:53:48.532353 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:48 crc kubenswrapper[4922]: I1122 02:53:48.532383 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:48 crc kubenswrapper[4922]: I1122 02:53:48.532403 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:48 crc kubenswrapper[4922]: I1122 02:53:48.532415 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:48Z","lastTransitionTime":"2025-11-22T02:53:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:48 crc kubenswrapper[4922]: I1122 02:53:48.635552 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:48 crc kubenswrapper[4922]: I1122 02:53:48.635618 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:48 crc kubenswrapper[4922]: I1122 02:53:48.635635 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:48 crc kubenswrapper[4922]: I1122 02:53:48.635910 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:48 crc kubenswrapper[4922]: I1122 02:53:48.635939 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:48Z","lastTransitionTime":"2025-11-22T02:53:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:48 crc kubenswrapper[4922]: I1122 02:53:48.738977 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:48 crc kubenswrapper[4922]: I1122 02:53:48.739055 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:48 crc kubenswrapper[4922]: I1122 02:53:48.739081 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:48 crc kubenswrapper[4922]: I1122 02:53:48.739116 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:48 crc kubenswrapper[4922]: I1122 02:53:48.739138 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:48Z","lastTransitionTime":"2025-11-22T02:53:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:48 crc kubenswrapper[4922]: I1122 02:53:48.842874 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:48 crc kubenswrapper[4922]: I1122 02:53:48.842934 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:48 crc kubenswrapper[4922]: I1122 02:53:48.842947 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:48 crc kubenswrapper[4922]: I1122 02:53:48.842974 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:48 crc kubenswrapper[4922]: I1122 02:53:48.842989 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:48Z","lastTransitionTime":"2025-11-22T02:53:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:48 crc kubenswrapper[4922]: I1122 02:53:48.946021 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:48 crc kubenswrapper[4922]: I1122 02:53:48.946105 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:48 crc kubenswrapper[4922]: I1122 02:53:48.946130 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:48 crc kubenswrapper[4922]: I1122 02:53:48.946177 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:48 crc kubenswrapper[4922]: I1122 02:53:48.946201 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:48Z","lastTransitionTime":"2025-11-22T02:53:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:49 crc kubenswrapper[4922]: I1122 02:53:49.048561 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:49 crc kubenswrapper[4922]: I1122 02:53:49.048611 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:49 crc kubenswrapper[4922]: I1122 02:53:49.048621 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:49 crc kubenswrapper[4922]: I1122 02:53:49.048637 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:49 crc kubenswrapper[4922]: I1122 02:53:49.048649 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:49Z","lastTransitionTime":"2025-11-22T02:53:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:49 crc kubenswrapper[4922]: I1122 02:53:49.151940 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:49 crc kubenswrapper[4922]: I1122 02:53:49.152011 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:49 crc kubenswrapper[4922]: I1122 02:53:49.152030 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:49 crc kubenswrapper[4922]: I1122 02:53:49.152056 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:49 crc kubenswrapper[4922]: I1122 02:53:49.152074 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:49Z","lastTransitionTime":"2025-11-22T02:53:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:49 crc kubenswrapper[4922]: I1122 02:53:49.255494 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:49 crc kubenswrapper[4922]: I1122 02:53:49.255583 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:49 crc kubenswrapper[4922]: I1122 02:53:49.255609 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:49 crc kubenswrapper[4922]: I1122 02:53:49.255644 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:49 crc kubenswrapper[4922]: I1122 02:53:49.255668 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:49Z","lastTransitionTime":"2025-11-22T02:53:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:49 crc kubenswrapper[4922]: I1122 02:53:49.300549 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:53:49 crc kubenswrapper[4922]: I1122 02:53:49.300634 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2gmkj" Nov 22 02:53:49 crc kubenswrapper[4922]: I1122 02:53:49.300549 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:53:49 crc kubenswrapper[4922]: E1122 02:53:49.300798 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:53:49 crc kubenswrapper[4922]: E1122 02:53:49.300939 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:53:49 crc kubenswrapper[4922]: E1122 02:53:49.301066 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2gmkj" podUID="d5c8000a-a783-474f-a73a-55814c257a02" Nov 22 02:53:49 crc kubenswrapper[4922]: I1122 02:53:49.358729 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:49 crc kubenswrapper[4922]: I1122 02:53:49.358817 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:49 crc kubenswrapper[4922]: I1122 02:53:49.358883 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:49 crc kubenswrapper[4922]: I1122 02:53:49.358980 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:49 crc kubenswrapper[4922]: I1122 02:53:49.359008 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:49Z","lastTransitionTime":"2025-11-22T02:53:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:49 crc kubenswrapper[4922]: I1122 02:53:49.462415 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:49 crc kubenswrapper[4922]: I1122 02:53:49.462923 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:49 crc kubenswrapper[4922]: I1122 02:53:49.462949 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:49 crc kubenswrapper[4922]: I1122 02:53:49.462991 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:49 crc kubenswrapper[4922]: I1122 02:53:49.463015 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:49Z","lastTransitionTime":"2025-11-22T02:53:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:49 crc kubenswrapper[4922]: I1122 02:53:49.565610 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:49 crc kubenswrapper[4922]: I1122 02:53:49.565650 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:49 crc kubenswrapper[4922]: I1122 02:53:49.565661 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:49 crc kubenswrapper[4922]: I1122 02:53:49.565682 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:49 crc kubenswrapper[4922]: I1122 02:53:49.565695 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:49Z","lastTransitionTime":"2025-11-22T02:53:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:49 crc kubenswrapper[4922]: I1122 02:53:49.668767 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:49 crc kubenswrapper[4922]: I1122 02:53:49.668824 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:49 crc kubenswrapper[4922]: I1122 02:53:49.668835 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:49 crc kubenswrapper[4922]: I1122 02:53:49.668878 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:49 crc kubenswrapper[4922]: I1122 02:53:49.668890 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:49Z","lastTransitionTime":"2025-11-22T02:53:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:49 crc kubenswrapper[4922]: I1122 02:53:49.771743 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:49 crc kubenswrapper[4922]: I1122 02:53:49.771786 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:49 crc kubenswrapper[4922]: I1122 02:53:49.771799 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:49 crc kubenswrapper[4922]: I1122 02:53:49.771819 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:49 crc kubenswrapper[4922]: I1122 02:53:49.771833 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:49Z","lastTransitionTime":"2025-11-22T02:53:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:49 crc kubenswrapper[4922]: I1122 02:53:49.874769 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:49 crc kubenswrapper[4922]: I1122 02:53:49.874870 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:49 crc kubenswrapper[4922]: I1122 02:53:49.874893 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:49 crc kubenswrapper[4922]: I1122 02:53:49.874913 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:49 crc kubenswrapper[4922]: I1122 02:53:49.874925 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:49Z","lastTransitionTime":"2025-11-22T02:53:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:49 crc kubenswrapper[4922]: I1122 02:53:49.978747 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:49 crc kubenswrapper[4922]: I1122 02:53:49.978839 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:49 crc kubenswrapper[4922]: I1122 02:53:49.978917 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:49 crc kubenswrapper[4922]: I1122 02:53:49.978953 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:49 crc kubenswrapper[4922]: I1122 02:53:49.978977 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:49Z","lastTransitionTime":"2025-11-22T02:53:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:50 crc kubenswrapper[4922]: I1122 02:53:50.082826 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:50 crc kubenswrapper[4922]: I1122 02:53:50.082908 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:50 crc kubenswrapper[4922]: I1122 02:53:50.082920 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:50 crc kubenswrapper[4922]: I1122 02:53:50.082945 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:50 crc kubenswrapper[4922]: I1122 02:53:50.082960 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:50Z","lastTransitionTime":"2025-11-22T02:53:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:50 crc kubenswrapper[4922]: I1122 02:53:50.186751 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:50 crc kubenswrapper[4922]: I1122 02:53:50.186820 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:50 crc kubenswrapper[4922]: I1122 02:53:50.186837 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:50 crc kubenswrapper[4922]: I1122 02:53:50.186878 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:50 crc kubenswrapper[4922]: I1122 02:53:50.186897 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:50Z","lastTransitionTime":"2025-11-22T02:53:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:50 crc kubenswrapper[4922]: I1122 02:53:50.290029 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:50 crc kubenswrapper[4922]: I1122 02:53:50.290097 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:50 crc kubenswrapper[4922]: I1122 02:53:50.290110 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:50 crc kubenswrapper[4922]: I1122 02:53:50.290131 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:50 crc kubenswrapper[4922]: I1122 02:53:50.290144 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:50Z","lastTransitionTime":"2025-11-22T02:53:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:50 crc kubenswrapper[4922]: I1122 02:53:50.300488 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:53:50 crc kubenswrapper[4922]: E1122 02:53:50.300740 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:53:50 crc kubenswrapper[4922]: I1122 02:53:50.394926 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:50 crc kubenswrapper[4922]: I1122 02:53:50.394989 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:50 crc kubenswrapper[4922]: I1122 02:53:50.395006 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:50 crc kubenswrapper[4922]: I1122 02:53:50.395038 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:50 crc kubenswrapper[4922]: I1122 02:53:50.395069 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:50Z","lastTransitionTime":"2025-11-22T02:53:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:50 crc kubenswrapper[4922]: I1122 02:53:50.498289 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:50 crc kubenswrapper[4922]: I1122 02:53:50.498382 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:50 crc kubenswrapper[4922]: I1122 02:53:50.498408 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:50 crc kubenswrapper[4922]: I1122 02:53:50.498446 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:50 crc kubenswrapper[4922]: I1122 02:53:50.498469 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:50Z","lastTransitionTime":"2025-11-22T02:53:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:50 crc kubenswrapper[4922]: I1122 02:53:50.601584 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:50 crc kubenswrapper[4922]: I1122 02:53:50.601625 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:50 crc kubenswrapper[4922]: I1122 02:53:50.601634 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:50 crc kubenswrapper[4922]: I1122 02:53:50.601651 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:50 crc kubenswrapper[4922]: I1122 02:53:50.601665 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:50Z","lastTransitionTime":"2025-11-22T02:53:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:50 crc kubenswrapper[4922]: I1122 02:53:50.704267 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:50 crc kubenswrapper[4922]: I1122 02:53:50.704330 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:50 crc kubenswrapper[4922]: I1122 02:53:50.704339 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:50 crc kubenswrapper[4922]: I1122 02:53:50.704355 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:50 crc kubenswrapper[4922]: I1122 02:53:50.704365 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:50Z","lastTransitionTime":"2025-11-22T02:53:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:50 crc kubenswrapper[4922]: I1122 02:53:50.807654 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:50 crc kubenswrapper[4922]: I1122 02:53:50.807723 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:50 crc kubenswrapper[4922]: I1122 02:53:50.807737 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:50 crc kubenswrapper[4922]: I1122 02:53:50.807759 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:50 crc kubenswrapper[4922]: I1122 02:53:50.807775 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:50Z","lastTransitionTime":"2025-11-22T02:53:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:50 crc kubenswrapper[4922]: I1122 02:53:50.910394 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:50 crc kubenswrapper[4922]: I1122 02:53:50.910459 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:50 crc kubenswrapper[4922]: I1122 02:53:50.910473 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:50 crc kubenswrapper[4922]: I1122 02:53:50.910497 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:50 crc kubenswrapper[4922]: I1122 02:53:50.910514 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:50Z","lastTransitionTime":"2025-11-22T02:53:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:51 crc kubenswrapper[4922]: I1122 02:53:51.013223 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:51 crc kubenswrapper[4922]: I1122 02:53:51.013269 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:51 crc kubenswrapper[4922]: I1122 02:53:51.013286 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:51 crc kubenswrapper[4922]: I1122 02:53:51.013311 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:51 crc kubenswrapper[4922]: I1122 02:53:51.013325 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:51Z","lastTransitionTime":"2025-11-22T02:53:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:51 crc kubenswrapper[4922]: I1122 02:53:51.116723 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:51 crc kubenswrapper[4922]: I1122 02:53:51.116778 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:51 crc kubenswrapper[4922]: I1122 02:53:51.116788 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:51 crc kubenswrapper[4922]: I1122 02:53:51.116806 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:51 crc kubenswrapper[4922]: I1122 02:53:51.116816 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:51Z","lastTransitionTime":"2025-11-22T02:53:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:51 crc kubenswrapper[4922]: I1122 02:53:51.220361 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:51 crc kubenswrapper[4922]: I1122 02:53:51.220682 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:51 crc kubenswrapper[4922]: I1122 02:53:51.220792 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:51 crc kubenswrapper[4922]: I1122 02:53:51.220938 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:51 crc kubenswrapper[4922]: I1122 02:53:51.221023 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:51Z","lastTransitionTime":"2025-11-22T02:53:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:51 crc kubenswrapper[4922]: I1122 02:53:51.300085 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:53:51 crc kubenswrapper[4922]: I1122 02:53:51.300118 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:53:51 crc kubenswrapper[4922]: E1122 02:53:51.300275 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:53:51 crc kubenswrapper[4922]: I1122 02:53:51.300324 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2gmkj" Nov 22 02:53:51 crc kubenswrapper[4922]: E1122 02:53:51.300533 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2gmkj" podUID="d5c8000a-a783-474f-a73a-55814c257a02" Nov 22 02:53:51 crc kubenswrapper[4922]: E1122 02:53:51.300701 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:53:51 crc kubenswrapper[4922]: I1122 02:53:51.324407 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:51 crc kubenswrapper[4922]: I1122 02:53:51.324450 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:51 crc kubenswrapper[4922]: I1122 02:53:51.324462 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:51 crc kubenswrapper[4922]: I1122 02:53:51.324485 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:51 crc kubenswrapper[4922]: I1122 02:53:51.324501 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:51Z","lastTransitionTime":"2025-11-22T02:53:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:51 crc kubenswrapper[4922]: I1122 02:53:51.427166 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:51 crc kubenswrapper[4922]: I1122 02:53:51.427215 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:51 crc kubenswrapper[4922]: I1122 02:53:51.427258 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:51 crc kubenswrapper[4922]: I1122 02:53:51.427279 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:51 crc kubenswrapper[4922]: I1122 02:53:51.427292 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:51Z","lastTransitionTime":"2025-11-22T02:53:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:51 crc kubenswrapper[4922]: I1122 02:53:51.531121 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:51 crc kubenswrapper[4922]: I1122 02:53:51.531178 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:51 crc kubenswrapper[4922]: I1122 02:53:51.531191 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:51 crc kubenswrapper[4922]: I1122 02:53:51.531216 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:51 crc kubenswrapper[4922]: I1122 02:53:51.531230 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:51Z","lastTransitionTime":"2025-11-22T02:53:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:51 crc kubenswrapper[4922]: I1122 02:53:51.634286 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:51 crc kubenswrapper[4922]: I1122 02:53:51.634358 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:51 crc kubenswrapper[4922]: I1122 02:53:51.634380 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:51 crc kubenswrapper[4922]: I1122 02:53:51.634408 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:51 crc kubenswrapper[4922]: I1122 02:53:51.634428 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:51Z","lastTransitionTime":"2025-11-22T02:53:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:51 crc kubenswrapper[4922]: I1122 02:53:51.738065 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:51 crc kubenswrapper[4922]: I1122 02:53:51.738129 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:51 crc kubenswrapper[4922]: I1122 02:53:51.738142 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:51 crc kubenswrapper[4922]: I1122 02:53:51.738164 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:51 crc kubenswrapper[4922]: I1122 02:53:51.738182 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:51Z","lastTransitionTime":"2025-11-22T02:53:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:51 crc kubenswrapper[4922]: I1122 02:53:51.841458 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:51 crc kubenswrapper[4922]: I1122 02:53:51.841532 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:51 crc kubenswrapper[4922]: I1122 02:53:51.841542 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:51 crc kubenswrapper[4922]: I1122 02:53:51.841559 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:51 crc kubenswrapper[4922]: I1122 02:53:51.841572 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:51Z","lastTransitionTime":"2025-11-22T02:53:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:51 crc kubenswrapper[4922]: I1122 02:53:51.945715 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:51 crc kubenswrapper[4922]: I1122 02:53:51.945769 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:51 crc kubenswrapper[4922]: I1122 02:53:51.945780 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:51 crc kubenswrapper[4922]: I1122 02:53:51.945797 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:51 crc kubenswrapper[4922]: I1122 02:53:51.945808 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:51Z","lastTransitionTime":"2025-11-22T02:53:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:52 crc kubenswrapper[4922]: I1122 02:53:52.050011 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:52 crc kubenswrapper[4922]: I1122 02:53:52.050085 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:52 crc kubenswrapper[4922]: I1122 02:53:52.050104 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:52 crc kubenswrapper[4922]: I1122 02:53:52.050144 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:52 crc kubenswrapper[4922]: I1122 02:53:52.050187 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:52Z","lastTransitionTime":"2025-11-22T02:53:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:52 crc kubenswrapper[4922]: I1122 02:53:52.152731 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:52 crc kubenswrapper[4922]: I1122 02:53:52.152800 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:52 crc kubenswrapper[4922]: I1122 02:53:52.152820 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:52 crc kubenswrapper[4922]: I1122 02:53:52.152874 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:52 crc kubenswrapper[4922]: I1122 02:53:52.152898 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:52Z","lastTransitionTime":"2025-11-22T02:53:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:52 crc kubenswrapper[4922]: I1122 02:53:52.256491 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:52 crc kubenswrapper[4922]: I1122 02:53:52.256553 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:52 crc kubenswrapper[4922]: I1122 02:53:52.256570 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:52 crc kubenswrapper[4922]: I1122 02:53:52.256597 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:52 crc kubenswrapper[4922]: I1122 02:53:52.256613 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:52Z","lastTransitionTime":"2025-11-22T02:53:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:52 crc kubenswrapper[4922]: I1122 02:53:52.300390 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:53:52 crc kubenswrapper[4922]: E1122 02:53:52.300667 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:53:52 crc kubenswrapper[4922]: I1122 02:53:52.301461 4922 scope.go:117] "RemoveContainer" containerID="34d1b1289ea6a27a227e433d072bfe6e350bbb997ddbc88ceb665eaa37794581" Nov 22 02:53:52 crc kubenswrapper[4922]: E1122 02:53:52.301660 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-c7wvg_openshift-ovn-kubernetes(c2a6bcd8-bb13-463b-b112-0df3cf90b5f7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" podUID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" Nov 22 02:53:52 crc kubenswrapper[4922]: I1122 02:53:52.359674 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:52 crc kubenswrapper[4922]: I1122 02:53:52.359751 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:52 crc kubenswrapper[4922]: I1122 02:53:52.359765 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:52 crc kubenswrapper[4922]: I1122 02:53:52.359784 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:52 crc kubenswrapper[4922]: I1122 02:53:52.359794 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:52Z","lastTransitionTime":"2025-11-22T02:53:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:52 crc kubenswrapper[4922]: I1122 02:53:52.462901 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:52 crc kubenswrapper[4922]: I1122 02:53:52.462949 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:52 crc kubenswrapper[4922]: I1122 02:53:52.462960 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:52 crc kubenswrapper[4922]: I1122 02:53:52.462978 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:52 crc kubenswrapper[4922]: I1122 02:53:52.462991 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:52Z","lastTransitionTime":"2025-11-22T02:53:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:52 crc kubenswrapper[4922]: I1122 02:53:52.565822 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:52 crc kubenswrapper[4922]: I1122 02:53:52.565889 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:52 crc kubenswrapper[4922]: I1122 02:53:52.565900 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:52 crc kubenswrapper[4922]: I1122 02:53:52.565921 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:52 crc kubenswrapper[4922]: I1122 02:53:52.565936 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:52Z","lastTransitionTime":"2025-11-22T02:53:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:52 crc kubenswrapper[4922]: I1122 02:53:52.669291 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:52 crc kubenswrapper[4922]: I1122 02:53:52.669347 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:52 crc kubenswrapper[4922]: I1122 02:53:52.669365 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:52 crc kubenswrapper[4922]: I1122 02:53:52.669391 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:52 crc kubenswrapper[4922]: I1122 02:53:52.669413 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:52Z","lastTransitionTime":"2025-11-22T02:53:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:52 crc kubenswrapper[4922]: I1122 02:53:52.772431 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:52 crc kubenswrapper[4922]: I1122 02:53:52.772484 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:52 crc kubenswrapper[4922]: I1122 02:53:52.772500 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:52 crc kubenswrapper[4922]: I1122 02:53:52.772519 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:52 crc kubenswrapper[4922]: I1122 02:53:52.772532 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:52Z","lastTransitionTime":"2025-11-22T02:53:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:52 crc kubenswrapper[4922]: I1122 02:53:52.874816 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:52 crc kubenswrapper[4922]: I1122 02:53:52.874915 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:52 crc kubenswrapper[4922]: I1122 02:53:52.874935 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:52 crc kubenswrapper[4922]: I1122 02:53:52.874963 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:52 crc kubenswrapper[4922]: I1122 02:53:52.874986 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:52Z","lastTransitionTime":"2025-11-22T02:53:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:52 crc kubenswrapper[4922]: I1122 02:53:52.978632 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:52 crc kubenswrapper[4922]: I1122 02:53:52.978702 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:52 crc kubenswrapper[4922]: I1122 02:53:52.978721 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:52 crc kubenswrapper[4922]: I1122 02:53:52.978751 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:52 crc kubenswrapper[4922]: I1122 02:53:52.978772 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:52Z","lastTransitionTime":"2025-11-22T02:53:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:53 crc kubenswrapper[4922]: I1122 02:53:53.081744 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:53 crc kubenswrapper[4922]: I1122 02:53:53.081797 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:53 crc kubenswrapper[4922]: I1122 02:53:53.081816 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:53 crc kubenswrapper[4922]: I1122 02:53:53.081868 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:53 crc kubenswrapper[4922]: I1122 02:53:53.081888 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:53Z","lastTransitionTime":"2025-11-22T02:53:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:53 crc kubenswrapper[4922]: I1122 02:53:53.184562 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:53 crc kubenswrapper[4922]: I1122 02:53:53.184647 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:53 crc kubenswrapper[4922]: I1122 02:53:53.184667 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:53 crc kubenswrapper[4922]: I1122 02:53:53.184694 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:53 crc kubenswrapper[4922]: I1122 02:53:53.184715 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:53Z","lastTransitionTime":"2025-11-22T02:53:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:53 crc kubenswrapper[4922]: I1122 02:53:53.287889 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:53 crc kubenswrapper[4922]: I1122 02:53:53.287953 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:53 crc kubenswrapper[4922]: I1122 02:53:53.287971 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:53 crc kubenswrapper[4922]: I1122 02:53:53.288001 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:53 crc kubenswrapper[4922]: I1122 02:53:53.288020 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:53Z","lastTransitionTime":"2025-11-22T02:53:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:53 crc kubenswrapper[4922]: I1122 02:53:53.300592 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2gmkj" Nov 22 02:53:53 crc kubenswrapper[4922]: I1122 02:53:53.300700 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:53:53 crc kubenswrapper[4922]: I1122 02:53:53.300596 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:53:53 crc kubenswrapper[4922]: E1122 02:53:53.300902 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2gmkj" podUID="d5c8000a-a783-474f-a73a-55814c257a02" Nov 22 02:53:53 crc kubenswrapper[4922]: E1122 02:53:53.301050 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:53:53 crc kubenswrapper[4922]: E1122 02:53:53.301251 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:53:53 crc kubenswrapper[4922]: I1122 02:53:53.390537 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:53 crc kubenswrapper[4922]: I1122 02:53:53.390595 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:53 crc kubenswrapper[4922]: I1122 02:53:53.390613 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:53 crc kubenswrapper[4922]: I1122 02:53:53.390640 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:53 crc kubenswrapper[4922]: I1122 02:53:53.390657 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:53Z","lastTransitionTime":"2025-11-22T02:53:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:53 crc kubenswrapper[4922]: I1122 02:53:53.493595 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:53 crc kubenswrapper[4922]: I1122 02:53:53.493640 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:53 crc kubenswrapper[4922]: I1122 02:53:53.493653 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:53 crc kubenswrapper[4922]: I1122 02:53:53.493674 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:53 crc kubenswrapper[4922]: I1122 02:53:53.493697 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:53Z","lastTransitionTime":"2025-11-22T02:53:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:53 crc kubenswrapper[4922]: I1122 02:53:53.596580 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:53 crc kubenswrapper[4922]: I1122 02:53:53.596639 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:53 crc kubenswrapper[4922]: I1122 02:53:53.596653 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:53 crc kubenswrapper[4922]: I1122 02:53:53.596678 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:53 crc kubenswrapper[4922]: I1122 02:53:53.596695 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:53Z","lastTransitionTime":"2025-11-22T02:53:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:53 crc kubenswrapper[4922]: I1122 02:53:53.699722 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:53 crc kubenswrapper[4922]: I1122 02:53:53.699783 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:53 crc kubenswrapper[4922]: I1122 02:53:53.699796 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:53 crc kubenswrapper[4922]: I1122 02:53:53.699817 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:53 crc kubenswrapper[4922]: I1122 02:53:53.699831 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:53Z","lastTransitionTime":"2025-11-22T02:53:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:53 crc kubenswrapper[4922]: I1122 02:53:53.803499 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:53 crc kubenswrapper[4922]: I1122 02:53:53.803559 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:53 crc kubenswrapper[4922]: I1122 02:53:53.803572 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:53 crc kubenswrapper[4922]: I1122 02:53:53.803600 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:53 crc kubenswrapper[4922]: I1122 02:53:53.803618 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:53Z","lastTransitionTime":"2025-11-22T02:53:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:53 crc kubenswrapper[4922]: I1122 02:53:53.906406 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:53 crc kubenswrapper[4922]: I1122 02:53:53.906459 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:53 crc kubenswrapper[4922]: I1122 02:53:53.906473 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:53 crc kubenswrapper[4922]: I1122 02:53:53.906493 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:53 crc kubenswrapper[4922]: I1122 02:53:53.906508 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:53Z","lastTransitionTime":"2025-11-22T02:53:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:54 crc kubenswrapper[4922]: I1122 02:53:54.009497 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:54 crc kubenswrapper[4922]: I1122 02:53:54.009540 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:54 crc kubenswrapper[4922]: I1122 02:53:54.009555 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:54 crc kubenswrapper[4922]: I1122 02:53:54.009573 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:54 crc kubenswrapper[4922]: I1122 02:53:54.009587 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:54Z","lastTransitionTime":"2025-11-22T02:53:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:54 crc kubenswrapper[4922]: I1122 02:53:54.112932 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:54 crc kubenswrapper[4922]: I1122 02:53:54.113053 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:54 crc kubenswrapper[4922]: I1122 02:53:54.113071 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:54 crc kubenswrapper[4922]: I1122 02:53:54.113094 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:54 crc kubenswrapper[4922]: I1122 02:53:54.113110 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:54Z","lastTransitionTime":"2025-11-22T02:53:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:54 crc kubenswrapper[4922]: I1122 02:53:54.215709 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:54 crc kubenswrapper[4922]: I1122 02:53:54.215755 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:54 crc kubenswrapper[4922]: I1122 02:53:54.215765 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:54 crc kubenswrapper[4922]: I1122 02:53:54.215783 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:54 crc kubenswrapper[4922]: I1122 02:53:54.215793 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:54Z","lastTransitionTime":"2025-11-22T02:53:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:54 crc kubenswrapper[4922]: I1122 02:53:54.300277 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:53:54 crc kubenswrapper[4922]: E1122 02:53:54.301049 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:53:54 crc kubenswrapper[4922]: I1122 02:53:54.318786 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:54 crc kubenswrapper[4922]: I1122 02:53:54.318925 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:54 crc kubenswrapper[4922]: I1122 02:53:54.318949 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:54 crc kubenswrapper[4922]: I1122 02:53:54.318981 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:54 crc kubenswrapper[4922]: I1122 02:53:54.319028 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:54Z","lastTransitionTime":"2025-11-22T02:53:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:54 crc kubenswrapper[4922]: I1122 02:53:54.429616 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:54 crc kubenswrapper[4922]: I1122 02:53:54.429678 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:54 crc kubenswrapper[4922]: I1122 02:53:54.429691 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:54 crc kubenswrapper[4922]: I1122 02:53:54.429714 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:54 crc kubenswrapper[4922]: I1122 02:53:54.429730 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:54Z","lastTransitionTime":"2025-11-22T02:53:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:54 crc kubenswrapper[4922]: I1122 02:53:54.534178 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:54 crc kubenswrapper[4922]: I1122 02:53:54.534418 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:54 crc kubenswrapper[4922]: I1122 02:53:54.534435 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:54 crc kubenswrapper[4922]: I1122 02:53:54.534460 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:54 crc kubenswrapper[4922]: I1122 02:53:54.534502 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:54Z","lastTransitionTime":"2025-11-22T02:53:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:54 crc kubenswrapper[4922]: I1122 02:53:54.637094 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:54 crc kubenswrapper[4922]: I1122 02:53:54.637172 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:54 crc kubenswrapper[4922]: I1122 02:53:54.637202 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:54 crc kubenswrapper[4922]: I1122 02:53:54.637223 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:54 crc kubenswrapper[4922]: I1122 02:53:54.637237 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:54Z","lastTransitionTime":"2025-11-22T02:53:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:54 crc kubenswrapper[4922]: I1122 02:53:54.740827 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:54 crc kubenswrapper[4922]: I1122 02:53:54.740935 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:54 crc kubenswrapper[4922]: I1122 02:53:54.740955 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:54 crc kubenswrapper[4922]: I1122 02:53:54.740982 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:54 crc kubenswrapper[4922]: I1122 02:53:54.741004 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:54Z","lastTransitionTime":"2025-11-22T02:53:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:54 crc kubenswrapper[4922]: I1122 02:53:54.844887 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:54 crc kubenswrapper[4922]: I1122 02:53:54.845004 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:54 crc kubenswrapper[4922]: I1122 02:53:54.845026 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:54 crc kubenswrapper[4922]: I1122 02:53:54.845053 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:54 crc kubenswrapper[4922]: I1122 02:53:54.845073 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:54Z","lastTransitionTime":"2025-11-22T02:53:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:54 crc kubenswrapper[4922]: I1122 02:53:54.891700 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d5c8000a-a783-474f-a73a-55814c257a02-metrics-certs\") pod \"network-metrics-daemon-2gmkj\" (UID: \"d5c8000a-a783-474f-a73a-55814c257a02\") " pod="openshift-multus/network-metrics-daemon-2gmkj" Nov 22 02:53:54 crc kubenswrapper[4922]: E1122 02:53:54.892049 4922 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 02:53:54 crc kubenswrapper[4922]: E1122 02:53:54.892361 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5c8000a-a783-474f-a73a-55814c257a02-metrics-certs podName:d5c8000a-a783-474f-a73a-55814c257a02 nodeName:}" failed. No retries permitted until 2025-11-22 02:54:26.892324644 +0000 UTC m=+102.930846566 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d5c8000a-a783-474f-a73a-55814c257a02-metrics-certs") pod "network-metrics-daemon-2gmkj" (UID: "d5c8000a-a783-474f-a73a-55814c257a02") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 02:53:54 crc kubenswrapper[4922]: I1122 02:53:54.949225 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:54 crc kubenswrapper[4922]: I1122 02:53:54.949280 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:54 crc kubenswrapper[4922]: I1122 02:53:54.949291 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:54 crc kubenswrapper[4922]: I1122 02:53:54.949312 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:54 crc kubenswrapper[4922]: I1122 02:53:54.949323 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:54Z","lastTransitionTime":"2025-11-22T02:53:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.052510 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.052590 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.052609 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.052642 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.052660 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:55Z","lastTransitionTime":"2025-11-22T02:53:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.155636 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.155685 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.155697 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.155717 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.155729 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:55Z","lastTransitionTime":"2025-11-22T02:53:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.259807 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.259924 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.259997 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.260038 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.260063 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:55Z","lastTransitionTime":"2025-11-22T02:53:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.299675 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2gmkj" Nov 22 02:53:55 crc kubenswrapper[4922]: E1122 02:53:55.299901 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2gmkj" podUID="d5c8000a-a783-474f-a73a-55814c257a02" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.299903 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.299937 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:53:55 crc kubenswrapper[4922]: E1122 02:53:55.300222 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:53:55 crc kubenswrapper[4922]: E1122 02:53:55.300401 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.314528 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8f0f91018d8ffcc35550e1e567a256d60f9cef7fb07a7c3c21d393e6ff5653f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:55Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.329818 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4gbc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"954bb7b8-d710-4e1a-973e-78c04e685f30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8705bc3652c9a3f838ffdc44bb26f87689d9eca09105427a3dc25b730967ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52z2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4gbc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:55Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.342866 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-df5ll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5672d56-8abd-4aa4-ac8d-1655896397f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a64d211a2e298509a2677a0d64bf8205f1a62e952ee786f34571d4c1e962b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klcgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-df5ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:55Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.358560 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2gmkj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c8000a-a783-474f-a73a-55814c257a02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pk97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pk97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2gmkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:55Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.362456 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.362508 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.362526 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.362549 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.362568 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:55Z","lastTransitionTime":"2025-11-22T02:53:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.371166 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0925b79-c7d7-4e10-a883-978c4c4f4aca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9273d5ec12281398617de471c700390678e37a0f25a8f419af589821bbbf82cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b7c6dc11edfeb5cd97d044eba3210a471efebefaf19e244347e856860544e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa47eebd3d997c52252492bb21d0357a5bcda89c3a47713b4d55c5a4e2117ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a43e447ea1ef6069640894c9f4a841797e8ea7a34d6df8d76f9958755e54d5b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a43e447ea1ef6069640894c9f4a841797e8ea7a34d6df8d76f9958755e54d5b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:55Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.383478 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.383568 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.383587 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.383611 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.383630 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:55Z","lastTransitionTime":"2025-11-22T02:53:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.386057 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:55Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:55 crc kubenswrapper[4922]: E1122 02:53:55.402585 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:55Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeByt
es\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e949e6da-04e3-4086-acd2-53671701d8c1\\\",\\\"systemUUID\\\":\\\"7
e87d562-50ca-4b7a-8b5f-35220f4abd2d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:55Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.404346 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:55Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.407463 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.407509 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.407521 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.407541 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.407551 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:55Z","lastTransitionTime":"2025-11-22T02:53:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.420699 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:55Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:55 crc kubenswrapper[4922]: E1122 02:53:55.423480 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e949e6da-04e3-4086-acd2-53671701d8c1\\\",\\\"systemUUID\\\":\\\"7e87d562-50ca-4b7a-8b5f-35220f4abd2d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:55Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.429193 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.429271 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.429309 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.429330 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.429344 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:55Z","lastTransitionTime":"2025-11-22T02:53:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.440766 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n664h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"826eb92e-c839-4fba-9737-3b52101fec88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccfd5290a47556c0ac2088ad8d34ab89870aec1d03311431a50dd857a43e5dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0125b85a532a9445c19a61f7bdd47ccd48ee48f067df5c041c6bf98daae9630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0125b85a532a9445c19a61f7bdd47ccd48ee48f067df5c041c6bf98daae9630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n664h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:55Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:55 crc kubenswrapper[4922]: E1122 02:53:55.444346 4922 kubelet_node_status.go:585] "Error updating node status, will retry" 
err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329b
a568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\
\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e949e6da-04e3-4086-acd2-53671701d8c1\\\",\\\"systemUUID\\\":\\\"7e87d562-50ca-4b7a-8b5f-35220f4abd2d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:55Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.450195 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.450240 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.450251 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.450272 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.450286 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:55Z","lastTransitionTime":"2025-11-22T02:53:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.456549 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v8z92" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da6031ac-2dd7-43c9-b9e2-4c6394e23c1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9c2863c3dce12e19530e9ec5d083788e8426a5f34a325c4a8b5dcbb29b454ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7cbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a6ba782e23e8c6a485ec4c29d9adefe33028022db04c1a66d20e49a73c40a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7cbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v8z92\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:55Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:55 crc kubenswrapper[4922]: E1122 02:53:55.465075 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e949e6da-04e3-4086-acd2-53671701d8c1\\\",\\\"systemUUID\\\":\\\"7e87d562-50ca-4b7a-8b5f-35220f4abd2d\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:55Z is after 
2025-08-24T17:21:41Z" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.476799 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.478483 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.478504 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.478528 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.478542 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:55Z","lastTransitionTime":"2025-11-22T02:53:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.484274 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8552c75b-ff86-4366-ba7d-31f454e4c4d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095da4ba1e99f9564ae3e7c95e0319b9b03a655b824f73a40f378292bd16a613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c36d97db0f554e895c9a66ab9d98a846dab9ea7fdc6fb1694a1e6ba6e8e03caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877
441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5806c02c9ab10a6ae71658c5c26a959a0c1af4dc116eebb10456e842e18bc265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e578ec32c48afdd4b1beffcba79284416d5d388027f807e195d552777eb3ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7d8908fd03ef5111feb365e6882679f026d46d0b404e585b0fb248b3ffccc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\
"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:55Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:55 crc kubenswrapper[4922]: E1122 02:53:55.494223 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:53:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae
669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e949e6da-04e3-4086-acd2-53671701d8c1\\\",\\\"systemUUID\\\":\\\"7e87d562-50ca-4b7a-8b5f-35220f4abd2d\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:55Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:55 crc kubenswrapper[4922]: E1122 02:53:55.494440 4922 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.496478 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.496513 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.496527 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.496547 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.496560 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:55Z","lastTransitionTime":"2025-11-22T02:53:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.501990 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"912136b8-9119-4585-8a8a-fffffe458b82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1364d5e8967ee57d3fe7c3ca85053dbeee65956324a6ff45f53339ee6a380e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1b62ea65c78bb4f19f5fdb2ebed6da4b5554981906058d4d33472e53eecebe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa36ff8bd81008d118d746d10130faac5df3ffdd243309f12b2444acf8e0ac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fe422173d9c9619e9ccbf14fa6b45cd384fb3b32973aa7c32b7597d23818ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1ba37f0d76415fadbf5fe5c3e7b0d0ac1e04923deb445e3ffebaa45f9a3284d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:52:51Z\\\",\\\"message\\\":\\\"W1122 02:52:50.548479 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 02:52:50.548907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763779970 cert, and key in /tmp/serving-cert-1581887030/serving-signer.crt, /tmp/serving-cert-1581887030/serving-signer.key\\\\nI1122 02:52:50.747646 1 observer_polling.go:159] Starting file observer\\\\nW1122 02:52:50.754705 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 02:52:50.754978 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:52:50.756448 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1581887030/tls.crt::/tmp/serving-cert-1581887030/tls.key\\\\\\\"\\\\nF1122 02:52:51.071644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://892025699a91526b7be50d611c0c47a78d080b10f5001683ba5b6614e1168759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:55Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.520605 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c46a871bcc893c29dd8f87750ec890ac870f3f59615c5951f473288a19b460c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:55Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.534247 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ntq2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"087141fd-fc9b-4685-bada-20260ee96369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb4eafebbd378af8579bc424a11b51eeda38398db994f81c408b9bead685713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4fck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ntq2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:55Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.551469 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a581af85a69caa65abe5f7766482dd0b71eaf201d43e354b8316ff616c865193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2d349f9466eb418a64dabae95c139bf23e943909a37b6b651c90e85ff55bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:55Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.567076 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e5537a6-a41f-446d-ab40-ec8c7a3edc90\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaf643c85775379271a26c718e36d87319342e6797522f91726ce4084925f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205de8f1aeca28c7877b83da70915cd216631f5d7c28b0c9f3bb748a8f98547e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7d163b34cfb34723c07cfa4592a134291185548d4294cb6fcd1d5d8ab63617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9608708be558bf83d1156f620534bd4c98b8f4fb7f96c972be8dc67524f4546c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:55Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.586754 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"402683b1-a29f-4a79-a36c-daf6e8068d0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cf0de2e7bcabf2c1beae2a37caaddb2de2365480082bd95a7843816ec8e3ceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d254b4b131f36e4aae
4019c29e8c53f3f8df991b85eb6f943fb6d2e9d79552f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9j6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:55Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.599910 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.600106 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.600212 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.600310 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.600404 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:55Z","lastTransitionTime":"2025-11-22T02:53:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.611542 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e0fc46c7e9bbed62a72d78126ea7ffcb4de50dc048c8c0fd90bd0b0d8c718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd701fe5abb8c240fb633df69fbda2e7fb1ac4c76195dae8dfd5b1db811c1d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://091e9164e566847ab0605abc3b8f32027bcccff041c48fbf251d28690dcc132d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5603a0776e71297b57ae6a438254d6d8abb317252617fbe3b28707e9f96424d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7854642894dce2837e59481c400522a24a2e66c3f9eb521a370e85b203288af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ceb46477edc2737cbdc850d6387ba175a07e85e69c988f91fc93def44fa1bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34d1b1289ea6a27a227e433d072bfe6e350bbb997ddbc88ceb665eaa37794581\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34d1b1289ea6a27a227e433d072bfe6e350bbb997ddbc88ceb665eaa37794581\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:53:36Z\\\",\\\"message\\\":\\\"lbConfig(nil)\\\\nI1122 02:53:36.370505 6577 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1122 02:53:36.370516 6577 services_controller.go:451] Built service openshift-ingress-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-ingress-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-ingress-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.244\\\\\\\", Port:9393, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1122 02:53:36.370543 6577 services_controller.go:452] Built service openshift-ingress-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1122 02:53:36.370578 6577 services_controller.go:453] Built service openshift-ingress-operator/metrics template LB for network=default: []services.LB{}\\\\nF1122 02:53:36.370585 6577 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-c7wvg_openshift-ovn-kubernetes(c2a6bcd8-bb13-463b-b112-0df3cf90b5f7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://183f71d77b7a1768eee27187f0c1e4a893f5661fa42cf671332e081118ae7dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recurs
iveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c7wvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:55Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.702938 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.703000 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.703019 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.703052 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.703073 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:55Z","lastTransitionTime":"2025-11-22T02:53:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.805389 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.805437 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.805448 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.805464 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.805475 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:55Z","lastTransitionTime":"2025-11-22T02:53:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.908235 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.908299 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.908313 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.908334 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:55 crc kubenswrapper[4922]: I1122 02:53:55.908348 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:55Z","lastTransitionTime":"2025-11-22T02:53:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.010940 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.010988 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.010999 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.011017 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.011028 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:56Z","lastTransitionTime":"2025-11-22T02:53:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.113955 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.113999 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.114009 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.114027 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.114041 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:56Z","lastTransitionTime":"2025-11-22T02:53:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.217148 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.217187 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.217199 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.217214 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.217227 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:56Z","lastTransitionTime":"2025-11-22T02:53:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.300456 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:53:56 crc kubenswrapper[4922]: E1122 02:53:56.300664 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.319775 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.319827 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.319864 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.319886 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.319899 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:56Z","lastTransitionTime":"2025-11-22T02:53:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.422312 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.422354 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.422363 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.422379 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.422389 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:56Z","lastTransitionTime":"2025-11-22T02:53:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.524985 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.525041 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.525050 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.525068 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.525079 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:56Z","lastTransitionTime":"2025-11-22T02:53:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.629137 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.629223 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.629245 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.629277 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.629301 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:56Z","lastTransitionTime":"2025-11-22T02:53:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.732819 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.733160 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.733180 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.733207 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.733228 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:56Z","lastTransitionTime":"2025-11-22T02:53:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.836815 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.836904 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.836924 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.836952 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.836975 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:56Z","lastTransitionTime":"2025-11-22T02:53:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.886645 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d4gbc_954bb7b8-d710-4e1a-973e-78c04e685f30/kube-multus/0.log" Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.886741 4922 generic.go:334] "Generic (PLEG): container finished" podID="954bb7b8-d710-4e1a-973e-78c04e685f30" containerID="f8705bc3652c9a3f838ffdc44bb26f87689d9eca09105427a3dc25b730967ccd" exitCode=1 Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.886873 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d4gbc" event={"ID":"954bb7b8-d710-4e1a-973e-78c04e685f30","Type":"ContainerDied","Data":"f8705bc3652c9a3f838ffdc44bb26f87689d9eca09105427a3dc25b730967ccd"} Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.887564 4922 scope.go:117] "RemoveContainer" containerID="f8705bc3652c9a3f838ffdc44bb26f87689d9eca09105427a3dc25b730967ccd" Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.909391 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2gmkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c8000a-a783-474f-a73a-55814c257a02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pk97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pk97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2gmkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:56Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.929756 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
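
Every status patch from here on fails the same way: the kubelet cannot call the pod.network-node-identity.openshift.io webhook at 127.0.0.1:9743 because that webhook's serving certificate expired on 2025-08-24, while the node's clock reads 2025-11-22. The x509 failure is ordinary validity-window checking. A minimal sketch that reproduces the same verdict from a PEM file (the path is illustrative; the real cert lives wherever network-node-identity mounts it):

package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"os"
	"time"
)

func main() {
	// Illustrative path, not the actual mount point.
	data, err := os.ReadFile("/tmp/webhook-serving.crt")
	if err != nil {
		fmt.Println("read:", err)
		return
	}
	block, _ := pem.Decode(data)
	if block == nil {
		fmt.Println("no PEM block found")
		return
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		fmt.Println("parse:", err)
		return
	}
	now := time.Now()
	switch {
	case now.After(cert.NotAfter):
		// Matches the log: "current time ... is after 2025-08-24T17:21:41Z".
		fmt.Printf("certificate has expired: current time %s is after %s\n",
			now.Format(time.RFC3339), cert.NotAfter.Format(time.RFC3339))
	case now.Before(cert.NotBefore):
		fmt.Printf("certificate is not yet valid: current time %s is before %s\n",
			now.Format(time.RFC3339), cert.NotBefore.Format(time.RFC3339))
	default:
		fmt.Println("certificate is within its validity window")
	}
}

Note that the pods themselves are fine in several of these entries (etcd, kube-scheduler, kube-apiserver all report ready:true); only the status patch is rejected, because the mutating-webhook call happens before the API server will accept the patch.
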
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0925b79-c7d7-4e10-a883-978c4c4f4aca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9273d5ec12281398617de471c700390678e37a0f25a8f419af589821bbbf82cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b7c6dc11edfeb5cd97d044eba3210a471efebefaf19e244347e856860544e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa47eebd3d997c52252492bb21d0357a5bcda89c3a47713b4d55c5a4e2117ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a43e447ea1ef6069640894c9f4a841797e8ea7a34d6df8d76f9958755e54d5b9\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a43e447ea1ef6069640894c9f4a841797e8ea7a34d6df8d76f9958755e54d5b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:56Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.941331 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.941379 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.941393 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.941411 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.941426 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:56Z","lastTransitionTime":"2025-11-22T02:53:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.948696 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:56Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.971766 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:56Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:56 crc kubenswrapper[4922]: I1122 02:53:56.991307 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:56Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.006317 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8f0f91018d8ffcc35550e1e567a256d60f9cef7fb07a7c3c21d393e6ff5653f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:57Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.026145 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4gbc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"954bb7b8-d710-4e1a-973e-78c04e685f30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8705bc3652c9a3f838ffdc44bb26f87689d9eca09105427a3dc25b730967ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8705bc3652c9a3f838ffdc44bb26f87689d9eca09105427a3dc25b730967ccd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"2025-11-22T02:53:11+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_13809bfd-4d69-4693-89f6-87ce51a7b8aa\\\\n2025-11-22T02:53:11+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_13809bfd-4d69-4693-89f6-87ce51a7b8aa to /host/opt/cni/bin/\\\\n2025-11-22T02:53:11Z [verbose] multus-daemon started\\\\n2025-11-22T02:53:11Z [verbose] Readiness Indicator file check\\\\n2025-11-22T02:53:56Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52z2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4gbc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:57Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.041781 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-df5ll" err="failed to patch status 
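
The kube-multus exit captured above (exit code 1, "ContainerDied") is the other end of the same chain: multus polls for a readiness-indicator file written by the default network, here OVN-Kubernetes' 10-ovn-kubernetes.conf under /host/run/multus/cni/net.d/, and gives up with "timed out waiting for the condition" when it never appears. A stdlib-only sketch of that poll loop; multus itself uses apimachinery's wait.PollImmediate (hence the lowercased "pollimmediate error" in the log), and the 1s interval and 45s timeout below are illustrative values, not multus' actual settings:

package main

import (
	"errors"
	"fmt"
	"os"
	"time"
)

// waitForFile checks immediately and then once per interval until the
// readiness-indicator file exists or the timeout elapses, mirroring
// the PollImmediate behavior behind the multus log message.
func waitForFile(path string, interval, timeout time.Duration) error {
	deadline := time.Now().Add(timeout)
	for {
		if _, err := os.Stat(path); err == nil {
			return nil // default network wrote its indicator file
		}
		if time.Now().After(deadline) {
			return errors.New("timed out waiting for the condition")
		}
		time.Sleep(interval)
	}
}

func main() {
	// Path taken from the log entry above.
	err := waitForFile("/host/run/multus/cni/net.d/10-ovn-kubernetes.conf",
		1*time.Second, 45*time.Second)
	if err != nil {
		fmt.Println("readiness indicator check failed:", err)
		os.Exit(1) // matches the observed exitCode 1
	}
	fmt.Println("default network is ready")
}

The timestamps in the terminated-state message bear this out: multus started at 02:53:09, copied its CNI binaries by 02:53:11, and exited at 02:53:56 after roughly 45 seconds of waiting, consistent with the expired webhook certificate preventing OVN-Kubernetes from ever coming up and writing its config.
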
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5672d56-8abd-4aa4-ac8d-1655896397f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a64d211a2e298509a2677a0d64bf8205f1a62e952ee786f34571d4c1e962b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klcgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-df5ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:57Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.044606 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.044713 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.044772 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.044948 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.045759 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:57Z","lastTransitionTime":"2025-11-22T02:53:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.075082 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8552c75b-ff86-4366-ba7d-31f454e4c4d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095da4ba1e99f9564ae3e7c95e0319b9b03a655b824f73a40f378292bd16a613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c36d97db0f554e895c9a66ab9d98a846dab9ea7fdc6fb1694a1e6ba6e8e03caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5806c02c9ab10a6ae71658c5c26a959a0c1af4dc116eebb10456e842e18bc265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e578ec32c48afdd4b1beffcba79284416d5d388027f807e195d552777eb3ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7d8908fd03ef5111feb365e6882679f026d46d0b404e585b0fb248b3ffccc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:57Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.093458 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"912136b8-9119-4585-8a8a-fffffe458b82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1364d5e8967ee57d3fe7c3ca85053dbeee65956324a6ff45f53339ee6a380e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1b62ea65c78bb4f19f5fdb2ebed6da4b5554981906058d4d33472e53eecebe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa36ff8bd81008d118d746d10130faac5df3ffdd243309f12b2444acf8e0ac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fe422173d9c9619e9ccbf14fa6b45cd384fb3b32973aa7c32b7597d23818ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1ba37f0d76415fadbf5fe5c3e7b0d0ac1e04923deb445e3ffebaa45f9a3284d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:52:51Z\\\",\\\"message\\\":\\\"W1122 02:52:50.548479 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 02:52:50.548907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763779970 cert, and key in /tmp/serving-cert-1581887030/serving-signer.crt, /tmp/serving-cert-1581887030/serving-signer.key\\\\nI1122 02:52:50.747646 1 observer_polling.go:159] Starting file observer\\\\nW1122 02:52:50.754705 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 02:52:50.754978 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:52:50.756448 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1581887030/tls.crt::/tmp/serving-cert-1581887030/tls.key\\\\\\\"\\\\nF1122 02:52:51.071644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://892025699a91526b7be50d611c0c47a78d080b10f5001683ba5b6614e1168759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:57Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.113058 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c46a871bcc893c29dd8f87750ec890ac870f3f59615c5951f473288a19b460c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:57Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.129525 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ntq2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"087141fd-fc9b-4685-bada-20260ee96369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb4eafebbd378af8579bc424a11b51eeda38398db994f81c408b9bead685713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4fck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ntq2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:57Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.148966 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.149020 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.149037 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.149059 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.149076 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:57Z","lastTransitionTime":"2025-11-22T02:53:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.151046 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n664h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"826eb92e-c839-4fba-9737-3b52101fec88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccfd5290a47556c0ac2088ad8d34ab89870aec1d03311431a50dd857a43e5dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b083638
0bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
ntrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0125b85a532a9445c19a61f7bdd47ccd48ee48f067df5c041c6bf98daae9630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0125b85a532a9445c19a61f7bdd47ccd48ee48f067df5c041c6bf98daae9630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n664h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:57Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.166268 4922 
status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v8z92" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da6031ac-2dd7-43c9-b9e2-4c6394e23c1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9c2863c3dce12e19530e9ec5d083788e8426a5f34a325c4a8b5dcbb29b454ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7cbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a6ba782e23e8c6a485ec4c29d9adefe33028022db04c1a66d20e49a73c40a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7cbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v8z92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:57Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.181503 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a581af85a69caa65abe5f7766482dd0b71eaf201d43e354b8316ff616c865193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2d349f9466eb418a64dabae95c139bf23e943909a37b6b651c90e85ff55bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:57Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.198738 4922 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e5537a6-a41f-446d-ab40-ec8c7a3edc90\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaf643c85775379271a26c718e36d87319342e6797522f91726ce4084925f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205de8f1aeca28c7877b83da70915cd216631f5d7c28b0c9f3bb748a8f98547e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7d163b34cfb34723c07cfa4592a134291185548d4294cb6fcd1d5d8ab63617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9608708be558bf83d1156f620534bd4c98b8f4fb7
f96c972be8dc67524f4546c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:57Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.213862 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"402683b1-a29f-4a79-a36c-daf6e8068d0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cf0de2e7bcabf2c1beae2a37caaddb2de2365480082bd95a7843816ec8e3ceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d254b4b131f36e4aae4019c29e8c53f3f8df991b85eb6f943fb6d2e9d79552f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9j6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:57Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.234390 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e0fc46c7e9bbed62a72d78126ea7ffcb4de50dc048c8c0fd90bd0b0d8c718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd701fe5abb8c240fb633df69fbda2e7fb1ac4c76195dae8dfd5b1db811c1d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://091e9164e566847ab0605abc3b8f32027bcccff041c48fbf251d28690dcc132d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5603a0776e71297b57ae6a438254d6d8abb317252617fbe3b28707e9f96424d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7854642894dce2837e59481c400522a24a2e66c3f9eb521a370e85b203288af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ceb46477edc2737cbdc850d6387ba175a07e85e69c988f91fc93def44fa1bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34d1b1289ea6a27a227e433d072bfe6e350bbb99
7ddbc88ceb665eaa37794581\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34d1b1289ea6a27a227e433d072bfe6e350bbb997ddbc88ceb665eaa37794581\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:53:36Z\\\",\\\"message\\\":\\\"lbConfig(nil)\\\\nI1122 02:53:36.370505 6577 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1122 02:53:36.370516 6577 services_controller.go:451] Built service openshift-ingress-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-ingress-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-ingress-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.244\\\\\\\", Port:9393, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1122 02:53:36.370543 6577 services_controller.go:452] Built service openshift-ingress-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1122 02:53:36.370578 6577 services_controller.go:453] Built service openshift-ingress-operator/metrics template LB for network=default: []services.LB{}\\\\nF1122 02:53:36.370585 6577 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-c7wvg_openshift-ovn-kubernetes(c2a6bcd8-bb13-463b-b112-0df3cf90b5f7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://183f71d77b7a1768eee27187f0c1e4a893f5661fa42cf671332e081118ae7dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c7wvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:57Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.251796 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.251831 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.251857 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.251874 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.251885 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:57Z","lastTransitionTime":"2025-11-22T02:53:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.300511 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2gmkj" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.300642 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.300940 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:53:57 crc kubenswrapper[4922]: E1122 02:53:57.301149 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2gmkj" podUID="d5c8000a-a783-474f-a73a-55814c257a02" Nov 22 02:53:57 crc kubenswrapper[4922]: E1122 02:53:57.301360 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:53:57 crc kubenswrapper[4922]: E1122 02:53:57.301579 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.355217 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.355321 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.355350 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.355391 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.355419 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:57Z","lastTransitionTime":"2025-11-22T02:53:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.458510 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.458914 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.458985 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.459089 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.459161 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:57Z","lastTransitionTime":"2025-11-22T02:53:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.562829 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.562955 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.562983 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.563021 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.563048 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:57Z","lastTransitionTime":"2025-11-22T02:53:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.667596 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.667654 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.667677 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.667705 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.667728 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:57Z","lastTransitionTime":"2025-11-22T02:53:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.772285 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.772367 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.772392 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.772427 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.772457 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:57Z","lastTransitionTime":"2025-11-22T02:53:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.874734 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.874800 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.874817 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.874881 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.874902 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:57Z","lastTransitionTime":"2025-11-22T02:53:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.893655 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d4gbc_954bb7b8-d710-4e1a-973e-78c04e685f30/kube-multus/0.log" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.893734 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d4gbc" event={"ID":"954bb7b8-d710-4e1a-973e-78c04e685f30","Type":"ContainerStarted","Data":"00bf6c08a8a7b113c82bd6c0f6de3dde945667c2f025d7dc3a84aaff43b989d5"} Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.918528 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a581af85a69caa65abe5f7766482dd0b71eaf201d43e354b8316ff616c865193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2d349f9466eb418a64dabae95c139bf23e943909a37b6b651c90e85ff55bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:57Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.935684 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e5537a6-a41f-446d-ab40-ec8c7a3edc90\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaf643c85775379271a26c718e36d87319342e6797522f91726ce4084925f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205de8f1aeca28c7877b83da70915cd216631f5d7c28b0c9f3bb748a8f98547e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7d163b34cfb34723c07cfa4592a134291185548d4294cb6fcd1d5d8ab63617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9608708be558bf83d1156f620534bd4c98b8f4fb7f96c972be8dc67524f4546c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:57Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.953118 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"402683b1-a29f-4a79-a36c-daf6e8068d0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cf0de2e7bcabf2c1beae2a37caaddb2de2365480082bd95a7843816ec8e3ceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d254b4b131f36e4aae4019c29e8c53f3f8df991b85eb6f943fb6d2e9d79552f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9j6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:57Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.979554 4922 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.980024 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.980222 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.980373 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.980518 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:57Z","lastTransitionTime":"2025-11-22T02:53:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.982885 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e0fc46c7e9bbed62a72d78126ea7ffcb4de50dc048c8c0fd90bd0b0d8c718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd701fe5abb8c240fb633df69fbda2e7fb1ac4c76195dae8dfd5b1db811c1d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://091e9164e566847ab0605abc3b8f32027bcccff041c48fbf251d28690dcc132d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5603a0776e71297b57ae6a438254d6d8abb317252617fbe3b28707e9f96424d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7854642894dce2837e59481c400522a24a2e66c3f9eb521a370e85b203288af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ceb46477edc2737cbdc850d6387ba175a07e85e69c988f91fc93def44fa1bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34d1b1289ea6a27a227e433d072bfe6e350bbb99
7ddbc88ceb665eaa37794581\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34d1b1289ea6a27a227e433d072bfe6e350bbb997ddbc88ceb665eaa37794581\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:53:36Z\\\",\\\"message\\\":\\\"lbConfig(nil)\\\\nI1122 02:53:36.370505 6577 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1122 02:53:36.370516 6577 services_controller.go:451] Built service openshift-ingress-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-ingress-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-ingress-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.244\\\\\\\", Port:9393, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1122 02:53:36.370543 6577 services_controller.go:452] Built service openshift-ingress-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1122 02:53:36.370578 6577 services_controller.go:453] Built service openshift-ingress-operator/metrics template LB for network=default: []services.LB{}\\\\nF1122 02:53:36.370585 6577 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-c7wvg_openshift-ovn-kubernetes(c2a6bcd8-bb13-463b-b112-0df3cf90b5f7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://183f71d77b7a1768eee27187f0c1e4a893f5661fa42cf671332e081118ae7dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c7wvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:57Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:57 crc kubenswrapper[4922]: I1122 02:53:57.997515 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0925b79-c7d7-4e10-a883-978c4c4f4aca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9273d5ec12281398617de471c700390678e37a0f25a8f419af589821bbbf82cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b7c6dc11edfeb5cd97d044eba3210a471efebefaf19e244347e856860544e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c
97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa47eebd3d997c52252492bb21d0357a5bcda89c3a47713b4d55c5a4e2117ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a43e447ea1ef6069640894c9f4a841797e8ea7a34d6df8d76f9958755e54d5b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a43e447ea1ef6069640894c9f4a841797e8ea7a34d6df8d76f9958755e54d5b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:57Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.010919 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:58Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.022530 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:58Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.035252 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:58Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.049228 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8f0f91018d8ffcc35550e1e567a256d60f9cef7fb07a7c3c21d393e6ff5653f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:58Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.066080 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4gbc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"954bb7b8-d710-4e1a-973e-78c04e685f30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00bf6c08a8a7b113c82bd6c0f6de3dde945667c2f025d7dc3a84aaff43b989d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8705bc3652c9a3f838ffdc44bb26f87689d9eca09105427a3dc25b730967ccd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"2025-11-22T02:53:11+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_13809bfd-4d69-4693-89f6-87ce51a7b8aa\\\\n2025-11-22T02:53:11+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_13809bfd-4d69-4693-89f6-87ce51a7b8aa to /host/opt/cni/bin/\\\\n2025-11-22T02:53:11Z [verbose] multus-daemon started\\\\n2025-11-22T02:53:11Z [verbose] Readiness Indicator file check\\\\n2025-11-22T02:53:56Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52z2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4gbc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:58Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.081064 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-df5ll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5672d56-8abd-4aa4-ac8d-1655896397f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a64d211a2e298509a2677a0d64bf8205f1a62e952ee786f34571d4c1e962b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klcgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-df5ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:58Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.083256 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.083365 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.083483 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.083571 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.083644 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:58Z","lastTransitionTime":"2025-11-22T02:53:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.093786 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2gmkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c8000a-a783-474f-a73a-55814c257a02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pk97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pk97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2gmkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:58Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.123177 4922 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8552c75b-ff86-4366-ba7d-31f454e4c4d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095da4ba1e99f9564ae3e7c95e0319b9b03a655b824f73a40f378292bd16a613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c36d97db0f554e895c9a66ab9d98a846dab9ea7fdc6fb1694a1e6ba6e8e03caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5806c02c9ab10a6ae71658c5c26a959a0c1af4dc116eebb10456e842e18bc265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]
},{\\\"containerID\\\":\\\"cri-o://7e578ec32c48afdd4b1beffcba79284416d5d388027f807e195d552777eb3ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7d8908fd03ef5111feb365e6882679f026d46d0b404e585b0fb248b3ffccc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}}},{\\\"containerID\\\":\\\"cr
i-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:58Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.150333 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"912136b8-9119-4585-8a8a-fffffe458b82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1364d5e8967ee57d3fe7c3ca85053dbeee65956324a6ff45f53339ee6a380e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1b62ea65c78bb4f19f5fdb2ebed6da4b5554981906058d4d33472e53eecebe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa36ff8bd81008d118d746d10130faac5df3ffdd243309f12b2444acf8e0ac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fe422173d9c9619e9ccbf14fa6b45cd384fb3b32973aa7c32b7597d23818ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1ba37f0d76415fadbf5fe5c3e7b0d0ac1e04923deb445e3ffebaa45f9a3284d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:52:51Z\\\",\\\"message\\\":\\\"W1122 02:52:50.548479 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 02:52:50.548907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763779970 cert, and key in /tmp/serving-cert-1581887030/serving-signer.crt, /tmp/serving-cert-1581887030/serving-signer.key\\\\nI1122 02:52:50.747646 1 observer_polling.go:159] Starting file observer\\\\nW1122 02:52:50.754705 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 02:52:50.754978 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:52:50.756448 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1581887030/tls.crt::/tmp/serving-cert-1581887030/tls.key\\\\\\\"\\\\nF1122 02:52:51.071644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://892025699a91526b7be50d611c0c47a78d080b10f5001683ba5b6614e1168759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:58Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.165432 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c46a871bcc893c29dd8f87750ec890ac870f3f59615c5951f473288a19b460c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:58Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.179463 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ntq2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"087141fd-fc9b-4685-bada-20260ee96369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb4eafebbd378af8579bc424a11b51eeda38398db994f81c408b9bead685713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4fck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ntq2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:58Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.190531 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.190792 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.190914 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.191024 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.191133 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:58Z","lastTransitionTime":"2025-11-22T02:53:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.196825 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n664h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"826eb92e-c839-4fba-9737-3b52101fec88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccfd5290a47556c0ac2088ad8d34ab89870aec1d03311431a50dd857a43e5dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b083638
0bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
ntrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0125b85a532a9445c19a61f7bdd47ccd48ee48f067df5c041c6bf98daae9630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0125b85a532a9445c19a61f7bdd47ccd48ee48f067df5c041c6bf98daae9630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n664h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:58Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.209422 4922 
status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v8z92" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da6031ac-2dd7-43c9-b9e2-4c6394e23c1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9c2863c3dce12e19530e9ec5d083788e8426a5f34a325c4a8b5dcbb29b454ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7cbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a6ba782e23e8c6a485ec4c29d9adefe33028022db04c1a66d20e49a73c40a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7cbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v8z92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:53:58Z is after 2025-08-24T17:21:41Z" Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.294355 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.294436 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.294457 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.294488 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.294507 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:58Z","lastTransitionTime":"2025-11-22T02:53:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.300482 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:53:58 crc kubenswrapper[4922]: E1122 02:53:58.300623 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.398548 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.398609 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.398622 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.398641 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.398657 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:58Z","lastTransitionTime":"2025-11-22T02:53:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.502229 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.502316 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.502337 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.502362 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.502381 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:58Z","lastTransitionTime":"2025-11-22T02:53:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.605940 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.606000 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.606010 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.606033 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.606047 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:58Z","lastTransitionTime":"2025-11-22T02:53:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.708803 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.708906 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.708926 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.708952 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.708974 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:58Z","lastTransitionTime":"2025-11-22T02:53:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.812344 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.812416 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.812435 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.812461 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.812480 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:58Z","lastTransitionTime":"2025-11-22T02:53:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.915654 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.915727 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.915752 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.915789 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:58 crc kubenswrapper[4922]: I1122 02:53:58.915886 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:58Z","lastTransitionTime":"2025-11-22T02:53:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:59 crc kubenswrapper[4922]: I1122 02:53:59.019684 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:59 crc kubenswrapper[4922]: I1122 02:53:59.019738 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:59 crc kubenswrapper[4922]: I1122 02:53:59.019749 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:59 crc kubenswrapper[4922]: I1122 02:53:59.019770 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:59 crc kubenswrapper[4922]: I1122 02:53:59.019782 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:59Z","lastTransitionTime":"2025-11-22T02:53:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:59 crc kubenswrapper[4922]: I1122 02:53:59.123786 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:59 crc kubenswrapper[4922]: I1122 02:53:59.123892 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:59 crc kubenswrapper[4922]: I1122 02:53:59.123914 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:59 crc kubenswrapper[4922]: I1122 02:53:59.123977 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:59 crc kubenswrapper[4922]: I1122 02:53:59.123999 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:59Z","lastTransitionTime":"2025-11-22T02:53:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:59 crc kubenswrapper[4922]: I1122 02:53:59.227487 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:59 crc kubenswrapper[4922]: I1122 02:53:59.227565 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:59 crc kubenswrapper[4922]: I1122 02:53:59.227586 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:59 crc kubenswrapper[4922]: I1122 02:53:59.227618 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:59 crc kubenswrapper[4922]: I1122 02:53:59.227639 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:59Z","lastTransitionTime":"2025-11-22T02:53:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:59 crc kubenswrapper[4922]: I1122 02:53:59.300279 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:53:59 crc kubenswrapper[4922]: I1122 02:53:59.300406 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2gmkj" Nov 22 02:53:59 crc kubenswrapper[4922]: I1122 02:53:59.300310 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:53:59 crc kubenswrapper[4922]: E1122 02:53:59.300579 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:53:59 crc kubenswrapper[4922]: E1122 02:53:59.300821 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:53:59 crc kubenswrapper[4922]: E1122 02:53:59.301079 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2gmkj" podUID="d5c8000a-a783-474f-a73a-55814c257a02" Nov 22 02:53:59 crc kubenswrapper[4922]: I1122 02:53:59.331389 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:59 crc kubenswrapper[4922]: I1122 02:53:59.331473 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:59 crc kubenswrapper[4922]: I1122 02:53:59.331497 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:59 crc kubenswrapper[4922]: I1122 02:53:59.331531 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:59 crc kubenswrapper[4922]: I1122 02:53:59.331556 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:59Z","lastTransitionTime":"2025-11-22T02:53:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:59 crc kubenswrapper[4922]: I1122 02:53:59.435636 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:59 crc kubenswrapper[4922]: I1122 02:53:59.435704 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:59 crc kubenswrapper[4922]: I1122 02:53:59.435722 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:59 crc kubenswrapper[4922]: I1122 02:53:59.435750 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:59 crc kubenswrapper[4922]: I1122 02:53:59.435771 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:59Z","lastTransitionTime":"2025-11-22T02:53:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:59 crc kubenswrapper[4922]: I1122 02:53:59.540019 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:59 crc kubenswrapper[4922]: I1122 02:53:59.540082 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:59 crc kubenswrapper[4922]: I1122 02:53:59.540100 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:59 crc kubenswrapper[4922]: I1122 02:53:59.540124 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:59 crc kubenswrapper[4922]: I1122 02:53:59.540186 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:59Z","lastTransitionTime":"2025-11-22T02:53:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:59 crc kubenswrapper[4922]: I1122 02:53:59.643768 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:59 crc kubenswrapper[4922]: I1122 02:53:59.643830 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:59 crc kubenswrapper[4922]: I1122 02:53:59.643874 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:59 crc kubenswrapper[4922]: I1122 02:53:59.643902 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:59 crc kubenswrapper[4922]: I1122 02:53:59.643920 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:59Z","lastTransitionTime":"2025-11-22T02:53:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:59 crc kubenswrapper[4922]: I1122 02:53:59.746597 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:59 crc kubenswrapper[4922]: I1122 02:53:59.746656 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:59 crc kubenswrapper[4922]: I1122 02:53:59.746676 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:59 crc kubenswrapper[4922]: I1122 02:53:59.746701 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:59 crc kubenswrapper[4922]: I1122 02:53:59.746722 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:59Z","lastTransitionTime":"2025-11-22T02:53:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:53:59 crc kubenswrapper[4922]: I1122 02:53:59.851187 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:59 crc kubenswrapper[4922]: I1122 02:53:59.851288 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:59 crc kubenswrapper[4922]: I1122 02:53:59.851319 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:59 crc kubenswrapper[4922]: I1122 02:53:59.851363 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:59 crc kubenswrapper[4922]: I1122 02:53:59.851405 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:59Z","lastTransitionTime":"2025-11-22T02:53:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:53:59 crc kubenswrapper[4922]: I1122 02:53:59.954783 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:53:59 crc kubenswrapper[4922]: I1122 02:53:59.954827 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:53:59 crc kubenswrapper[4922]: I1122 02:53:59.954837 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:53:59 crc kubenswrapper[4922]: I1122 02:53:59.954872 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:53:59 crc kubenswrapper[4922]: I1122 02:53:59.954886 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:53:59Z","lastTransitionTime":"2025-11-22T02:53:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:00 crc kubenswrapper[4922]: I1122 02:54:00.058244 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:00 crc kubenswrapper[4922]: I1122 02:54:00.058300 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:00 crc kubenswrapper[4922]: I1122 02:54:00.058312 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:00 crc kubenswrapper[4922]: I1122 02:54:00.058331 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:00 crc kubenswrapper[4922]: I1122 02:54:00.058343 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:00Z","lastTransitionTime":"2025-11-22T02:54:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:00 crc kubenswrapper[4922]: I1122 02:54:00.161503 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:00 crc kubenswrapper[4922]: I1122 02:54:00.161559 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:00 crc kubenswrapper[4922]: I1122 02:54:00.161572 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:00 crc kubenswrapper[4922]: I1122 02:54:00.161591 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:00 crc kubenswrapper[4922]: I1122 02:54:00.161603 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:00Z","lastTransitionTime":"2025-11-22T02:54:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:00 crc kubenswrapper[4922]: I1122 02:54:00.264667 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:00 crc kubenswrapper[4922]: I1122 02:54:00.264739 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:00 crc kubenswrapper[4922]: I1122 02:54:00.264758 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:00 crc kubenswrapper[4922]: I1122 02:54:00.264786 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:00 crc kubenswrapper[4922]: I1122 02:54:00.264803 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:00Z","lastTransitionTime":"2025-11-22T02:54:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:00 crc kubenswrapper[4922]: I1122 02:54:00.299721 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:54:00 crc kubenswrapper[4922]: E1122 02:54:00.299961 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:54:00 crc kubenswrapper[4922]: I1122 02:54:00.368640 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:00 crc kubenswrapper[4922]: I1122 02:54:00.368729 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:00 crc kubenswrapper[4922]: I1122 02:54:00.368747 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:00 crc kubenswrapper[4922]: I1122 02:54:00.368777 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:00 crc kubenswrapper[4922]: I1122 02:54:00.368801 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:00Z","lastTransitionTime":"2025-11-22T02:54:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:00 crc kubenswrapper[4922]: I1122 02:54:00.471991 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:00 crc kubenswrapper[4922]: I1122 02:54:00.472078 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:00 crc kubenswrapper[4922]: I1122 02:54:00.472097 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:00 crc kubenswrapper[4922]: I1122 02:54:00.472127 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:00 crc kubenswrapper[4922]: I1122 02:54:00.472148 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:00Z","lastTransitionTime":"2025-11-22T02:54:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:00 crc kubenswrapper[4922]: I1122 02:54:00.576044 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:00 crc kubenswrapper[4922]: I1122 02:54:00.576132 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:00 crc kubenswrapper[4922]: I1122 02:54:00.576151 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:00 crc kubenswrapper[4922]: I1122 02:54:00.576182 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:00 crc kubenswrapper[4922]: I1122 02:54:00.576203 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:00Z","lastTransitionTime":"2025-11-22T02:54:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:00 crc kubenswrapper[4922]: I1122 02:54:00.680886 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:00 crc kubenswrapper[4922]: I1122 02:54:00.680981 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:00 crc kubenswrapper[4922]: I1122 02:54:00.680997 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:00 crc kubenswrapper[4922]: I1122 02:54:00.681024 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:00 crc kubenswrapper[4922]: I1122 02:54:00.681692 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:00Z","lastTransitionTime":"2025-11-22T02:54:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:00 crc kubenswrapper[4922]: I1122 02:54:00.786414 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:00 crc kubenswrapper[4922]: I1122 02:54:00.786470 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:00 crc kubenswrapper[4922]: I1122 02:54:00.786484 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:00 crc kubenswrapper[4922]: I1122 02:54:00.786508 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:00 crc kubenswrapper[4922]: I1122 02:54:00.786522 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:00Z","lastTransitionTime":"2025-11-22T02:54:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:00 crc kubenswrapper[4922]: I1122 02:54:00.890704 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:00 crc kubenswrapper[4922]: I1122 02:54:00.890768 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:00 crc kubenswrapper[4922]: I1122 02:54:00.890782 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:00 crc kubenswrapper[4922]: I1122 02:54:00.890803 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:00 crc kubenswrapper[4922]: I1122 02:54:00.890816 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:00Z","lastTransitionTime":"2025-11-22T02:54:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:00 crc kubenswrapper[4922]: I1122 02:54:00.994705 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:00 crc kubenswrapper[4922]: I1122 02:54:00.994778 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:00 crc kubenswrapper[4922]: I1122 02:54:00.994797 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:00 crc kubenswrapper[4922]: I1122 02:54:00.994831 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:00 crc kubenswrapper[4922]: I1122 02:54:00.994882 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:00Z","lastTransitionTime":"2025-11-22T02:54:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:01 crc kubenswrapper[4922]: I1122 02:54:01.098887 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:01 crc kubenswrapper[4922]: I1122 02:54:01.098967 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:01 crc kubenswrapper[4922]: I1122 02:54:01.098992 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:01 crc kubenswrapper[4922]: I1122 02:54:01.099022 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:01 crc kubenswrapper[4922]: I1122 02:54:01.099042 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:01Z","lastTransitionTime":"2025-11-22T02:54:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:01 crc kubenswrapper[4922]: I1122 02:54:01.202687 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:01 crc kubenswrapper[4922]: I1122 02:54:01.202760 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:01 crc kubenswrapper[4922]: I1122 02:54:01.202777 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:01 crc kubenswrapper[4922]: I1122 02:54:01.202811 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:01 crc kubenswrapper[4922]: I1122 02:54:01.202832 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:01Z","lastTransitionTime":"2025-11-22T02:54:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:01 crc kubenswrapper[4922]: I1122 02:54:01.300590 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:54:01 crc kubenswrapper[4922]: I1122 02:54:01.300719 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:01 crc kubenswrapper[4922]: E1122 02:54:01.300907 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:54:01 crc kubenswrapper[4922]: I1122 02:54:01.300962 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2gmkj" Nov 22 02:54:01 crc kubenswrapper[4922]: E1122 02:54:01.301192 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:54:01 crc kubenswrapper[4922]: E1122 02:54:01.301295 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2gmkj" podUID="d5c8000a-a783-474f-a73a-55814c257a02" Nov 22 02:54:01 crc kubenswrapper[4922]: I1122 02:54:01.305728 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:01 crc kubenswrapper[4922]: I1122 02:54:01.305806 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:01 crc kubenswrapper[4922]: I1122 02:54:01.305821 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:01 crc kubenswrapper[4922]: I1122 02:54:01.305862 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:01 crc kubenswrapper[4922]: I1122 02:54:01.305878 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:01Z","lastTransitionTime":"2025-11-22T02:54:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:01 crc kubenswrapper[4922]: I1122 02:54:01.408785 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:01 crc kubenswrapper[4922]: I1122 02:54:01.408883 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:01 crc kubenswrapper[4922]: I1122 02:54:01.408897 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:01 crc kubenswrapper[4922]: I1122 02:54:01.408920 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:01 crc kubenswrapper[4922]: I1122 02:54:01.408935 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:01Z","lastTransitionTime":"2025-11-22T02:54:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:01 crc kubenswrapper[4922]: I1122 02:54:01.512375 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:01 crc kubenswrapper[4922]: I1122 02:54:01.512452 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:01 crc kubenswrapper[4922]: I1122 02:54:01.512474 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:01 crc kubenswrapper[4922]: I1122 02:54:01.512503 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:01 crc kubenswrapper[4922]: I1122 02:54:01.512522 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:01Z","lastTransitionTime":"2025-11-22T02:54:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:01 crc kubenswrapper[4922]: I1122 02:54:01.615931 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:01 crc kubenswrapper[4922]: I1122 02:54:01.616027 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:01 crc kubenswrapper[4922]: I1122 02:54:01.616051 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:01 crc kubenswrapper[4922]: I1122 02:54:01.616085 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:01 crc kubenswrapper[4922]: I1122 02:54:01.616104 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:01Z","lastTransitionTime":"2025-11-22T02:54:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:01 crc kubenswrapper[4922]: I1122 02:54:01.719922 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:01 crc kubenswrapper[4922]: I1122 02:54:01.720012 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:01 crc kubenswrapper[4922]: I1122 02:54:01.720053 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:01 crc kubenswrapper[4922]: I1122 02:54:01.720090 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:01 crc kubenswrapper[4922]: I1122 02:54:01.720112 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:01Z","lastTransitionTime":"2025-11-22T02:54:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:01 crc kubenswrapper[4922]: I1122 02:54:01.823933 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:01 crc kubenswrapper[4922]: I1122 02:54:01.824039 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:01 crc kubenswrapper[4922]: I1122 02:54:01.824066 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:01 crc kubenswrapper[4922]: I1122 02:54:01.824105 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:01 crc kubenswrapper[4922]: I1122 02:54:01.824127 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:01Z","lastTransitionTime":"2025-11-22T02:54:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:01 crc kubenswrapper[4922]: I1122 02:54:01.928226 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:01 crc kubenswrapper[4922]: I1122 02:54:01.928300 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:01 crc kubenswrapper[4922]: I1122 02:54:01.928326 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:01 crc kubenswrapper[4922]: I1122 02:54:01.928360 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:01 crc kubenswrapper[4922]: I1122 02:54:01.928382 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:01Z","lastTransitionTime":"2025-11-22T02:54:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:02 crc kubenswrapper[4922]: I1122 02:54:02.032768 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:02 crc kubenswrapper[4922]: I1122 02:54:02.032894 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:02 crc kubenswrapper[4922]: I1122 02:54:02.032919 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:02 crc kubenswrapper[4922]: I1122 02:54:02.032957 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:02 crc kubenswrapper[4922]: I1122 02:54:02.032984 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:02Z","lastTransitionTime":"2025-11-22T02:54:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:02 crc kubenswrapper[4922]: I1122 02:54:02.136905 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:02 crc kubenswrapper[4922]: I1122 02:54:02.136978 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:02 crc kubenswrapper[4922]: I1122 02:54:02.136997 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:02 crc kubenswrapper[4922]: I1122 02:54:02.137025 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:02 crc kubenswrapper[4922]: I1122 02:54:02.137048 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:02Z","lastTransitionTime":"2025-11-22T02:54:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:02 crc kubenswrapper[4922]: I1122 02:54:02.241660 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:02 crc kubenswrapper[4922]: I1122 02:54:02.241737 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:02 crc kubenswrapper[4922]: I1122 02:54:02.241757 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:02 crc kubenswrapper[4922]: I1122 02:54:02.241785 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:02 crc kubenswrapper[4922]: I1122 02:54:02.241808 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:02Z","lastTransitionTime":"2025-11-22T02:54:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:02 crc kubenswrapper[4922]: I1122 02:54:02.300279 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:54:02 crc kubenswrapper[4922]: E1122 02:54:02.300526 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:54:02 crc kubenswrapper[4922]: I1122 02:54:02.345114 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:02 crc kubenswrapper[4922]: I1122 02:54:02.345170 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:02 crc kubenswrapper[4922]: I1122 02:54:02.345189 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:02 crc kubenswrapper[4922]: I1122 02:54:02.345219 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:02 crc kubenswrapper[4922]: I1122 02:54:02.345239 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:02Z","lastTransitionTime":"2025-11-22T02:54:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:02 crc kubenswrapper[4922]: I1122 02:54:02.448462 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:02 crc kubenswrapper[4922]: I1122 02:54:02.448508 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:02 crc kubenswrapper[4922]: I1122 02:54:02.448523 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:02 crc kubenswrapper[4922]: I1122 02:54:02.448543 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:02 crc kubenswrapper[4922]: I1122 02:54:02.448557 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:02Z","lastTransitionTime":"2025-11-22T02:54:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:02 crc kubenswrapper[4922]: I1122 02:54:02.551255 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:02 crc kubenswrapper[4922]: I1122 02:54:02.551316 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:02 crc kubenswrapper[4922]: I1122 02:54:02.551329 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:02 crc kubenswrapper[4922]: I1122 02:54:02.551354 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:02 crc kubenswrapper[4922]: I1122 02:54:02.551369 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:02Z","lastTransitionTime":"2025-11-22T02:54:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:02 crc kubenswrapper[4922]: I1122 02:54:02.653929 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:02 crc kubenswrapper[4922]: I1122 02:54:02.653993 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:02 crc kubenswrapper[4922]: I1122 02:54:02.654010 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:02 crc kubenswrapper[4922]: I1122 02:54:02.654035 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:02 crc kubenswrapper[4922]: I1122 02:54:02.654054 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:02Z","lastTransitionTime":"2025-11-22T02:54:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:02 crc kubenswrapper[4922]: I1122 02:54:02.757119 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:02 crc kubenswrapper[4922]: I1122 02:54:02.757197 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:02 crc kubenswrapper[4922]: I1122 02:54:02.757215 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:02 crc kubenswrapper[4922]: I1122 02:54:02.757242 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:02 crc kubenswrapper[4922]: I1122 02:54:02.757259 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:02Z","lastTransitionTime":"2025-11-22T02:54:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:02 crc kubenswrapper[4922]: I1122 02:54:02.861601 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:02 crc kubenswrapper[4922]: I1122 02:54:02.861658 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:02 crc kubenswrapper[4922]: I1122 02:54:02.861670 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:02 crc kubenswrapper[4922]: I1122 02:54:02.861690 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:02 crc kubenswrapper[4922]: I1122 02:54:02.861705 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:02Z","lastTransitionTime":"2025-11-22T02:54:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:02 crc kubenswrapper[4922]: I1122 02:54:02.965287 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:02 crc kubenswrapper[4922]: I1122 02:54:02.965375 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:02 crc kubenswrapper[4922]: I1122 02:54:02.965395 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:02 crc kubenswrapper[4922]: I1122 02:54:02.965428 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:02 crc kubenswrapper[4922]: I1122 02:54:02.965450 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:02Z","lastTransitionTime":"2025-11-22T02:54:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:03 crc kubenswrapper[4922]: I1122 02:54:03.069183 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:03 crc kubenswrapper[4922]: I1122 02:54:03.069256 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:03 crc kubenswrapper[4922]: I1122 02:54:03.069274 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:03 crc kubenswrapper[4922]: I1122 02:54:03.069305 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:03 crc kubenswrapper[4922]: I1122 02:54:03.069324 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:03Z","lastTransitionTime":"2025-11-22T02:54:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:03 crc kubenswrapper[4922]: I1122 02:54:03.172400 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:03 crc kubenswrapper[4922]: I1122 02:54:03.172445 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:03 crc kubenswrapper[4922]: I1122 02:54:03.172461 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:03 crc kubenswrapper[4922]: I1122 02:54:03.172486 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:03 crc kubenswrapper[4922]: I1122 02:54:03.172504 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:03Z","lastTransitionTime":"2025-11-22T02:54:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:03 crc kubenswrapper[4922]: I1122 02:54:03.275721 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:03 crc kubenswrapper[4922]: I1122 02:54:03.275796 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:03 crc kubenswrapper[4922]: I1122 02:54:03.275815 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:03 crc kubenswrapper[4922]: I1122 02:54:03.275882 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:03 crc kubenswrapper[4922]: I1122 02:54:03.275903 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:03Z","lastTransitionTime":"2025-11-22T02:54:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:03 crc kubenswrapper[4922]: I1122 02:54:03.299718 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2gmkj" Nov 22 02:54:03 crc kubenswrapper[4922]: I1122 02:54:03.299799 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:54:03 crc kubenswrapper[4922]: I1122 02:54:03.299917 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:03 crc kubenswrapper[4922]: E1122 02:54:03.299988 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2gmkj" podUID="d5c8000a-a783-474f-a73a-55814c257a02" Nov 22 02:54:03 crc kubenswrapper[4922]: E1122 02:54:03.300166 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:54:03 crc kubenswrapper[4922]: E1122 02:54:03.300409 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:54:03 crc kubenswrapper[4922]: I1122 02:54:03.379990 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:03 crc kubenswrapper[4922]: I1122 02:54:03.380051 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:03 crc kubenswrapper[4922]: I1122 02:54:03.380064 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:03 crc kubenswrapper[4922]: I1122 02:54:03.380084 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:03 crc kubenswrapper[4922]: I1122 02:54:03.380094 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:03Z","lastTransitionTime":"2025-11-22T02:54:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:03 crc kubenswrapper[4922]: I1122 02:54:03.484694 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:03 crc kubenswrapper[4922]: I1122 02:54:03.484790 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:03 crc kubenswrapper[4922]: I1122 02:54:03.484816 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:03 crc kubenswrapper[4922]: I1122 02:54:03.484890 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:03 crc kubenswrapper[4922]: I1122 02:54:03.484925 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:03Z","lastTransitionTime":"2025-11-22T02:54:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:03 crc kubenswrapper[4922]: I1122 02:54:03.588654 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:03 crc kubenswrapper[4922]: I1122 02:54:03.588695 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:03 crc kubenswrapper[4922]: I1122 02:54:03.588706 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:03 crc kubenswrapper[4922]: I1122 02:54:03.588726 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:03 crc kubenswrapper[4922]: I1122 02:54:03.588766 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:03Z","lastTransitionTime":"2025-11-22T02:54:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:03 crc kubenswrapper[4922]: I1122 02:54:03.691832 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:03 crc kubenswrapper[4922]: I1122 02:54:03.691983 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:03 crc kubenswrapper[4922]: I1122 02:54:03.692000 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:03 crc kubenswrapper[4922]: I1122 02:54:03.692024 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:03 crc kubenswrapper[4922]: I1122 02:54:03.692046 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:03Z","lastTransitionTime":"2025-11-22T02:54:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:03 crc kubenswrapper[4922]: I1122 02:54:03.795394 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:03 crc kubenswrapper[4922]: I1122 02:54:03.795447 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:03 crc kubenswrapper[4922]: I1122 02:54:03.795464 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:03 crc kubenswrapper[4922]: I1122 02:54:03.795486 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:03 crc kubenswrapper[4922]: I1122 02:54:03.795503 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:03Z","lastTransitionTime":"2025-11-22T02:54:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:03 crc kubenswrapper[4922]: I1122 02:54:03.899081 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:03 crc kubenswrapper[4922]: I1122 02:54:03.899147 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:03 crc kubenswrapper[4922]: I1122 02:54:03.899167 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:03 crc kubenswrapper[4922]: I1122 02:54:03.899195 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:03 crc kubenswrapper[4922]: I1122 02:54:03.899215 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:03Z","lastTransitionTime":"2025-11-22T02:54:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:04 crc kubenswrapper[4922]: I1122 02:54:04.002305 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:04 crc kubenswrapper[4922]: I1122 02:54:04.002376 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:04 crc kubenswrapper[4922]: I1122 02:54:04.002400 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:04 crc kubenswrapper[4922]: I1122 02:54:04.002429 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:04 crc kubenswrapper[4922]: I1122 02:54:04.002455 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:04Z","lastTransitionTime":"2025-11-22T02:54:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:04 crc kubenswrapper[4922]: I1122 02:54:04.105960 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:04 crc kubenswrapper[4922]: I1122 02:54:04.106015 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:04 crc kubenswrapper[4922]: I1122 02:54:04.106032 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:04 crc kubenswrapper[4922]: I1122 02:54:04.106056 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:04 crc kubenswrapper[4922]: I1122 02:54:04.106073 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:04Z","lastTransitionTime":"2025-11-22T02:54:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:04 crc kubenswrapper[4922]: I1122 02:54:04.208992 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:04 crc kubenswrapper[4922]: I1122 02:54:04.209034 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:04 crc kubenswrapper[4922]: I1122 02:54:04.209043 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:04 crc kubenswrapper[4922]: I1122 02:54:04.209077 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:04 crc kubenswrapper[4922]: I1122 02:54:04.209090 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:04Z","lastTransitionTime":"2025-11-22T02:54:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:04 crc kubenswrapper[4922]: I1122 02:54:04.300218 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:54:04 crc kubenswrapper[4922]: E1122 02:54:04.300431 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:54:04 crc kubenswrapper[4922]: I1122 02:54:04.314694 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:04 crc kubenswrapper[4922]: I1122 02:54:04.314764 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:04 crc kubenswrapper[4922]: I1122 02:54:04.314782 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:04 crc kubenswrapper[4922]: I1122 02:54:04.314806 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:04 crc kubenswrapper[4922]: I1122 02:54:04.314829 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:04Z","lastTransitionTime":"2025-11-22T02:54:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:04 crc kubenswrapper[4922]: I1122 02:54:04.418586 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:04 crc kubenswrapper[4922]: I1122 02:54:04.418654 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:04 crc kubenswrapper[4922]: I1122 02:54:04.418672 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:04 crc kubenswrapper[4922]: I1122 02:54:04.418730 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:04 crc kubenswrapper[4922]: I1122 02:54:04.418750 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:04Z","lastTransitionTime":"2025-11-22T02:54:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:04 crc kubenswrapper[4922]: I1122 02:54:04.521913 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:04 crc kubenswrapper[4922]: I1122 02:54:04.521971 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:04 crc kubenswrapper[4922]: I1122 02:54:04.521984 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:04 crc kubenswrapper[4922]: I1122 02:54:04.522003 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:04 crc kubenswrapper[4922]: I1122 02:54:04.522016 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:04Z","lastTransitionTime":"2025-11-22T02:54:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:04 crc kubenswrapper[4922]: I1122 02:54:04.625170 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:04 crc kubenswrapper[4922]: I1122 02:54:04.625238 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:04 crc kubenswrapper[4922]: I1122 02:54:04.625263 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:04 crc kubenswrapper[4922]: I1122 02:54:04.625294 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:04 crc kubenswrapper[4922]: I1122 02:54:04.625318 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:04Z","lastTransitionTime":"2025-11-22T02:54:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:04 crc kubenswrapper[4922]: I1122 02:54:04.729031 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:04 crc kubenswrapper[4922]: I1122 02:54:04.729099 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:04 crc kubenswrapper[4922]: I1122 02:54:04.729123 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:04 crc kubenswrapper[4922]: I1122 02:54:04.729154 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:04 crc kubenswrapper[4922]: I1122 02:54:04.729179 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:04Z","lastTransitionTime":"2025-11-22T02:54:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:04 crc kubenswrapper[4922]: I1122 02:54:04.832342 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:04 crc kubenswrapper[4922]: I1122 02:54:04.832377 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:04 crc kubenswrapper[4922]: I1122 02:54:04.832388 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:04 crc kubenswrapper[4922]: I1122 02:54:04.832406 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:04 crc kubenswrapper[4922]: I1122 02:54:04.832420 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:04Z","lastTransitionTime":"2025-11-22T02:54:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:04 crc kubenswrapper[4922]: I1122 02:54:04.936021 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:04 crc kubenswrapper[4922]: I1122 02:54:04.936085 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:04 crc kubenswrapper[4922]: I1122 02:54:04.936108 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:04 crc kubenswrapper[4922]: I1122 02:54:04.936137 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:04 crc kubenswrapper[4922]: I1122 02:54:04.936161 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:04Z","lastTransitionTime":"2025-11-22T02:54:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.039280 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.039328 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.039340 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.039356 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.039369 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:05Z","lastTransitionTime":"2025-11-22T02:54:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.142920 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.143314 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.143331 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.143351 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.143361 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:05Z","lastTransitionTime":"2025-11-22T02:54:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.247198 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.247251 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.247260 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.247278 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.247288 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:05Z","lastTransitionTime":"2025-11-22T02:54:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.300172 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.300224 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2gmkj" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.300339 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:54:05 crc kubenswrapper[4922]: E1122 02:54:05.300353 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:54:05 crc kubenswrapper[4922]: E1122 02:54:05.300465 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:54:05 crc kubenswrapper[4922]: E1122 02:54:05.300531 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2gmkj" podUID="d5c8000a-a783-474f-a73a-55814c257a02" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.319645 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:05Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.332680 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
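The webhook failure above is pure clock arithmetic: the kubelet's current time is past the certificate's NotAfter. Checking the two RFC 3339 timestamps quoted verbatim in the x509 error:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Both timestamps are quoted verbatim in the x509 error above.
	now, _ := time.Parse(time.RFC3339, "2025-11-22T02:54:05Z")
	notAfter, _ := time.Parse(time.RFC3339, "2025-08-24T17:21:41Z")
	fmt.Printf("certificate expired %.0f days before this check\n",
		now.Sub(notAfter).Hours()/24)
}
```

Roughly 89 days: every status patch routed through pod.network-node-identity.openshift.io will keep failing until that serving certificate is rotated.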
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:05Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.350394 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.350453 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.350470 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.350493 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.350510 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:05Z","lastTransitionTime":"2025-11-22T02:54:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
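To look at the offending certificate directly, one can dial the webhook endpoint named in the error; a sketch, assuming 127.0.0.1:9743 is reachable from the node, with verification disabled on purpose since the chain no longer validates:

```go
package main

import (
	"crypto/tls"
	"fmt"
	"time"
)

func main() {
	// Endpoint taken from the "failed calling webhook" error above.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743",
		&tls.Config{InsecureSkipVerify: true}) // inspecting, not trusting
	if err != nil {
		fmt.Println("dial failed:", err)
		return
	}
	defer conn.Close()
	certs := conn.ConnectionState().PeerCertificates
	if len(certs) == 0 {
		fmt.Println("no peer certificate presented")
		return
	}
	fmt.Println("leaf NotAfter:", certs[0].NotAfter)
	if time.Now().After(certs[0].NotAfter) {
		fmt.Println("expired", time.Since(certs[0].NotAfter).Round(time.Hour), "ago")
	}
}
```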
Has your network provider started?"} Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.352183 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:05Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.367615 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8f0f91018d8ffcc35550e1e567a256d60f9cef7fb07a7c3c21d393e6ff5653f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:05Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.388898 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4gbc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"954bb7b8-d710-4e1a-973e-78c04e685f30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00bf6c08a8a7b113c82bd6c0f6de3dde945667c2f025d7dc3a84aaff43b989d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8705bc3652c9a3f838ffdc44bb26f87689d9eca09105427a3dc25b730967ccd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"2025-11-22T02:53:11+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_13809bfd-4d69-4693-89f6-87ce51a7b8aa\\\\n2025-11-22T02:53:11+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_13809bfd-4d69-4693-89f6-87ce51a7b8aa to /host/opt/cni/bin/\\\\n2025-11-22T02:53:11Z [verbose] multus-daemon started\\\\n2025-11-22T02:53:11Z [verbose] Readiness Indicator file check\\\\n2025-11-22T02:53:56Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52z2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4gbc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:05Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.412949 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-df5ll" err="failed to patch status 
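The kube-multus termination message above shows why that container restarted: it waits for a readiness-indicator file and gives up with "timed out waiting for the condition". A stdlib-only sketch of the shape of that wait (the 1s interval and 45s timeout are illustrative, not multus's actual settings):

```go
package main

import (
	"errors"
	"fmt"
	"os"
	"time"
)

// waitForFile polls until path exists or timeout elapses, the shape of
// multus's readiness-indicator check.
func waitForFile(path string, interval, timeout time.Duration) error {
	deadline := time.Now().Add(timeout)
	for {
		if _, err := os.Stat(path); err == nil {
			return nil
		}
		if time.Now().After(deadline) {
			// multus surfaces this as "timed out waiting for the condition".
			return errors.New("timed out waiting for the condition")
		}
		time.Sleep(interval)
	}
}

func main() {
	// Path quoted in the multus error message above.
	err := waitForFile("/host/run/multus/cni/net.d/10-ovn-kubernetes.conf",
		time.Second, 45*time.Second)
	fmt.Println(err)
}
```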
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5672d56-8abd-4aa4-ac8d-1655896397f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a64d211a2e298509a2677a0d64bf8205f1a62e952ee786f34571d4c1e962b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klcgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-df5ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:05Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.431205 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2gmkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c8000a-a783-474f-a73a-55814c257a02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pk97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pk97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2gmkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:05Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.450198 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0925b79-c7d7-4e10-a883-978c4c4f4aca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9273d5ec12281398617de471c700390678e37a0f25a8f419af589821bbbf82cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b7c6dc11edfeb5cd97d044eba3210a471efebefaf19e244347e856860544e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa47eebd3d997c52252492bb21d0357a5bcda89c3a47713b4d55c5a4e2117ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a43e447ea1ef6069640894c9f4a841797e8ea7a34d6df8d76f9958755e54d5b9\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a43e447ea1ef6069640894c9f4a841797e8ea7a34d6df8d76f9958755e54d5b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:05Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.453206 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.453234 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.453245 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.453264 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.453276 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:05Z","lastTransitionTime":"2025-11-22T02:54:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.477289 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"912136b8-9119-4585-8a8a-fffffe458b82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1364d5e8967ee57d3fe7c3ca85053dbeee65956324a6ff45f53339ee6a380e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1b62ea65c78bb4f19f5fdb2ebed6da4b5554981906058d4d33472e53eecebe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa36ff8bd81008d118d746d10130faac5df3ffdd243309f12b2444acf8e0ac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fe422173d9c9619e9ccbf14fa6b45cd384fb3b32973aa7c32b7597d23818ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1ba37f0d76415fadbf5fe5c3e7b0d0ac1e04923deb445e3ffebaa45f9a3284d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:52:51Z\\\",\\\"message\\\":\\\"W1122 02:52:50.548479 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 02:52:50.548907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763779970 cert, and key in /tmp/serving-cert-1581887030/serving-signer.crt, /tmp/serving-cert-1581887030/serving-signer.key\\\\nI1122 02:52:50.747646 1 observer_polling.go:159] Starting file observer\\\\nW1122 02:52:50.754705 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 02:52:50.754978 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:52:50.756448 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1581887030/tls.crt::/tmp/serving-cert-1581887030/tls.key\\\\\\\"\\\\nF1122 02:52:51.071644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://892025699a91526b7be50d611c0c47a78d080b10f5001683ba5b6614e1168759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:05Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.500036 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c46a871bcc893c29dd8f87750ec890ac870f3f59615c5951f473288a19b460c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:05Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.518404 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ntq2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"087141fd-fc9b-4685-bada-20260ee96369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb4eafebbd378af8579bc424a11b51eeda38398db994f81c408b9bead685713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4fck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ntq2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:05Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.540174 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n664h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"826eb92e-c839-4fba-9737-3b52101fec88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccfd5290a47556c0ac2088ad8d34ab89870aec1d03311431a50dd857a43e5dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0125b85a532a9445c19a61f7bdd47ccd48ee48f067df5c041c6bf98daae9630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0125b85a532a9445c19a61f7bdd47ccd48ee48f067df5c041c6bf98daae9630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n664h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:05Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.555687 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.555735 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:05 crc 
kubenswrapper[4922]: I1122 02:54:05.555747 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.555767 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.555780 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:05Z","lastTransitionTime":"2025-11-22T02:54:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.555893 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v8z92" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da6031ac-2dd7-43c9-b9e2-4c6394e23c1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9c2863c3dce12e19530e9ec5d083788e8426a5f34a325c4a8b5dcbb29b454ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7cbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a6ba782e23e8c6a485ec4c29d9adefe33028022db04c1a66d20e49a73c40a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:5
3:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7cbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v8z92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:05Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.583231 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8552c75b-ff86-4366-ba7d-31f454e4c4d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095da4ba1e99f9564ae3e7c95e0319b9b03a655b824f73a40f378292bd16a613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c36d97db0f554e895c9a66ab9d98a846dab9ea7fdc6fb1694a1e6ba6e8e03caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":t
rue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5806c02c9ab10a6ae71658c5c26a959a0c1af4dc116eebb10456e842e18bc265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e578ec32c48afdd4b1beffcba79284416d5d388027f807e195d552777eb3ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7d8908fd03ef5111feb365e6882679f026d46d0b404e585b0fb248b3ffccc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o:
//b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:05Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.602107 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a581af85a69caa65abe5f7766482dd0b71eaf201d43e354b8316ff616c865193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2d349f9466eb418a64dabae95c139bf23e943909a37b6b651c90e85ff55bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:05Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.619653 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"402683b1-a29f-4a79-a36c-daf6e8068d0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cf0de2e7bcabf2c1beae2a37caaddb2de2365480082bd95a7843816ec8e3ceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d254b4b131f36e4aae4019c29e8c53f3f8df991b85eb6f943fb6d2e9d79552f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9j6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:05Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.650538 4922 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e0fc46c7e9bbed62a72d78126ea7ffcb4de50dc048c8c0fd90bd0b0d8c718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd701fe5abb8c240fb633df69fbda2e7fb1ac4c76195dae8dfd5b1db811c1d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://091e9164e566847ab0605abc3b8f32027bcccff041c48fbf251d28690dcc132d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5603a0776e71297b57ae6a438254d6d8abb317252617fbe3b28707e9f96424d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7854642894dce2837e59481c400522a24a2e66c3f9eb521a370e85b203288af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ceb46477edc2737cbdc850d6387ba175a07e85e69c988f91fc93def44fa1bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34d1b1289ea6a27a227e433d072bfe6e350bbb997ddbc88ceb665eaa37794581\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34d1b1289ea6a27a227e433d072bfe6e350bbb997ddbc88ceb665eaa37794581\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:53:36Z\\\",\\\"message\\\":\\\"lbConfig(nil)\\\\nI1122 02:53:36.370505 6577 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1122 02:53:36.370516 6577 services_controller.go:451] Built service openshift-ingress-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-ingress-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-ingress-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.244\\\\\\\", Port:9393, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1122 02:53:36.370543 6577 services_controller.go:452] Built service openshift-ingress-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1122 02:53:36.370578 6577 services_controller.go:453] Built service openshift-ingress-operator/metrics template LB for network=default: []services.LB{}\\\\nF1122 02:53:36.370585 6577 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-c7wvg_openshift-ovn-kubernetes(c2a6bcd8-bb13-463b-b112-0df3cf90b5f7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://183f71d77b7a1768eee27187f0c1e4a893f5661fa42cf671332e081118ae7dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c7wvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:05Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.658226 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.658537 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.658634 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.658715 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.658785 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:05Z","lastTransitionTime":"2025-11-22T02:54:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.676483 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e5537a6-a41f-446d-ab40-ec8c7a3edc90\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaf643c85775379271a26c718e36d87319342e6797522f91726ce4084925f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205de8f1aeca28c7877b83da70915cd216631f5d7c28b0c9f3bb748a8f98547e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7d163b34cfb34723c07cfa4592a134291185548d4294cb6fcd1d5d8ab63617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9608708be558bf83d1156f620534bd4c98b8f4fb7f96c972be8dc67524f4546c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:05Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.762111 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.762538 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.762665 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.762825 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.762992 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:05Z","lastTransitionTime":"2025-11-22T02:54:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.867049 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.867149 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.867167 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.867191 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.867232 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:05Z","lastTransitionTime":"2025-11-22T02:54:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.876320 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.876589 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.876822 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.877093 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.877274 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:05Z","lastTransitionTime":"2025-11-22T02:54:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:05 crc kubenswrapper[4922]: E1122 02:54:05.896966 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e949e6da-04e3-4086-acd2-53671701d8c1\\\",\\\"systemUUID\\\":\\\"7e87d562-50ca-4b7a-8b5f-35220f4abd2d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:05Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.904036 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.904143 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.904166 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.904831 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.904917 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:05Z","lastTransitionTime":"2025-11-22T02:54:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:05 crc kubenswrapper[4922]: E1122 02:54:05.924978 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e949e6da-04e3-4086-acd2-53671701d8c1\\\",\\\"systemUUID\\\":\\\"7e87d562-50ca-4b7a-8b5f-35220f4abd2d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:05Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.930418 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.930470 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.930487 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.930514 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.930531 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:05Z","lastTransitionTime":"2025-11-22T02:54:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:05 crc kubenswrapper[4922]: E1122 02:54:05.949260 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e949e6da-04e3-4086-acd2-53671701d8c1\\\",\\\"systemUUID\\\":\\\"7e87d562-50ca-4b7a-8b5f-35220f4abd2d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:05Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.953786 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.953831 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.953879 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.953901 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.953919 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:05Z","lastTransitionTime":"2025-11-22T02:54:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:05 crc kubenswrapper[4922]: E1122 02:54:05.972580 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e949e6da-04e3-4086-acd2-53671701d8c1\\\",\\\"systemUUID\\\":\\\"7e87d562-50ca-4b7a-8b5f-35220f4abd2d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:05Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.979416 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.979488 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.979552 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.979585 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:05 crc kubenswrapper[4922]: I1122 02:54:05.979602 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:05Z","lastTransitionTime":"2025-11-22T02:54:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:06 crc kubenswrapper[4922]: E1122 02:54:06.000666 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e949e6da-04e3-4086-acd2-53671701d8c1\\\",\\\"systemUUID\\\":\\\"7e87d562-50ca-4b7a-8b5f-35220f4abd2d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:05Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:06 crc kubenswrapper[4922]: E1122 02:54:06.000959 4922 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.003366 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.003419 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.003437 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.003462 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.003481 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:06Z","lastTransitionTime":"2025-11-22T02:54:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.107414 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.107528 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.107550 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.107642 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.107663 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:06Z","lastTransitionTime":"2025-11-22T02:54:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.211172 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.211223 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.211243 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.211262 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.211274 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:06Z","lastTransitionTime":"2025-11-22T02:54:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.299623 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:54:06 crc kubenswrapper[4922]: E1122 02:54:06.299911 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.301447 4922 scope.go:117] "RemoveContainer" containerID="34d1b1289ea6a27a227e433d072bfe6e350bbb997ddbc88ceb665eaa37794581" Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.314396 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.314617 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.314744 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.314915 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.315070 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:06Z","lastTransitionTime":"2025-11-22T02:54:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.418563 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.418611 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.418625 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.418642 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.418651 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:06Z","lastTransitionTime":"2025-11-22T02:54:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.522585 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.522654 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.522676 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.522708 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.522732 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:06Z","lastTransitionTime":"2025-11-22T02:54:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.626297 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.626362 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.626384 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.626411 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.626430 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:06Z","lastTransitionTime":"2025-11-22T02:54:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.728732 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.728801 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.728820 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.728876 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.728895 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:06Z","lastTransitionTime":"2025-11-22T02:54:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.831914 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.831997 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.832012 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.832051 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.832065 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:06Z","lastTransitionTime":"2025-11-22T02:54:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.931364 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c7wvg_c2a6bcd8-bb13-463b-b112-0df3cf90b5f7/ovnkube-controller/2.log" Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.933600 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.933648 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.933663 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.933684 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.933697 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:06Z","lastTransitionTime":"2025-11-22T02:54:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.934476 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" event={"ID":"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7","Type":"ContainerStarted","Data":"31c2ef05aecb9ad4cf8ad1f7c2ddc7726e5b94201af8425c07707fedea3d20aa"} Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.935042 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.967587 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8552c75b-ff86-4366-ba7d-31f454e4c4d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095da4ba1e99f9564ae3e7c95e0319b9b03a655b824f73a40f378292bd16a613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c36d97db0f554e895c9a66ab9d98a846dab9ea7fdc6fb1694a1e6ba6e8e03caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5806c02c9ab10a6ae71658c5c26a959a0c1af4dc116eebb10456e842e18bc265\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e578ec32c48afdd4b1beffcba79284416d5d388027f807e195d552777eb3ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7d8908fd03ef5111feb365e6882679f026d46d0b404e585b0fb248b3ffccc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7
c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:06Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.982029 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"912136b8-9119-4585-8a8a-fffffe458b82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1364d5e8967ee57d3fe7c3ca85053dbeee65956324a6ff45f53339ee6a380e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1b62ea65c78bb4f19f5fdb2ebed6da4b5554981906058d4d33472e53eecebe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa36ff8bd81008d118d746d10130faac5df3ffdd243309f12b2444acf8e0ac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fe422173d9c9619e9ccbf14fa6b45cd384fb3b32973aa7c32b7597d23818ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1ba37f0d76415fadbf5fe5c3e7b0d0ac1e04923deb445e3ffebaa45f9a3284d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:52:51Z\\\",\\\"message\\\":\\\"W1122 02:52:50.548479 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 02:52:50.548907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763779970 cert, and key in /tmp/serving-cert-1581887030/serving-signer.crt, /tmp/serving-cert-1581887030/serving-signer.key\\\\nI1122 02:52:50.747646 1 observer_polling.go:159] Starting file observer\\\\nW1122 02:52:50.754705 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 02:52:50.754978 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:52:50.756448 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1581887030/tls.crt::/tmp/serving-cert-1581887030/tls.key\\\\\\\"\\\\nF1122 02:52:51.071644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://892025699a91526b7be50d611c0c47a78d080b10f5001683ba5b6614e1168759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:06Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:06 crc kubenswrapper[4922]: I1122 02:54:06.995564 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c46a871bcc893c29dd8f87750ec890ac870f3f59615c5951f473288a19b460c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:06Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.006636 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ntq2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"087141fd-fc9b-4685-bada-20260ee96369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb4eafebbd378af8579bc424a11b51eeda38398db994f81c408b9bead685713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4fck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ntq2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:07Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.024163 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n664h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"826eb92e-c839-4fba-9737-3b52101fec88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccfd5290a47556c0ac2088ad8d34ab89870aec1d03311431a50dd857a43e5dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0125b85a532a9445c19a61f7bdd47ccd48ee48f067df5c041c6bf98daae9630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0125b85a532a9445c19a61f7bdd47ccd48ee48f067df5c041c6bf98daae9630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n664h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:07Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.036676 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.036733 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:07 crc 
kubenswrapper[4922]: I1122 02:54:07.036746 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.036765 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.036778 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:07Z","lastTransitionTime":"2025-11-22T02:54:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.039566 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v8z92" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da6031ac-2dd7-43c9-b9e2-4c6394e23c1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9c2863c3dce12e19530e9ec5d083788e8426a5f34a325c4a8b5dcbb29b454ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7cbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a6ba782e23e8c6a485ec4c29d9adefe33028022db04c1a66d20e49a73c40a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:5
3:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7cbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v8z92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:07Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.052959 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a581af85a69caa65abe5f7766482dd0b71eaf201d43e354b8316ff616c865193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2d349f9466eb418a64dabae95c139bf23e943909a37b6b651c90e85ff55bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"
webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:07Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.066763 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e5537a6-a41f-446d-ab40-ec8c7a3edc90\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaf643c85775379271a26c718e36d87319342e6797522f91726ce4084925f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205de8f1aeca28c7877b83da70915cd216631f5d7c28b0c9f3bb748a8f98547e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7d163b34cfb34723c07cf
a4592a134291185548d4294cb6fcd1d5d8ab63617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9608708be558bf83d1156f620534bd4c98b8f4fb7f96c972be8dc67524f4546c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:07Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.081273 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"402683b1-a29f-4a79-a36c-daf6e8068d0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cf0de2e7bcabf2c1beae2a37caaddb2de2365480082bd95a7843816ec8e3ceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d254b4b131f36e4aae4019c29e8c53f3f8df991b85eb6f943fb6d2e9d79552f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9j6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:07Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.102654 4922 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e0fc46c7e9bbed62a72d78126ea7ffcb4de50dc048c8c0fd90bd0b0d8c718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd701fe5abb8c240fb633df69fbda2e7fb1ac4c76195dae8dfd5b1db811c1d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://091e9164e566847ab0605abc3b8f32027bcccff041c48fbf251d28690dcc132d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5603a0776e71297b57ae6a438254d6d8abb317252617fbe3b28707e9f96424d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7854642894dce2837e59481c400522a24a2e66c3f9eb521a370e85b203288af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ceb46477edc2737cbdc850d6387ba175a07e85e69c988f91fc93def44fa1bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31c2ef05aecb9ad4cf8ad1f7c2ddc7726e5b94201af8425c07707fedea3d20aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34d1b1289ea6a27a227e433d072bfe6e350bbb997ddbc88ceb665eaa37794581\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:53:36Z\\\",\\\"message\\\":\\\"lbConfig(nil)\\\\nI1122 02:53:36.370505 6577 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1122 02:53:36.370516 6577 services_controller.go:451] Built service openshift-ingress-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-ingress-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-ingress-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.244\\\\\\\", Port:9393, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1122 02:53:36.370543 6577 services_controller.go:452] Built service openshift-ingress-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1122 02:53:36.370578 6577 services_controller.go:453] Built service openshift-ingress-operator/metrics template LB for network=default: []services.LB{}\\\\nF1122 02:53:36.370585 6577 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://183f71d77b7a1768eee27187f0c1e4a893f5661fa42cf671332e081118ae7dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c7wvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:07Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.118956 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-df5ll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5672d56-8abd-4aa4-ac8d-1655896397f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a64d211a2e298509a2677a0d64bf8205f1a62e952ee786f34571d4c1e962b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klcgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-df5ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:07Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.140227 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.140294 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.140311 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.140335 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.140352 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:07Z","lastTransitionTime":"2025-11-22T02:54:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.143301 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2gmkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c8000a-a783-474f-a73a-55814c257a02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pk97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pk97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2gmkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:07Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.169133 4922 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0925b79-c7d7-4e10-a883-978c4c4f4aca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9273d5ec12281398617de471c700390678e37a0f25a8f419af589821bbbf82cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b7c6dc11edfeb5cd97d044eba3210a471efebefaf19e244347e856860544e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa47eebd3d997c52252492bb21d0357a5bcda89c3a47713b4d55c5a4e2117ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"c
ontainerID\\\":\\\"cri-o://a43e447ea1ef6069640894c9f4a841797e8ea7a34d6df8d76f9958755e54d5b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a43e447ea1ef6069640894c9f4a841797e8ea7a34d6df8d76f9958755e54d5b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:07Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.183079 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:07Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.195386 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:07Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.206961 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:07Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.222283 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8f0f91018d8ffcc35550e1e567a256d60f9cef7fb07a7c3c21d393e6ff5653f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:07Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.235272 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4gbc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"954bb7b8-d710-4e1a-973e-78c04e685f30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00bf6c08a8a7b113c82bd6c0f6de3dde945667c2f025d7dc3a84aaff43b989d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8705bc3652c9a3f838ffdc44bb26f87689d9eca09105427a3dc25b730967ccd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"2025-11-22T02:53:11+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_13809bfd-4d69-4693-89f6-87ce51a7b8aa\\\\n2025-11-22T02:53:11+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_13809bfd-4d69-4693-89f6-87ce51a7b8aa to /host/opt/cni/bin/\\\\n2025-11-22T02:53:11Z [verbose] multus-daemon started\\\\n2025-11-22T02:53:11Z [verbose] Readiness Indicator file check\\\\n2025-11-22T02:53:56Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52z2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4gbc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:07Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.243027 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.243071 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.243082 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.243097 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.243107 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:07Z","lastTransitionTime":"2025-11-22T02:54:07Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.300465 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2gmkj" Nov 22 02:54:07 crc kubenswrapper[4922]: E1122 02:54:07.300650 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2gmkj" podUID="d5c8000a-a783-474f-a73a-55814c257a02" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.300757 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.300836 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:07 crc kubenswrapper[4922]: E1122 02:54:07.301045 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:54:07 crc kubenswrapper[4922]: E1122 02:54:07.301171 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.346378 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.346428 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.346439 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.346461 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.346487 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:07Z","lastTransitionTime":"2025-11-22T02:54:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.449491 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.449540 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.449552 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.449574 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.449588 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:07Z","lastTransitionTime":"2025-11-22T02:54:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.553051 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.553125 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.553144 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.553173 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.553198 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:07Z","lastTransitionTime":"2025-11-22T02:54:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.655902 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.655966 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.655983 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.656012 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.656032 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:07Z","lastTransitionTime":"2025-11-22T02:54:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.758638 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.758675 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.758687 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.758705 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.758719 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:07Z","lastTransitionTime":"2025-11-22T02:54:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.862174 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.862246 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.862266 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.862291 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.862309 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:07Z","lastTransitionTime":"2025-11-22T02:54:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.940962 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c7wvg_c2a6bcd8-bb13-463b-b112-0df3cf90b5f7/ovnkube-controller/3.log" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.941827 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c7wvg_c2a6bcd8-bb13-463b-b112-0df3cf90b5f7/ovnkube-controller/2.log" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.946082 4922 generic.go:334] "Generic (PLEG): container finished" podID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerID="31c2ef05aecb9ad4cf8ad1f7c2ddc7726e5b94201af8425c07707fedea3d20aa" exitCode=1 Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.946153 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" event={"ID":"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7","Type":"ContainerDied","Data":"31c2ef05aecb9ad4cf8ad1f7c2ddc7726e5b94201af8425c07707fedea3d20aa"} Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.946224 4922 scope.go:117] "RemoveContainer" containerID="34d1b1289ea6a27a227e433d072bfe6e350bbb997ddbc88ceb665eaa37794581" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.947275 4922 scope.go:117] "RemoveContainer" containerID="31c2ef05aecb9ad4cf8ad1f7c2ddc7726e5b94201af8425c07707fedea3d20aa" Nov 22 02:54:07 crc kubenswrapper[4922]: E1122 02:54:07.947636 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-c7wvg_openshift-ovn-kubernetes(c2a6bcd8-bb13-463b-b112-0df3cf90b5f7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" podUID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.965336 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.965394 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.965412 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.965438 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.965458 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:07Z","lastTransitionTime":"2025-11-22T02:54:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:07 crc kubenswrapper[4922]: I1122 02:54:07.968472 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v8z92" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da6031ac-2dd7-43c9-b9e2-4c6394e23c1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9c2863c3dce12e19530e9ec5d083788e8426a5f34a325c4a8b5dcbb29b454ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7cbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a6ba782e23e8c6a485ec4c29d9adefe33028022db04c1a66d20e49a73c40a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7cbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v8z92\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:07Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.003310 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8552c75b-ff86-4366-ba7d-31f454e4c4d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095da4ba1e99f9564ae3e7c95e0319b9b03a655b824f73a40f378292bd16a613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c36d97db0f554e895c9a66ab9d98a846dab9ea7fdc6fb1694a1e6ba6e8e03caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5806c02c9ab10a6ae71658c5c26a959a0c1af4dc116eebb10456e842e18bc265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e578ec32c48afdd4b1beffcba79284416d5d388027f807e195d552777eb3ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7d8908fd03ef5111feb365e6882679f026d46d0b404e585b0fb248b3ffccc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:08Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.029379 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"912136b8-9119-4585-8a8a-fffffe458b82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1364d5e8967ee57d3fe7c3ca85053dbeee65956324a6ff45f53339ee6a380e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1b62ea65c78bb4f19f5fdb2ebed6da4b5554981906058d4d33472e53eecebe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa36ff8bd81008d118d746d10130faac5df3ffdd243309f12b2444acf8e0ac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fe422173d9c9619e9ccbf14fa6b45cd384fb3b32973aa7c32b7597d23818ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1ba37f0d76415fadbf5fe5c3e7b0d0ac1e04923deb445e3ffebaa45f9a3284d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:52:51Z\\\",\\\"message\\\":\\\"W1122 02:52:50.548479 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 02:52:50.548907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763779970 cert, and key in /tmp/serving-cert-1581887030/serving-signer.crt, /tmp/serving-cert-1581887030/serving-signer.key\\\\nI1122 02:52:50.747646 1 observer_polling.go:159] Starting file observer\\\\nW1122 02:52:50.754705 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 02:52:50.754978 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:52:50.756448 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1581887030/tls.crt::/tmp/serving-cert-1581887030/tls.key\\\\\\\"\\\\nF1122 02:52:51.071644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://892025699a91526b7be50d611c0c47a78d080b10f5001683ba5b6614e1168759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:08Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.054572 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c46a871bcc893c29dd8f87750ec890ac870f3f59615c5951f473288a19b460c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:08Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.068409 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.068714 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.068939 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.069126 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.069276 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:08Z","lastTransitionTime":"2025-11-22T02:54:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.074263 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ntq2p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"087141fd-fc9b-4685-bada-20260ee96369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb4eafebbd378af8579bc424a11b51eeda38398db994f81c408b9bead685713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4fck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ntq2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:08Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.101950 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n664h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"826eb92e-c839-4fba-9737-3b52101fec88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccfd5290a47556c0ac2088ad8d34ab89870aec1d03311431a50dd857a43e5dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0125b85a532a9445c19a61f7bdd47ccd48ee48f067df5c041c6bf98daae9630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0125b85a532a9445c19a61f7bdd47ccd48ee48f067df5c041c6bf98daae9630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n664h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:08Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.124958 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a581af85a69caa65abe5f7766482dd0b71eaf201d43e354b8316ff616c865193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2d349f9466eb418a64dabae95c139bf23e943909a37b6b651c90e85ff55bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:08Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.149120 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e5537a6-a41f-446d-ab40-ec8c7a3edc90\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaf643c85775379271a26c718e36d87319342e6797522f91726ce4084925f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205de8f1aeca28c7877b83da70915cd216631f5d7c28b0c9f3bb748a8f98547e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7d163b34cfb34723c07cfa4592a134291185548d4294cb6fcd1d5d8ab63617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9608708be558bf83d1156f620534bd4c98b8f4fb7f96c972be8dc67524f4546c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:08Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.171236 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"402683b1-a29f-4a79-a36c-daf6e8068d0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cf0de2e7bcabf2c1beae2a37caaddb2de2365480082bd95a7843816ec8e3ceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d254b4b131f36e4aae
4019c29e8c53f3f8df991b85eb6f943fb6d2e9d79552f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9j6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:08Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.172467 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.172891 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.173077 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.173245 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.173447 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:08Z","lastTransitionTime":"2025-11-22T02:54:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.200643 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e0fc46c7e9bbed62a72d78126ea7ffcb4de50dc048c8c0fd90bd0b0d8c718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd701fe5abb8c240fb633df69fbda2e7fb1ac4c76195dae8dfd5b1db811c1d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://091e9164e566847ab0605abc3b8f32027bcccff041c48fbf251d28690dcc132d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5603a0776e71297b57ae6a438254d6d8abb317252617fbe3b28707e9f96424d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7854642894dce2837e59481c400522a24a2e66c3f9eb521a370e85b203288af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ceb46477edc2737cbdc850d6387ba175a07e85e69c988f91fc93def44fa1bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31c2ef05aecb9ad4cf8ad1f7c2ddc7726e5b94201af8425c07707fedea3d20aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://34d1b1289ea6a27a227e433d072bfe6e350bbb997ddbc88ceb665eaa37794581\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:53:36Z\\\",\\\"message\\\":\\\"lbConfig(nil)\\\\nI1122 02:53:36.370505 6577 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1122 02:53:36.370516 6577 services_controller.go:451] Built service openshift-ingress-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-ingress-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-ingress-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.244\\\\\\\", Port:9393, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1122 02:53:36.370543 6577 services_controller.go:452] Built service openshift-ingress-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1122 02:53:36.370578 6577 services_controller.go:453] Built service openshift-ingress-operator/metrics template LB for network=default: []services.LB{}\\\\nF1122 02:53:36.370585 6577 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31c2ef05aecb9ad4cf8ad1f7c2ddc7726e5b94201af8425c07707fedea3d20aa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:54:07Z\\\",\\\"message\\\":\\\"rotocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.119:443: 10.217.5.119:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d4efc4a8-c514-4a6b-901c-2953978b50d3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1122 02:54:07.381785 6974 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-marketplace/redhat-operators]} name:Service_openshift-marketplace/redhat-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.138:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {97419c58-41c7-41d7-a137-a446f0c7eeb3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1122 02:54:07.381682 6974 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://183f71d77b7a1768eee27187f0c1e4a893f5661fa42cf671332e081118ae7dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c7wvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:08Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.222344 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4gbc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"954bb7b8-d710-4e1a-973e-78c04e685f30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00bf6c08a8a7b113c82bd6c0f6de3dde945667c2f025d7dc3a84aaff43b989d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8705bc3652c9a3f838ffdc44bb26f87689d9eca09105427a3dc25b730967ccd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"2025-11-22T02:53:11+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_13809bfd-4d69-4693-89f6-87ce51a7b8aa\\\\n2025-11-22T02:53:11+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_13809bfd-4d69-4693-89f6-87ce51a7b8aa to /host/opt/cni/bin/\\\\n2025-11-22T02:53:11Z [verbose] 
multus-daemon started\\\\n2025-11-22T02:53:11Z [verbose] Readiness Indicator file check\\\\n2025-11-22T02:53:56Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52z2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4gbc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:08Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.239118 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-df5ll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5672d56-8abd-4aa4-ac8d-1655896397f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a64d211a2e298509a2677a0d64bf8205f1a62e952ee786f34571d4c1e962b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klcgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-df5ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:08Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.255970 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2gmkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c8000a-a783-474f-a73a-55814c257a02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pk97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pk97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2gmkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:08Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.274942 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0925b79-c7d7-4e10-a883-978c4c4f4aca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9273d5ec12281398617de471c700390678e37a0f25a8f419af589821bbbf82cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b7c6dc11edfeb5cd97d044eba3210a471efebefaf19e244347e856860544e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa47eebd3d997c52252492bb21d0357a5bcda89c3a47713b4d55c5a4e2117ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a43e447ea1ef6069640894c9f4a841797e8ea7a34d6df8d76f9958755e54d5b9\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a43e447ea1ef6069640894c9f4a841797e8ea7a34d6df8d76f9958755e54d5b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:08Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.277786 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.277867 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.277887 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.277915 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.277933 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:08Z","lastTransitionTime":"2025-11-22T02:54:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.295250 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:08Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.299494 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:54:08 crc kubenswrapper[4922]: E1122 02:54:08.299677 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.313988 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:08Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.328315 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:08Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.341563 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8f0f91018d8ffcc35550e1e567a256d60f9cef7fb07a7c3c21d393e6ff5653f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:08Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.380825 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.380929 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.380947 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.380974 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.380995 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:08Z","lastTransitionTime":"2025-11-22T02:54:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.484217 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.484538 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.484710 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.484967 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.485149 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:08Z","lastTransitionTime":"2025-11-22T02:54:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.588694 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.588733 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.588743 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.588760 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.588773 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:08Z","lastTransitionTime":"2025-11-22T02:54:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.691910 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.691991 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.692015 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.692047 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.692070 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:08Z","lastTransitionTime":"2025-11-22T02:54:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.795208 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.795360 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.795389 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.795420 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.795443 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:08Z","lastTransitionTime":"2025-11-22T02:54:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.899232 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.899303 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.899325 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.899353 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.899375 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:08Z","lastTransitionTime":"2025-11-22T02:54:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.953730 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c7wvg_c2a6bcd8-bb13-463b-b112-0df3cf90b5f7/ovnkube-controller/3.log" Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.959618 4922 scope.go:117] "RemoveContainer" containerID="31c2ef05aecb9ad4cf8ad1f7c2ddc7726e5b94201af8425c07707fedea3d20aa" Nov 22 02:54:08 crc kubenswrapper[4922]: E1122 02:54:08.959967 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-c7wvg_openshift-ovn-kubernetes(c2a6bcd8-bb13-463b-b112-0df3cf90b5f7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" podUID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" Nov 22 02:54:08 crc kubenswrapper[4922]: I1122 02:54:08.979397 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v8z92" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da6031ac-2dd7-43c9-b9e2-4c6394e23c1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9c2863c3dce12e19530e9ec5d083788e8426a5f34a325c4a8b5dcbb29b454ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7cbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a6ba782e23e8c6a485ec4c29d9adefe33028022db04c1a66d20e49a73c40a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7cbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v8z92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:08Z is after 2025-08-24T17:21:41Z" Nov 22 
02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.002989 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.003065 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.003087 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.003110 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.003127 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:09Z","lastTransitionTime":"2025-11-22T02:54:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.019517 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8552c75b-ff86-4366-ba7d-31f454e4c4d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095da4ba1e99f9564ae3e7c95e0319b9b03a655b824f73a40f378292bd16a613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c36d97db0f554e895c9a66ab9d98a846dab9ea7fdc6fb1694a1e6ba6e8e03caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9009
2272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5806c02c9ab10a6ae71658c5c26a959a0c1af4dc116eebb10456e842e18bc265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e578ec32c48afdd4b1beffcba79284416d5d388027f807e195d552777eb3ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7d8908fd03ef5111feb365e6882679f026d46d0b404e585b0fb248b3ffccc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.044155 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"912136b8-9119-4585-8a8a-fffffe458b82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1364d5e8967ee57d3fe7c3ca85053dbeee65956324a6ff45f53339ee6a380e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1b62ea65c78bb4f19f5fdb2ebed6da4b5554981906058d4d33472e53eecebe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa36ff8bd81008d118d746d10130faac5df3ffdd243309f12b2444acf8e0ac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fe422173d9c9619e9ccbf14fa6b45cd384fb3b32973aa7c32b7597d23818ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1ba37f0d76415fadbf5fe5c3e7b0d0ac1e04923deb445e3ffebaa45f9a3284d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:52:51Z\\\",\\\"message\\\":\\\"W1122 02:52:50.548479 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 02:52:50.548907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763779970 cert, and key in /tmp/serving-cert-1581887030/serving-signer.crt, /tmp/serving-cert-1581887030/serving-signer.key\\\\nI1122 02:52:50.747646 1 observer_polling.go:159] Starting file observer\\\\nW1122 02:52:50.754705 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 02:52:50.754978 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:52:50.756448 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1581887030/tls.crt::/tmp/serving-cert-1581887030/tls.key\\\\\\\"\\\\nF1122 02:52:51.071644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://892025699a91526b7be50d611c0c47a78d080b10f5001683ba5b6614e1168759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.068086 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c46a871bcc893c29dd8f87750ec890ac870f3f59615c5951f473288a19b460c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.087173 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ntq2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"087141fd-fc9b-4685-bada-20260ee96369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb4eafebbd378af8579bc424a11b51eeda38398db994f81c408b9bead685713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4fck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ntq2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.106650 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.106711 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.106733 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.106767 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.106789 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:09Z","lastTransitionTime":"2025-11-22T02:54:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.114180 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n664h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"826eb92e-c839-4fba-9737-3b52101fec88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccfd5290a47556c0ac2088ad8d34ab89870aec1d03311431a50dd857a43e5dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b083638
0bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
ntrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0125b85a532a9445c19a61f7bdd47ccd48ee48f067df5c041c6bf98daae9630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0125b85a532a9445c19a61f7bdd47ccd48ee48f067df5c041c6bf98daae9630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n664h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.135106 4922 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a581af85a69caa65abe5f7766482dd0b71eaf201d43e354b8316ff616c865193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2d349f9466eb418a64dabae95c139bf23e943909a37b6b651c90e85ff55bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.157473 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e5537a6-a41f-446d-ab40-ec8c7a3edc90\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaf643c85775379271a26c718e36d87319342e6797522f91726ce4084925f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205de8f1aeca28c7877b83da70915cd216631f5d7c28b0c9f3bb748a8f98547e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7d163b34cfb34723c07cfa4592a134291185548d4294cb6fcd1d5d8ab63617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9608708be558bf83d1156f620534bd4c98b8f4fb7f96c972be8dc67524f4546c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.175520 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"402683b1-a29f-4a79-a36c-daf6e8068d0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cf0de2e7bcabf2c1beae2a37caaddb2de2365480082bd95a7843816ec8e3ceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d254b4b131f36e4aae
4019c29e8c53f3f8df991b85eb6f943fb6d2e9d79552f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9j6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.210912 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.210983 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.211002 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.211030 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.211052 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:09Z","lastTransitionTime":"2025-11-22T02:54:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.230936 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e0fc46c7e9bbed62a72d78126ea7ffcb4de50dc048c8c0fd90bd0b0d8c718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd701fe5abb8c240fb633df69fbda2e7fb1ac4c76195dae8dfd5b1db811c1d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://091e9164e566847ab0605abc3b8f32027bcccff041c48fbf251d28690dcc132d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5603a0776e71297b57ae6a438254d6d8abb317252617fbe3b28707e9f96424d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7854642894dce2837e59481c400522a24a2e66c3f9eb521a370e85b203288af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ceb46477edc2737cbdc850d6387ba175a07e85e69c988f91fc93def44fa1bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31c2ef05aecb9ad4cf8ad1f7c2ddc7726e5b94201af8425c07707fedea3d20aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31c2ef05aecb9ad4cf8ad1f7c2ddc7726e5b94201af8425c07707fedea3d20aa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:54:07Z\\\",\\\"message\\\":\\\"rotocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.119:443: 10.217.5.119:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d4efc4a8-c514-4a6b-901c-2953978b50d3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1122 02:54:07.381785 6974 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-marketplace/redhat-operators]} name:Service_openshift-marketplace/redhat-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.138:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {97419c58-41c7-41d7-a137-a446f0c7eeb3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1122 02:54:07.381682 6974 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-c7wvg_openshift-ovn-kubernetes(c2a6bcd8-bb13-463b-b112-0df3cf90b5f7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://183f71d77b7a1768eee27187f0c1e4a893f5661fa42cf671332e081118ae7dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recurs
iveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c7wvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.264981 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4gbc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"954bb7b8-d710-4e1a-973e-78c04e685f30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00bf6c08a8a7b113c82bd6c0f6de3dde945667c2f025d7dc3a84aaff43b989d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8705bc3652c9a3f838ffdc44bb26f87689d9eca09105427a3dc25b730967ccd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"2025-11-22T02:53:11+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_13809bfd-4d69-4693-89f6-87ce51a7b8aa\\\\n2025-11-22T02:53:11+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_13809bfd-4d69-4693-89f6-87ce51a7b8aa to /host/opt/cni/bin/\\\\n2025-11-22T02:53:11Z [verbose] multus-daemon started\\\\n2025-11-22T02:53:11Z [verbose] Readiness Indicator file check\\\\n2025-11-22T02:53:56Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52z2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4gbc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.277008 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-df5ll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5672d56-8abd-4aa4-ac8d-1655896397f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a64d211a2e298509a2677a0d64bf8205f1a62e952ee786f34571d4c1e962b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klcgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-df5ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.287737 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2gmkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c8000a-a783-474f-a73a-55814c257a02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pk97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pk97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2gmkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.300165 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.300257 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2gmkj" Nov 22 02:54:09 crc kubenswrapper[4922]: E1122 02:54:09.300388 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.300426 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.300484 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0925b79-c7d7-4e10-a883-978c4c4f4aca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9273d5ec12281398617de471c700390678e37a0f25a8f419af589821bbbf82cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b7c6dc11edfeb5cd97d044eba3210a471efebefaf19e244347e856860544e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa47eebd3d997c52252492bb21d0357a5bcda89c3a47713b4d55c5a4e2117ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a43e447ea1ef6069640894c9f4a841797e8ea7a34d6df8d76f9958755e54d5b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a43e447ea1ef6069640894c9f4a841797e8ea7a34d6df8d76f9958755e54d5b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:09 crc kubenswrapper[4922]: E1122 02:54:09.300989 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:54:09 crc kubenswrapper[4922]: E1122 02:54:09.301065 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2gmkj" podUID="d5c8000a-a783-474f-a73a-55814c257a02" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.313534 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.313602 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.313615 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.313631 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.313645 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:09Z","lastTransitionTime":"2025-11-22T02:54:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.315676 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.331038 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.349709 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.365784 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8f0f91018d8ffcc35550e1e567a256d60f9cef7fb07a7c3c21d393e6ff5653f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:09Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.416349 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:09 
crc kubenswrapper[4922]: I1122 02:54:09.416414 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.416432 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.416457 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.416474 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:09Z","lastTransitionTime":"2025-11-22T02:54:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.524758 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.525164 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.525292 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.525460 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.525590 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:09Z","lastTransitionTime":"2025-11-22T02:54:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.628656 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.628937 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.629030 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.629170 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.629252 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:09Z","lastTransitionTime":"2025-11-22T02:54:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.732561 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.732621 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.732638 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.732664 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.732683 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:09Z","lastTransitionTime":"2025-11-22T02:54:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.836117 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.836646 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.836879 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.837070 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.837224 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:09Z","lastTransitionTime":"2025-11-22T02:54:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.940460 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.940519 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.940536 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.940562 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:09 crc kubenswrapper[4922]: I1122 02:54:09.940579 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:09Z","lastTransitionTime":"2025-11-22T02:54:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:10 crc kubenswrapper[4922]: I1122 02:54:10.044073 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:10 crc kubenswrapper[4922]: I1122 02:54:10.044143 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:10 crc kubenswrapper[4922]: I1122 02:54:10.044166 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:10 crc kubenswrapper[4922]: I1122 02:54:10.044191 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:10 crc kubenswrapper[4922]: I1122 02:54:10.044210 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:10Z","lastTransitionTime":"2025-11-22T02:54:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:10 crc kubenswrapper[4922]: I1122 02:54:10.147561 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:10 crc kubenswrapper[4922]: I1122 02:54:10.147626 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:10 crc kubenswrapper[4922]: I1122 02:54:10.147641 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:10 crc kubenswrapper[4922]: I1122 02:54:10.147665 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:10 crc kubenswrapper[4922]: I1122 02:54:10.147681 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:10Z","lastTransitionTime":"2025-11-22T02:54:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:10 crc kubenswrapper[4922]: I1122 02:54:10.251024 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:10 crc kubenswrapper[4922]: I1122 02:54:10.251110 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:10 crc kubenswrapper[4922]: I1122 02:54:10.251130 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:10 crc kubenswrapper[4922]: I1122 02:54:10.251154 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:10 crc kubenswrapper[4922]: I1122 02:54:10.251175 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:10Z","lastTransitionTime":"2025-11-22T02:54:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:10 crc kubenswrapper[4922]: I1122 02:54:10.300307 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:54:10 crc kubenswrapper[4922]: E1122 02:54:10.300560 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:54:10 crc kubenswrapper[4922]: I1122 02:54:10.355128 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:10 crc kubenswrapper[4922]: I1122 02:54:10.355194 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:10 crc kubenswrapper[4922]: I1122 02:54:10.355213 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:10 crc kubenswrapper[4922]: I1122 02:54:10.355239 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:10 crc kubenswrapper[4922]: I1122 02:54:10.355260 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:10Z","lastTransitionTime":"2025-11-22T02:54:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:10 crc kubenswrapper[4922]: I1122 02:54:10.459111 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:10 crc kubenswrapper[4922]: I1122 02:54:10.459246 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:10 crc kubenswrapper[4922]: I1122 02:54:10.459271 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:10 crc kubenswrapper[4922]: I1122 02:54:10.459295 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:10 crc kubenswrapper[4922]: I1122 02:54:10.459314 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:10Z","lastTransitionTime":"2025-11-22T02:54:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:10 crc kubenswrapper[4922]: I1122 02:54:10.563120 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:10 crc kubenswrapper[4922]: I1122 02:54:10.563622 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:10 crc kubenswrapper[4922]: I1122 02:54:10.563812 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:10 crc kubenswrapper[4922]: I1122 02:54:10.564034 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:10 crc kubenswrapper[4922]: I1122 02:54:10.564228 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:10Z","lastTransitionTime":"2025-11-22T02:54:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:10 crc kubenswrapper[4922]: I1122 02:54:10.667701 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:10 crc kubenswrapper[4922]: I1122 02:54:10.667746 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:10 crc kubenswrapper[4922]: I1122 02:54:10.667760 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:10 crc kubenswrapper[4922]: I1122 02:54:10.667779 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:10 crc kubenswrapper[4922]: I1122 02:54:10.667793 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:10Z","lastTransitionTime":"2025-11-22T02:54:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:10 crc kubenswrapper[4922]: I1122 02:54:10.770552 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:10 crc kubenswrapper[4922]: I1122 02:54:10.770603 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:10 crc kubenswrapper[4922]: I1122 02:54:10.770614 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:10 crc kubenswrapper[4922]: I1122 02:54:10.770632 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:10 crc kubenswrapper[4922]: I1122 02:54:10.770646 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:10Z","lastTransitionTime":"2025-11-22T02:54:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:10 crc kubenswrapper[4922]: I1122 02:54:10.874156 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:10 crc kubenswrapper[4922]: I1122 02:54:10.874240 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:10 crc kubenswrapper[4922]: I1122 02:54:10.874264 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:10 crc kubenswrapper[4922]: I1122 02:54:10.874295 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:10 crc kubenswrapper[4922]: I1122 02:54:10.874318 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:10Z","lastTransitionTime":"2025-11-22T02:54:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:10 crc kubenswrapper[4922]: I1122 02:54:10.977116 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:10 crc kubenswrapper[4922]: I1122 02:54:10.977172 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:10 crc kubenswrapper[4922]: I1122 02:54:10.977190 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:10 crc kubenswrapper[4922]: I1122 02:54:10.977212 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:10 crc kubenswrapper[4922]: I1122 02:54:10.977229 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:10Z","lastTransitionTime":"2025-11-22T02:54:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:11 crc kubenswrapper[4922]: I1122 02:54:11.066948 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:54:11 crc kubenswrapper[4922]: I1122 02:54:11.067138 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:11 crc kubenswrapper[4922]: E1122 02:54:11.067172 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-22 02:55:15.067142225 +0000 UTC m=+151.105664147 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:54:11 crc kubenswrapper[4922]: E1122 02:54:11.067263 4922 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 02:54:11 crc kubenswrapper[4922]: E1122 02:54:11.067340 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 02:55:15.06732091 +0000 UTC m=+151.105842832 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 02:54:11 crc kubenswrapper[4922]: I1122 02:54:11.081095 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:11 crc kubenswrapper[4922]: I1122 02:54:11.081164 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:11 crc kubenswrapper[4922]: I1122 02:54:11.081188 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:11 crc kubenswrapper[4922]: I1122 02:54:11.081222 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:11 crc kubenswrapper[4922]: I1122 02:54:11.081247 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:11Z","lastTransitionTime":"2025-11-22T02:54:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:11 crc kubenswrapper[4922]: I1122 02:54:11.168689 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:54:11 crc kubenswrapper[4922]: I1122 02:54:11.168769 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:11 crc kubenswrapper[4922]: I1122 02:54:11.168811 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:54:11 crc kubenswrapper[4922]: E1122 02:54:11.168969 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 02:54:11 crc kubenswrapper[4922]: E1122 02:54:11.169029 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 02:54:11 crc kubenswrapper[4922]: E1122 02:54:11.169053 4922 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 02:54:11 crc kubenswrapper[4922]: E1122 02:54:11.169073 4922 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 02:54:11 crc kubenswrapper[4922]: E1122 02:54:11.169139 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 02:54:11 crc kubenswrapper[4922]: E1122 02:54:11.169177 4922 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 02:54:11 crc kubenswrapper[4922]: E1122 02:54:11.169199 4922 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 02:54:11 crc kubenswrapper[4922]: E1122 02:54:11.169150 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl 
podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-22 02:55:15.169118841 +0000 UTC m=+151.207640773 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 02:54:11 crc kubenswrapper[4922]: E1122 02:54:11.169312 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 02:55:15.169284355 +0000 UTC m=+151.207806287 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 02:54:11 crc kubenswrapper[4922]: E1122 02:54:11.169347 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-22 02:55:15.169335667 +0000 UTC m=+151.207857589 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 02:54:11 crc kubenswrapper[4922]: I1122 02:54:11.184713 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:11 crc kubenswrapper[4922]: I1122 02:54:11.184781 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:11 crc kubenswrapper[4922]: I1122 02:54:11.184798 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:11 crc kubenswrapper[4922]: I1122 02:54:11.184817 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:11 crc kubenswrapper[4922]: I1122 02:54:11.184828 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:11Z","lastTransitionTime":"2025-11-22T02:54:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:11 crc kubenswrapper[4922]: I1122 02:54:11.289235 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:11 crc kubenswrapper[4922]: I1122 02:54:11.289305 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:11 crc kubenswrapper[4922]: I1122 02:54:11.289330 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:11 crc kubenswrapper[4922]: I1122 02:54:11.289362 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:11 crc kubenswrapper[4922]: I1122 02:54:11.289388 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:11Z","lastTransitionTime":"2025-11-22T02:54:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:11 crc kubenswrapper[4922]: I1122 02:54:11.300470 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:54:11 crc kubenswrapper[4922]: I1122 02:54:11.300559 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:11 crc kubenswrapper[4922]: E1122 02:54:11.300731 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:54:11 crc kubenswrapper[4922]: I1122 02:54:11.300822 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2gmkj" Nov 22 02:54:11 crc kubenswrapper[4922]: E1122 02:54:11.300946 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:54:11 crc kubenswrapper[4922]: E1122 02:54:11.301279 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2gmkj" podUID="d5c8000a-a783-474f-a73a-55814c257a02" Nov 22 02:54:11 crc kubenswrapper[4922]: I1122 02:54:11.392840 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:11 crc kubenswrapper[4922]: I1122 02:54:11.392940 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:11 crc kubenswrapper[4922]: I1122 02:54:11.392956 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:11 crc kubenswrapper[4922]: I1122 02:54:11.392980 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:11 crc kubenswrapper[4922]: I1122 02:54:11.392999 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:11Z","lastTransitionTime":"2025-11-22T02:54:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:11 crc kubenswrapper[4922]: I1122 02:54:11.496141 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:11 crc kubenswrapper[4922]: I1122 02:54:11.496335 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:11 crc kubenswrapper[4922]: I1122 02:54:11.496364 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:11 crc kubenswrapper[4922]: I1122 02:54:11.496399 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:11 crc kubenswrapper[4922]: I1122 02:54:11.496428 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:11Z","lastTransitionTime":"2025-11-22T02:54:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:11 crc kubenswrapper[4922]: I1122 02:54:11.599827 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:11 crc kubenswrapper[4922]: I1122 02:54:11.599936 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:11 crc kubenswrapper[4922]: I1122 02:54:11.599958 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:11 crc kubenswrapper[4922]: I1122 02:54:11.599985 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:11 crc kubenswrapper[4922]: I1122 02:54:11.600003 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:11Z","lastTransitionTime":"2025-11-22T02:54:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:11 crc kubenswrapper[4922]: I1122 02:54:11.704011 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:11 crc kubenswrapper[4922]: I1122 02:54:11.704092 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:11 crc kubenswrapper[4922]: I1122 02:54:11.704112 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:11 crc kubenswrapper[4922]: I1122 02:54:11.704138 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:11 crc kubenswrapper[4922]: I1122 02:54:11.704157 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:11Z","lastTransitionTime":"2025-11-22T02:54:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:11 crc kubenswrapper[4922]: I1122 02:54:11.808359 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:11 crc kubenswrapper[4922]: I1122 02:54:11.808466 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:11 crc kubenswrapper[4922]: I1122 02:54:11.808485 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:11 crc kubenswrapper[4922]: I1122 02:54:11.808543 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:11 crc kubenswrapper[4922]: I1122 02:54:11.808562 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:11Z","lastTransitionTime":"2025-11-22T02:54:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:11 crc kubenswrapper[4922]: I1122 02:54:11.912932 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:11 crc kubenswrapper[4922]: I1122 02:54:11.913029 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:11 crc kubenswrapper[4922]: I1122 02:54:11.913057 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:11 crc kubenswrapper[4922]: I1122 02:54:11.913091 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:11 crc kubenswrapper[4922]: I1122 02:54:11.913114 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:11Z","lastTransitionTime":"2025-11-22T02:54:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:12 crc kubenswrapper[4922]: I1122 02:54:12.017803 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:12 crc kubenswrapper[4922]: I1122 02:54:12.017912 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:12 crc kubenswrapper[4922]: I1122 02:54:12.017934 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:12 crc kubenswrapper[4922]: I1122 02:54:12.017963 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:12 crc kubenswrapper[4922]: I1122 02:54:12.017985 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:12Z","lastTransitionTime":"2025-11-22T02:54:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:12 crc kubenswrapper[4922]: I1122 02:54:12.120977 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:12 crc kubenswrapper[4922]: I1122 02:54:12.121039 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:12 crc kubenswrapper[4922]: I1122 02:54:12.121059 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:12 crc kubenswrapper[4922]: I1122 02:54:12.121086 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:12 crc kubenswrapper[4922]: I1122 02:54:12.121106 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:12Z","lastTransitionTime":"2025-11-22T02:54:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:12 crc kubenswrapper[4922]: I1122 02:54:12.224458 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:12 crc kubenswrapper[4922]: I1122 02:54:12.224544 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:12 crc kubenswrapper[4922]: I1122 02:54:12.224569 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:12 crc kubenswrapper[4922]: I1122 02:54:12.224602 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:12 crc kubenswrapper[4922]: I1122 02:54:12.224629 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:12Z","lastTransitionTime":"2025-11-22T02:54:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:12 crc kubenswrapper[4922]: I1122 02:54:12.299773 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:54:12 crc kubenswrapper[4922]: E1122 02:54:12.300203 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:54:12 crc kubenswrapper[4922]: I1122 02:54:12.316349 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Nov 22 02:54:12 crc kubenswrapper[4922]: I1122 02:54:12.327750 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:12 crc kubenswrapper[4922]: I1122 02:54:12.327922 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:12 crc kubenswrapper[4922]: I1122 02:54:12.327997 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:12 crc kubenswrapper[4922]: I1122 02:54:12.328029 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:12 crc kubenswrapper[4922]: I1122 02:54:12.328052 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:12Z","lastTransitionTime":"2025-11-22T02:54:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:12 crc kubenswrapper[4922]: I1122 02:54:12.431673 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:12 crc kubenswrapper[4922]: I1122 02:54:12.431755 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:12 crc kubenswrapper[4922]: I1122 02:54:12.431778 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:12 crc kubenswrapper[4922]: I1122 02:54:12.431803 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:12 crc kubenswrapper[4922]: I1122 02:54:12.431821 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:12Z","lastTransitionTime":"2025-11-22T02:54:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:12 crc kubenswrapper[4922]: I1122 02:54:12.535584 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:12 crc kubenswrapper[4922]: I1122 02:54:12.535649 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:12 crc kubenswrapper[4922]: I1122 02:54:12.535668 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:12 crc kubenswrapper[4922]: I1122 02:54:12.535692 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:12 crc kubenswrapper[4922]: I1122 02:54:12.535710 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:12Z","lastTransitionTime":"2025-11-22T02:54:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:12 crc kubenswrapper[4922]: I1122 02:54:12.639144 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:12 crc kubenswrapper[4922]: I1122 02:54:12.639214 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:12 crc kubenswrapper[4922]: I1122 02:54:12.639232 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:12 crc kubenswrapper[4922]: I1122 02:54:12.639257 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:12 crc kubenswrapper[4922]: I1122 02:54:12.639276 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:12Z","lastTransitionTime":"2025-11-22T02:54:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:12 crc kubenswrapper[4922]: I1122 02:54:12.744209 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:12 crc kubenswrapper[4922]: I1122 02:54:12.744298 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:12 crc kubenswrapper[4922]: I1122 02:54:12.744323 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:12 crc kubenswrapper[4922]: I1122 02:54:12.744407 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:12 crc kubenswrapper[4922]: I1122 02:54:12.744428 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:12Z","lastTransitionTime":"2025-11-22T02:54:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:12 crc kubenswrapper[4922]: I1122 02:54:12.848546 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:12 crc kubenswrapper[4922]: I1122 02:54:12.848637 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:12 crc kubenswrapper[4922]: I1122 02:54:12.848666 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:12 crc kubenswrapper[4922]: I1122 02:54:12.848701 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:12 crc kubenswrapper[4922]: I1122 02:54:12.848727 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:12Z","lastTransitionTime":"2025-11-22T02:54:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:12 crc kubenswrapper[4922]: I1122 02:54:12.951920 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:12 crc kubenswrapper[4922]: I1122 02:54:12.952003 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:12 crc kubenswrapper[4922]: I1122 02:54:12.952022 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:12 crc kubenswrapper[4922]: I1122 02:54:12.952048 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:12 crc kubenswrapper[4922]: I1122 02:54:12.952067 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:12Z","lastTransitionTime":"2025-11-22T02:54:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:13 crc kubenswrapper[4922]: I1122 02:54:13.056084 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:13 crc kubenswrapper[4922]: I1122 02:54:13.056170 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:13 crc kubenswrapper[4922]: I1122 02:54:13.056196 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:13 crc kubenswrapper[4922]: I1122 02:54:13.056225 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:13 crc kubenswrapper[4922]: I1122 02:54:13.056250 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:13Z","lastTransitionTime":"2025-11-22T02:54:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:13 crc kubenswrapper[4922]: I1122 02:54:13.159395 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:13 crc kubenswrapper[4922]: I1122 02:54:13.159447 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:13 crc kubenswrapper[4922]: I1122 02:54:13.159459 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:13 crc kubenswrapper[4922]: I1122 02:54:13.159476 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:13 crc kubenswrapper[4922]: I1122 02:54:13.159487 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:13Z","lastTransitionTime":"2025-11-22T02:54:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:13 crc kubenswrapper[4922]: I1122 02:54:13.263231 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:13 crc kubenswrapper[4922]: I1122 02:54:13.263290 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:13 crc kubenswrapper[4922]: I1122 02:54:13.263307 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:13 crc kubenswrapper[4922]: I1122 02:54:13.263362 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:13 crc kubenswrapper[4922]: I1122 02:54:13.263380 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:13Z","lastTransitionTime":"2025-11-22T02:54:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:13 crc kubenswrapper[4922]: I1122 02:54:13.299786 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2gmkj" Nov 22 02:54:13 crc kubenswrapper[4922]: I1122 02:54:13.299892 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:54:13 crc kubenswrapper[4922]: I1122 02:54:13.299922 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:13 crc kubenswrapper[4922]: E1122 02:54:13.300055 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2gmkj" podUID="d5c8000a-a783-474f-a73a-55814c257a02" Nov 22 02:54:13 crc kubenswrapper[4922]: E1122 02:54:13.300244 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:54:13 crc kubenswrapper[4922]: E1122 02:54:13.300382 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:54:13 crc kubenswrapper[4922]: I1122 02:54:13.366219 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:13 crc kubenswrapper[4922]: I1122 02:54:13.366283 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:13 crc kubenswrapper[4922]: I1122 02:54:13.366302 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:13 crc kubenswrapper[4922]: I1122 02:54:13.366331 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:13 crc kubenswrapper[4922]: I1122 02:54:13.366352 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:13Z","lastTransitionTime":"2025-11-22T02:54:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:13 crc kubenswrapper[4922]: I1122 02:54:13.470269 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:13 crc kubenswrapper[4922]: I1122 02:54:13.470323 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:13 crc kubenswrapper[4922]: I1122 02:54:13.470335 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:13 crc kubenswrapper[4922]: I1122 02:54:13.470352 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:13 crc kubenswrapper[4922]: I1122 02:54:13.470365 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:13Z","lastTransitionTime":"2025-11-22T02:54:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:13 crc kubenswrapper[4922]: I1122 02:54:13.574398 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:13 crc kubenswrapper[4922]: I1122 02:54:13.574801 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:13 crc kubenswrapper[4922]: I1122 02:54:13.575024 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:13 crc kubenswrapper[4922]: I1122 02:54:13.575225 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:13 crc kubenswrapper[4922]: I1122 02:54:13.575403 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:13Z","lastTransitionTime":"2025-11-22T02:54:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:13 crc kubenswrapper[4922]: I1122 02:54:13.679322 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:13 crc kubenswrapper[4922]: I1122 02:54:13.679778 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:13 crc kubenswrapper[4922]: I1122 02:54:13.680021 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:13 crc kubenswrapper[4922]: I1122 02:54:13.680171 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:13 crc kubenswrapper[4922]: I1122 02:54:13.680296 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:13Z","lastTransitionTime":"2025-11-22T02:54:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:13 crc kubenswrapper[4922]: I1122 02:54:13.784282 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:13 crc kubenswrapper[4922]: I1122 02:54:13.784429 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:13 crc kubenswrapper[4922]: I1122 02:54:13.784454 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:13 crc kubenswrapper[4922]: I1122 02:54:13.784486 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:13 crc kubenswrapper[4922]: I1122 02:54:13.784508 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:13Z","lastTransitionTime":"2025-11-22T02:54:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:13 crc kubenswrapper[4922]: I1122 02:54:13.888642 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:13 crc kubenswrapper[4922]: I1122 02:54:13.888734 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:13 crc kubenswrapper[4922]: I1122 02:54:13.888762 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:13 crc kubenswrapper[4922]: I1122 02:54:13.888802 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:13 crc kubenswrapper[4922]: I1122 02:54:13.888884 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:13Z","lastTransitionTime":"2025-11-22T02:54:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:13 crc kubenswrapper[4922]: I1122 02:54:13.992391 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:13 crc kubenswrapper[4922]: I1122 02:54:13.992471 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:13 crc kubenswrapper[4922]: I1122 02:54:13.992489 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:13 crc kubenswrapper[4922]: I1122 02:54:13.992516 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:13 crc kubenswrapper[4922]: I1122 02:54:13.992536 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:13Z","lastTransitionTime":"2025-11-22T02:54:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:14 crc kubenswrapper[4922]: I1122 02:54:14.096052 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:14 crc kubenswrapper[4922]: I1122 02:54:14.096131 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:14 crc kubenswrapper[4922]: I1122 02:54:14.096151 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:14 crc kubenswrapper[4922]: I1122 02:54:14.096210 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:14 crc kubenswrapper[4922]: I1122 02:54:14.096234 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:14Z","lastTransitionTime":"2025-11-22T02:54:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:14 crc kubenswrapper[4922]: I1122 02:54:14.199685 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:14 crc kubenswrapper[4922]: I1122 02:54:14.199754 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:14 crc kubenswrapper[4922]: I1122 02:54:14.199772 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:14 crc kubenswrapper[4922]: I1122 02:54:14.199798 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:14 crc kubenswrapper[4922]: I1122 02:54:14.199873 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:14Z","lastTransitionTime":"2025-11-22T02:54:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:14 crc kubenswrapper[4922]: I1122 02:54:14.300647 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:54:14 crc kubenswrapper[4922]: E1122 02:54:14.300905 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:54:14 crc kubenswrapper[4922]: I1122 02:54:14.302996 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:14 crc kubenswrapper[4922]: I1122 02:54:14.303057 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:14 crc kubenswrapper[4922]: I1122 02:54:14.303069 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:14 crc kubenswrapper[4922]: I1122 02:54:14.303087 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:14 crc kubenswrapper[4922]: I1122 02:54:14.303100 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:14Z","lastTransitionTime":"2025-11-22T02:54:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:14 crc kubenswrapper[4922]: I1122 02:54:14.406782 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:14 crc kubenswrapper[4922]: I1122 02:54:14.406919 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:14 crc kubenswrapper[4922]: I1122 02:54:14.406943 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:14 crc kubenswrapper[4922]: I1122 02:54:14.406975 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:14 crc kubenswrapper[4922]: I1122 02:54:14.406995 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:14Z","lastTransitionTime":"2025-11-22T02:54:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:14 crc kubenswrapper[4922]: I1122 02:54:14.511799 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:14 crc kubenswrapper[4922]: I1122 02:54:14.511899 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:14 crc kubenswrapper[4922]: I1122 02:54:14.511919 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:14 crc kubenswrapper[4922]: I1122 02:54:14.511946 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:14 crc kubenswrapper[4922]: I1122 02:54:14.511965 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:14Z","lastTransitionTime":"2025-11-22T02:54:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:14 crc kubenswrapper[4922]: I1122 02:54:14.615929 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:14 crc kubenswrapper[4922]: I1122 02:54:14.616003 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:14 crc kubenswrapper[4922]: I1122 02:54:14.616060 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:14 crc kubenswrapper[4922]: I1122 02:54:14.616098 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:14 crc kubenswrapper[4922]: I1122 02:54:14.616123 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:14Z","lastTransitionTime":"2025-11-22T02:54:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:14 crc kubenswrapper[4922]: I1122 02:54:14.720684 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:14 crc kubenswrapper[4922]: I1122 02:54:14.721160 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:14 crc kubenswrapper[4922]: I1122 02:54:14.721321 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:14 crc kubenswrapper[4922]: I1122 02:54:14.721473 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:14 crc kubenswrapper[4922]: I1122 02:54:14.721626 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:14Z","lastTransitionTime":"2025-11-22T02:54:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:14 crc kubenswrapper[4922]: I1122 02:54:14.825809 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:14 crc kubenswrapper[4922]: I1122 02:54:14.826323 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:14 crc kubenswrapper[4922]: I1122 02:54:14.826528 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:14 crc kubenswrapper[4922]: I1122 02:54:14.826684 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:14 crc kubenswrapper[4922]: I1122 02:54:14.826819 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:14Z","lastTransitionTime":"2025-11-22T02:54:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:14 crc kubenswrapper[4922]: I1122 02:54:14.930364 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:14 crc kubenswrapper[4922]: I1122 02:54:14.930429 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:14 crc kubenswrapper[4922]: I1122 02:54:14.930447 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:14 crc kubenswrapper[4922]: I1122 02:54:14.930475 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:14 crc kubenswrapper[4922]: I1122 02:54:14.930494 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:14Z","lastTransitionTime":"2025-11-22T02:54:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.033772 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.033929 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.033948 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.033974 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.033992 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:15Z","lastTransitionTime":"2025-11-22T02:54:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.136816 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.136883 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.136897 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.136912 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.136925 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:15Z","lastTransitionTime":"2025-11-22T02:54:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.239467 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.239535 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.239554 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.239577 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.239594 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:15Z","lastTransitionTime":"2025-11-22T02:54:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.299640 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2gmkj" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.299666 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.299837 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:15 crc kubenswrapper[4922]: E1122 02:54:15.300018 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2gmkj" podUID="d5c8000a-a783-474f-a73a-55814c257a02" Nov 22 02:54:15 crc kubenswrapper[4922]: E1122 02:54:15.300117 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:54:15 crc kubenswrapper[4922]: E1122 02:54:15.300327 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.318982 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2gmkj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c8000a-a783-474f-a73a-55814c257a02\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pk97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8pk97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:23Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2gmkj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:15Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.339200 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0925b79-c7d7-4e10-a883-978c4c4f4aca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9273d5ec12281398617de471c700390678e37a0f25a8f419af589821bbbf82cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88b7c6dc11edfeb5cd97d044eba3210a471efebefaf19e244347e856860544e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa47eebd3d997c52252492bb21d0357a5bcda89c3a47713b4d55c5a4e2117ba5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a43e447ea1ef6069640894c9f4a841797e8ea7a34d6df8d76f9958755e54d5b9\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a43e447ea1ef6069640894c9f4a841797e8ea7a34d6df8d76f9958755e54d5b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:15Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.342341 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.342376 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.342390 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.342411 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.342430 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:15Z","lastTransitionTime":"2025-11-22T02:54:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.361285 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:15Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.380791 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:15Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.398502 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:15Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.416603 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8f0f91018d8ffcc35550e1e567a256d60f9cef7fb07a7c3c21d393e6ff5653f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:15Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.438952 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d4gbc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"954bb7b8-d710-4e1a-973e-78c04e685f30\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00bf6c08a8a7b113c82bd6c0f6de3dde945667c2f025d7dc3a84aaff43b989d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8705bc3652c9a3f838ffdc44bb26f87689d9eca09105427a3dc25b730967ccd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:53:56Z\\\",\\\"message\\\":\\\"2025-11-22T02:53:11+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_13809bfd-4d69-4693-89f6-87ce51a7b8aa\\\\n2025-11-22T02:53:11+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_13809bfd-4d69-4693-89f6-87ce51a7b8aa to /host/opt/cni/bin/\\\\n2025-11-22T02:53:11Z [verbose] multus-daemon started\\\\n2025-11-22T02:53:11Z [verbose] Readiness Indicator file check\\\\n2025-11-22T02:53:56Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52z2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d4gbc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:15Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.444523 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.444551 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.444559 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.444572 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.444582 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:15Z","lastTransitionTime":"2025-11-22T02:54:15Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.463615 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-df5ll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b5672d56-8abd-4aa4-ac8d-1655896397f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62a64d211a2e298509a2677a0d64bf8205f1a62e952ee786f34571d4c1e962b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klcgp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:11Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-df5ll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:15Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.486196 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8552c75b-ff86-4366-ba7d-31f454e4c4d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://095da4ba1e99f9564ae3e7c95e0319b9b03a655b824f73a40f378292bd16a613\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c36d97db0f554e895c9a66ab9d98a846dab9ea7fdc6fb1694a1e6ba6e8e03caa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5806c02c9ab10a6ae71658c5c26a959a0c1af4dc116eebb10456e842e18bc265\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e578ec32c48afdd4b1beffcba79284416d5d38
8027f807e195d552777eb3ff0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f7d8908fd03ef5111feb365e6882679f026d46d0b404e585b0fb248b3ffccc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b67b302d03380e704d83a5fbe0960ef261d09d30f88115099e6e99bc1ddc6134\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a60e64f0c6da14c06e35942ee22aa2471d53c43386af495a39a388f89842ffd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f89893174d64975b9d4c7d8069012e55850fc36e30c6ca7cf04b0455a4805e82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:15Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.503516 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"912136b8-9119-4585-8a8a-fffffe458b82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1364d5e8967ee57d3fe7c3ca85053dbeee65956324a6ff45f53339ee6a380e59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1b62ea65c78bb4f19f5fdb2ebed6da4
b5554981906058d4d33472e53eecebe7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aa36ff8bd81008d118d746d10130faac5df3ffdd243309f12b2444acf8e0ac8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fe422173d9c9619e9ccbf14fa6b45cd384fb3b32973aa7c32b7597d23818ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1ba37f0d76415fadbf5fe5c3e7b0d0ac1e04923deb445e3ffebaa45f9a3284d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T02:52:51Z\\\",\\\"message\\\":\\\"W1122 02:52:50.548479 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 02:52:50.548907 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763779970 cert, and key in /tmp/serving-cert-1581887030/serving-signer.crt, /tmp/serving-cert-1581887030/serving-signer.key\\\\nI1122 02:52:50.747646 1 observer_polling.go:159] Starting file observer\\\\nW1122 02:52:50.754705 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 02:52:50.754978 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 02:52:50.756448 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1581887030/tls.crt::/tmp/serving-cert-1581887030/tls.key\\\\\\\"\\\\nF1122 02:52:51.071644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:50Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://892025699a91526b7be50d611c0c47a78d080b10f5001683ba5b6614e1168759\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:49Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c056e8a2a4abd507614f7e530775df37dc62009606ac486c6e7d6e8e03a5caf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:15Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.523333 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c46a871bcc893c29dd8f87750ec890ac870f3f59615c5951f473288a19b460c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:15Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.536948 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ntq2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"087141fd-fc9b-4685-bada-20260ee96369\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb4eafebbd378af8579bc424a11b51eeda38398db994f81c408b9bead685713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4fck\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ntq2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:15Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.547534 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.547573 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.547585 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.547604 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.547617 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:15Z","lastTransitionTime":"2025-11-22T02:54:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.553107 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n664h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"826eb92e-c839-4fba-9737-3b52101fec88\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bccfd5290a47556c0ac2088ad8d34ab89870aec1d03311431a50dd857a43e5dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://308e322f3f66a16146261590b5ff893e87fbab44e02d36e8fa1ca4eea9902721\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b083638
0bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b0836380bd0e2bde9a53e1ac824e90dc0f857c1c9abcbb693c5404bbcdd2954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4354a4f35999aa1d4656642a8ee40f2eb9888de296105b581d1c9991183d19f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b68614c913a89b066e823911e4aec2e4af744d1dfe2b572f9580bb19b1409f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
ntrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02d34227e6d938885aa05a020d4ffd8c2b35b4df65dc41de85bf3e775aa024d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0125b85a532a9445c19a61f7bdd47ccd48ee48f067df5c041c6bf98daae9630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0125b85a532a9445c19a61f7bdd47ccd48ee48f067df5c041c6bf98daae9630\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nws4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n664h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:15Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.566948 4922 
status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v8z92" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da6031ac-2dd7-43c9-b9e2-4c6394e23c1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9c2863c3dce12e19530e9ec5d083788e8426a5f34a325c4a8b5dcbb29b454ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7cbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a6ba782e23e8c6a485ec4c29d9adefe33028022db04c1a66d20e49a73c40a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f7cbt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-v8z92\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:15Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.582197 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a41b27f5-5ded-4523-b6b1-e7e81ea9fd0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f48e23eebd528cf7cc530b29a032d3bec8eebaa3a5fcaff1296124b9d23bc96a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd5ceb43d7f9e20cb867d08cabc1d1d37744308ef18248947c728071eb2bc47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd5ceb43d7f9e20cb867d08cabc1d1d37744308ef18248947c728071eb2bc47e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:52:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-11-22T02:54:15Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.599836 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a581af85a69caa65abe5f7766482dd0b71eaf201d43e354b8316ff616c865193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f2d349f9466eb418a64dabae95c139bf23e943909a37b6b651c90e85ff55bb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:15Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.614071 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e5537a6-a41f-446d-ab40-ec8c7a3edc90\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:52:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaf643c85775379271a26c718e36d87319342e6797522f91726ce4084925f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://205de8f1aeca28c7877b83da70915cd216631f5d7c28b0c9f3bb748a8f98547e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7d163b34cfb34723c07cfa4592a134291185548d4294cb6fcd1d5d8ab63617\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9608708be558bf83d1156f620534bd4c98b8f4fb7f96c972be8dc67524f4546c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-cont
roller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:52:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:52:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:15Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.626997 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"402683b1-a29f-4a79-a36c-daf6e8068d0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cf0de2e7bcabf2c1beae2a37caaddb2de2365480082bd95a7843816ec8e3ceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d254b
4b131f36e4aae4019c29e8c53f3f8df991b85eb6f943fb6d2e9d79552f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zpl9p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-b9j6n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:15Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.648820 4922 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e0fc46c7e9bbed62a72d78126ea7ffcb4de50dc048c8c0fd90bd0b0d8c718f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd701fe5abb8c240fb633df69fbda2e7fb1ac4c76195dae8dfd5b1db811c1d5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://091e9164e566847ab0605abc3b8f32027bcccff041c48fbf251d28690dcc132d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5603a0776e71297b57ae6a438254d6d8abb317252617fbe3b28707e9f96424d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7854642894dce2837e59481c400522a24a2e66c3f9eb521a370e85b203288af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ceb46477edc2737cbdc850d6387ba175a07e85e69c988f91fc93def44fa1bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31c2ef05aecb9ad4cf8ad1f7c2ddc7726e5b9420
1af8425c07707fedea3d20aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31c2ef05aecb9ad4cf8ad1f7c2ddc7726e5b94201af8425c07707fedea3d20aa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T02:54:07Z\\\",\\\"message\\\":\\\"rotocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.119:443: 10.217.5.119:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d4efc4a8-c514-4a6b-901c-2953978b50d3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1122 02:54:07.381785 6974 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-marketplace/redhat-operators]} name:Service_openshift-marketplace/redhat-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.138:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {97419c58-41c7-41d7-a137-a446f0c7eeb3}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1122 02:54:07.381682 6974 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T02:54:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-c7wvg_openshift-ovn-kubernetes(c2a6bcd8-bb13-463b-b112-0df3cf90b5f7)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://183f71d77b7a1768eee27187f0c1e4a893f5661fa42cf671332e081118ae7dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T02:53:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T02:53:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T02:53:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T02:53:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c7wvg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:15Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.650652 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.650706 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.650720 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.650740 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.650754 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:15Z","lastTransitionTime":"2025-11-22T02:54:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.754355 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.754409 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.754422 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.754446 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.754464 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:15Z","lastTransitionTime":"2025-11-22T02:54:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.857285 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.857345 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.857361 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.857387 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.857404 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:15Z","lastTransitionTime":"2025-11-22T02:54:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.960769 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.961048 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.961134 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.961264 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:15 crc kubenswrapper[4922]: I1122 02:54:15.961364 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:15Z","lastTransitionTime":"2025-11-22T02:54:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.065191 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.065250 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.065267 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.065288 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.065305 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:16Z","lastTransitionTime":"2025-11-22T02:54:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.169261 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.169639 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.169778 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.169983 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.170121 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:16Z","lastTransitionTime":"2025-11-22T02:54:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.178030 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.178098 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.178124 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.178153 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.178175 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:16Z","lastTransitionTime":"2025-11-22T02:54:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:16 crc kubenswrapper[4922]: E1122 02:54:16.201362 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e949e6da-04e3-4086-acd2-53671701d8c1\\\",\\\"systemUUID\\\":\\\"7e87d562-50ca-4b7a-8b5f-35220f4abd2d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:16Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.208906 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.209126 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.209265 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.209397 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.209518 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:16Z","lastTransitionTime":"2025-11-22T02:54:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:16 crc kubenswrapper[4922]: E1122 02:54:16.232245 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e949e6da-04e3-4086-acd2-53671701d8c1\\\",\\\"systemUUID\\\":\\\"7e87d562-50ca-4b7a-8b5f-35220f4abd2d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:16Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.238987 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.239125 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.239209 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.239988 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.240038 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:16Z","lastTransitionTime":"2025-11-22T02:54:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:16 crc kubenswrapper[4922]: E1122 02:54:16.260770 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e949e6da-04e3-4086-acd2-53671701d8c1\\\",\\\"systemUUID\\\":\\\"7e87d562-50ca-4b7a-8b5f-35220f4abd2d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:16Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.267165 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.267218 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.267235 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.267260 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.267279 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:16Z","lastTransitionTime":"2025-11-22T02:54:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:16 crc kubenswrapper[4922]: E1122 02:54:16.291126 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e949e6da-04e3-4086-acd2-53671701d8c1\\\",\\\"systemUUID\\\":\\\"7e87d562-50ca-4b7a-8b5f-35220f4abd2d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:16Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.299771 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.299807 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.299819 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.299837 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.299871 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:16Z","lastTransitionTime":"2025-11-22T02:54:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.299919 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:54:16 crc kubenswrapper[4922]: E1122 02:54:16.300135 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:54:16 crc kubenswrapper[4922]: E1122 02:54:16.316830 4922 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T02:54:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T02:54:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e949e6da-04e3-4086-acd2-53671701d8c1\\\",\\\"systemUUID\\\":\\\"7e87d562-50ca-4b7a-8b5f-35220f4abd2d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T02:54:16Z is after 2025-08-24T17:21:41Z" Nov 22 02:54:16 crc kubenswrapper[4922]: E1122 02:54:16.317140 4922 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.319565 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.319610 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.319627 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.319648 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.319662 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:16Z","lastTransitionTime":"2025-11-22T02:54:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.422619 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.422659 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.422672 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.422688 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.422700 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:16Z","lastTransitionTime":"2025-11-22T02:54:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.527245 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.527421 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.527472 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.527496 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.527513 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:16Z","lastTransitionTime":"2025-11-22T02:54:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.631057 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.631125 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.631149 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.631180 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.631204 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:16Z","lastTransitionTime":"2025-11-22T02:54:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.734187 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.734245 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.734263 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.734286 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.734304 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:16Z","lastTransitionTime":"2025-11-22T02:54:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.837519 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.837613 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.837642 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.837678 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.837702 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:16Z","lastTransitionTime":"2025-11-22T02:54:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.940827 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.940907 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.940923 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.940947 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:16 crc kubenswrapper[4922]: I1122 02:54:16.940963 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:16Z","lastTransitionTime":"2025-11-22T02:54:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:17 crc kubenswrapper[4922]: I1122 02:54:17.044382 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:17 crc kubenswrapper[4922]: I1122 02:54:17.044426 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:17 crc kubenswrapper[4922]: I1122 02:54:17.044435 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:17 crc kubenswrapper[4922]: I1122 02:54:17.044448 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:17 crc kubenswrapper[4922]: I1122 02:54:17.044460 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:17Z","lastTransitionTime":"2025-11-22T02:54:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:17 crc kubenswrapper[4922]: I1122 02:54:17.148162 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:17 crc kubenswrapper[4922]: I1122 02:54:17.148262 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:17 crc kubenswrapper[4922]: I1122 02:54:17.148283 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:17 crc kubenswrapper[4922]: I1122 02:54:17.148310 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:17 crc kubenswrapper[4922]: I1122 02:54:17.148328 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:17Z","lastTransitionTime":"2025-11-22T02:54:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:17 crc kubenswrapper[4922]: I1122 02:54:17.251005 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:17 crc kubenswrapper[4922]: I1122 02:54:17.251049 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:17 crc kubenswrapper[4922]: I1122 02:54:17.251060 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:17 crc kubenswrapper[4922]: I1122 02:54:17.251080 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:17 crc kubenswrapper[4922]: I1122 02:54:17.251094 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:17Z","lastTransitionTime":"2025-11-22T02:54:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:17 crc kubenswrapper[4922]: I1122 02:54:17.300004 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2gmkj" Nov 22 02:54:17 crc kubenswrapper[4922]: E1122 02:54:17.300205 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2gmkj" podUID="d5c8000a-a783-474f-a73a-55814c257a02" Nov 22 02:54:17 crc kubenswrapper[4922]: I1122 02:54:17.300022 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:54:17 crc kubenswrapper[4922]: E1122 02:54:17.300375 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:54:17 crc kubenswrapper[4922]: I1122 02:54:17.300225 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:17 crc kubenswrapper[4922]: E1122 02:54:17.300496 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:54:17 crc kubenswrapper[4922]: I1122 02:54:17.354761 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:17 crc kubenswrapper[4922]: I1122 02:54:17.354812 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:17 crc kubenswrapper[4922]: I1122 02:54:17.354832 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:17 crc kubenswrapper[4922]: I1122 02:54:17.354915 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:17 crc kubenswrapper[4922]: I1122 02:54:17.354935 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:17Z","lastTransitionTime":"2025-11-22T02:54:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:17 crc kubenswrapper[4922]: I1122 02:54:17.458833 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:17 crc kubenswrapper[4922]: I1122 02:54:17.458929 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:17 crc kubenswrapper[4922]: I1122 02:54:17.458944 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:17 crc kubenswrapper[4922]: I1122 02:54:17.458969 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:17 crc kubenswrapper[4922]: I1122 02:54:17.458985 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:17Z","lastTransitionTime":"2025-11-22T02:54:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:17 crc kubenswrapper[4922]: I1122 02:54:17.562597 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:17 crc kubenswrapper[4922]: I1122 02:54:17.562651 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:17 crc kubenswrapper[4922]: I1122 02:54:17.562664 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:17 crc kubenswrapper[4922]: I1122 02:54:17.562684 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:17 crc kubenswrapper[4922]: I1122 02:54:17.562701 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:17Z","lastTransitionTime":"2025-11-22T02:54:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:17 crc kubenswrapper[4922]: I1122 02:54:17.665779 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:17 crc kubenswrapper[4922]: I1122 02:54:17.665863 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:17 crc kubenswrapper[4922]: I1122 02:54:17.665878 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:17 crc kubenswrapper[4922]: I1122 02:54:17.665902 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:17 crc kubenswrapper[4922]: I1122 02:54:17.665920 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:17Z","lastTransitionTime":"2025-11-22T02:54:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:17 crc kubenswrapper[4922]: I1122 02:54:17.770079 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:17 crc kubenswrapper[4922]: I1122 02:54:17.770151 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:17 crc kubenswrapper[4922]: I1122 02:54:17.770174 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:17 crc kubenswrapper[4922]: I1122 02:54:17.770207 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:17 crc kubenswrapper[4922]: I1122 02:54:17.770231 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:17Z","lastTransitionTime":"2025-11-22T02:54:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:17 crc kubenswrapper[4922]: I1122 02:54:17.875066 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:17 crc kubenswrapper[4922]: I1122 02:54:17.875132 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:17 crc kubenswrapper[4922]: I1122 02:54:17.875153 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:17 crc kubenswrapper[4922]: I1122 02:54:17.875184 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:17 crc kubenswrapper[4922]: I1122 02:54:17.875204 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:17Z","lastTransitionTime":"2025-11-22T02:54:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:17 crc kubenswrapper[4922]: I1122 02:54:17.978660 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:17 crc kubenswrapper[4922]: I1122 02:54:17.978724 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:17 crc kubenswrapper[4922]: I1122 02:54:17.978739 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:17 crc kubenswrapper[4922]: I1122 02:54:17.978768 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:17 crc kubenswrapper[4922]: I1122 02:54:17.978787 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:17Z","lastTransitionTime":"2025-11-22T02:54:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:18 crc kubenswrapper[4922]: I1122 02:54:18.082654 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:18 crc kubenswrapper[4922]: I1122 02:54:18.082721 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:18 crc kubenswrapper[4922]: I1122 02:54:18.082741 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:18 crc kubenswrapper[4922]: I1122 02:54:18.082766 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:18 crc kubenswrapper[4922]: I1122 02:54:18.082783 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:18Z","lastTransitionTime":"2025-11-22T02:54:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:18 crc kubenswrapper[4922]: I1122 02:54:18.186516 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:18 crc kubenswrapper[4922]: I1122 02:54:18.186600 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:18 crc kubenswrapper[4922]: I1122 02:54:18.186620 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:18 crc kubenswrapper[4922]: I1122 02:54:18.186648 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:18 crc kubenswrapper[4922]: I1122 02:54:18.186669 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:18Z","lastTransitionTime":"2025-11-22T02:54:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:18 crc kubenswrapper[4922]: I1122 02:54:18.289899 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:18 crc kubenswrapper[4922]: I1122 02:54:18.290021 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:18 crc kubenswrapper[4922]: I1122 02:54:18.290043 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:18 crc kubenswrapper[4922]: I1122 02:54:18.290070 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:18 crc kubenswrapper[4922]: I1122 02:54:18.290090 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:18Z","lastTransitionTime":"2025-11-22T02:54:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:18 crc kubenswrapper[4922]: I1122 02:54:18.299952 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:54:18 crc kubenswrapper[4922]: E1122 02:54:18.300232 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:54:18 crc kubenswrapper[4922]: I1122 02:54:18.393215 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:18 crc kubenswrapper[4922]: I1122 02:54:18.393309 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:18 crc kubenswrapper[4922]: I1122 02:54:18.393336 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:18 crc kubenswrapper[4922]: I1122 02:54:18.393373 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:18 crc kubenswrapper[4922]: I1122 02:54:18.393400 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:18Z","lastTransitionTime":"2025-11-22T02:54:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:18 crc kubenswrapper[4922]: I1122 02:54:18.497051 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:18 crc kubenswrapper[4922]: I1122 02:54:18.497118 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:18 crc kubenswrapper[4922]: I1122 02:54:18.497143 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:18 crc kubenswrapper[4922]: I1122 02:54:18.497171 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:18 crc kubenswrapper[4922]: I1122 02:54:18.497193 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:18Z","lastTransitionTime":"2025-11-22T02:54:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:18 crc kubenswrapper[4922]: I1122 02:54:18.601570 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:18 crc kubenswrapper[4922]: I1122 02:54:18.601660 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:18 crc kubenswrapper[4922]: I1122 02:54:18.601684 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:18 crc kubenswrapper[4922]: I1122 02:54:18.601717 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:18 crc kubenswrapper[4922]: I1122 02:54:18.601740 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:18Z","lastTransitionTime":"2025-11-22T02:54:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:18 crc kubenswrapper[4922]: I1122 02:54:18.705894 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:18 crc kubenswrapper[4922]: I1122 02:54:18.705975 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:18 crc kubenswrapper[4922]: I1122 02:54:18.705998 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:18 crc kubenswrapper[4922]: I1122 02:54:18.706030 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:18 crc kubenswrapper[4922]: I1122 02:54:18.706057 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:18Z","lastTransitionTime":"2025-11-22T02:54:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:18 crc kubenswrapper[4922]: I1122 02:54:18.810019 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:18 crc kubenswrapper[4922]: I1122 02:54:18.810099 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:18 crc kubenswrapper[4922]: I1122 02:54:18.810113 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:18 crc kubenswrapper[4922]: I1122 02:54:18.810135 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:18 crc kubenswrapper[4922]: I1122 02:54:18.810149 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:18Z","lastTransitionTime":"2025-11-22T02:54:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:18 crc kubenswrapper[4922]: I1122 02:54:18.913940 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:18 crc kubenswrapper[4922]: I1122 02:54:18.914023 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:18 crc kubenswrapper[4922]: I1122 02:54:18.914041 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:18 crc kubenswrapper[4922]: I1122 02:54:18.914070 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:18 crc kubenswrapper[4922]: I1122 02:54:18.914087 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:18Z","lastTransitionTime":"2025-11-22T02:54:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:19 crc kubenswrapper[4922]: I1122 02:54:19.017148 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:19 crc kubenswrapper[4922]: I1122 02:54:19.017209 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:19 crc kubenswrapper[4922]: I1122 02:54:19.017227 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:19 crc kubenswrapper[4922]: I1122 02:54:19.017250 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:19 crc kubenswrapper[4922]: I1122 02:54:19.017265 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:19Z","lastTransitionTime":"2025-11-22T02:54:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:19 crc kubenswrapper[4922]: I1122 02:54:19.120741 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:19 crc kubenswrapper[4922]: I1122 02:54:19.120823 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:19 crc kubenswrapper[4922]: I1122 02:54:19.120894 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:19 crc kubenswrapper[4922]: I1122 02:54:19.120940 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:19 crc kubenswrapper[4922]: I1122 02:54:19.120966 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:19Z","lastTransitionTime":"2025-11-22T02:54:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:19 crc kubenswrapper[4922]: I1122 02:54:19.224934 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:19 crc kubenswrapper[4922]: I1122 02:54:19.225006 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:19 crc kubenswrapper[4922]: I1122 02:54:19.225033 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:19 crc kubenswrapper[4922]: I1122 02:54:19.225065 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:19 crc kubenswrapper[4922]: I1122 02:54:19.225089 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:19Z","lastTransitionTime":"2025-11-22T02:54:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:19 crc kubenswrapper[4922]: I1122 02:54:19.300225 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:19 crc kubenswrapper[4922]: I1122 02:54:19.300302 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:54:19 crc kubenswrapper[4922]: I1122 02:54:19.300259 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2gmkj" Nov 22 02:54:19 crc kubenswrapper[4922]: E1122 02:54:19.300498 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:54:19 crc kubenswrapper[4922]: E1122 02:54:19.300669 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:54:19 crc kubenswrapper[4922]: E1122 02:54:19.300929 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2gmkj" podUID="d5c8000a-a783-474f-a73a-55814c257a02" Nov 22 02:54:19 crc kubenswrapper[4922]: I1122 02:54:19.330318 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:19 crc kubenswrapper[4922]: I1122 02:54:19.330404 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:19 crc kubenswrapper[4922]: I1122 02:54:19.330434 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:19 crc kubenswrapper[4922]: I1122 02:54:19.330470 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:19 crc kubenswrapper[4922]: I1122 02:54:19.330527 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:19Z","lastTransitionTime":"2025-11-22T02:54:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:19 crc kubenswrapper[4922]: I1122 02:54:19.433695 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:19 crc kubenswrapper[4922]: I1122 02:54:19.433773 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:19 crc kubenswrapper[4922]: I1122 02:54:19.433794 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:19 crc kubenswrapper[4922]: I1122 02:54:19.433822 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:19 crc kubenswrapper[4922]: I1122 02:54:19.433884 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:19Z","lastTransitionTime":"2025-11-22T02:54:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:19 crc kubenswrapper[4922]: I1122 02:54:19.537012 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:19 crc kubenswrapper[4922]: I1122 02:54:19.537091 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:19 crc kubenswrapper[4922]: I1122 02:54:19.537111 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:19 crc kubenswrapper[4922]: I1122 02:54:19.537138 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:19 crc kubenswrapper[4922]: I1122 02:54:19.537158 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:19Z","lastTransitionTime":"2025-11-22T02:54:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:19 crc kubenswrapper[4922]: I1122 02:54:19.640083 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:19 crc kubenswrapper[4922]: I1122 02:54:19.640148 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:19 crc kubenswrapper[4922]: I1122 02:54:19.640170 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:19 crc kubenswrapper[4922]: I1122 02:54:19.640199 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:19 crc kubenswrapper[4922]: I1122 02:54:19.640225 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:19Z","lastTransitionTime":"2025-11-22T02:54:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:19 crc kubenswrapper[4922]: I1122 02:54:19.744178 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:19 crc kubenswrapper[4922]: I1122 02:54:19.744261 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:19 crc kubenswrapper[4922]: I1122 02:54:19.744287 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:19 crc kubenswrapper[4922]: I1122 02:54:19.744322 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:19 crc kubenswrapper[4922]: I1122 02:54:19.744347 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:19Z","lastTransitionTime":"2025-11-22T02:54:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:19 crc kubenswrapper[4922]: I1122 02:54:19.847900 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:19 crc kubenswrapper[4922]: I1122 02:54:19.847960 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:19 crc kubenswrapper[4922]: I1122 02:54:19.847973 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:19 crc kubenswrapper[4922]: I1122 02:54:19.847997 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:19 crc kubenswrapper[4922]: I1122 02:54:19.848013 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:19Z","lastTransitionTime":"2025-11-22T02:54:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:19 crc kubenswrapper[4922]: I1122 02:54:19.951223 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:19 crc kubenswrapper[4922]: I1122 02:54:19.951293 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:19 crc kubenswrapper[4922]: I1122 02:54:19.951312 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:19 crc kubenswrapper[4922]: I1122 02:54:19.951343 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:19 crc kubenswrapper[4922]: I1122 02:54:19.951363 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:19Z","lastTransitionTime":"2025-11-22T02:54:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:20 crc kubenswrapper[4922]: I1122 02:54:20.054905 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:20 crc kubenswrapper[4922]: I1122 02:54:20.054983 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:20 crc kubenswrapper[4922]: I1122 02:54:20.055000 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:20 crc kubenswrapper[4922]: I1122 02:54:20.055033 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:20 crc kubenswrapper[4922]: I1122 02:54:20.055055 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:20Z","lastTransitionTime":"2025-11-22T02:54:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:20 crc kubenswrapper[4922]: I1122 02:54:20.159147 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:20 crc kubenswrapper[4922]: I1122 02:54:20.159227 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:20 crc kubenswrapper[4922]: I1122 02:54:20.159244 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:20 crc kubenswrapper[4922]: I1122 02:54:20.159277 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:20 crc kubenswrapper[4922]: I1122 02:54:20.159298 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:20Z","lastTransitionTime":"2025-11-22T02:54:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:20 crc kubenswrapper[4922]: I1122 02:54:20.263310 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:20 crc kubenswrapper[4922]: I1122 02:54:20.263422 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:20 crc kubenswrapper[4922]: I1122 02:54:20.263447 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:20 crc kubenswrapper[4922]: I1122 02:54:20.263478 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:20 crc kubenswrapper[4922]: I1122 02:54:20.263501 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:20Z","lastTransitionTime":"2025-11-22T02:54:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:20 crc kubenswrapper[4922]: I1122 02:54:20.299728 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:54:20 crc kubenswrapper[4922]: E1122 02:54:20.300013 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:54:20 crc kubenswrapper[4922]: I1122 02:54:20.367609 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:20 crc kubenswrapper[4922]: I1122 02:54:20.367681 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:20 crc kubenswrapper[4922]: I1122 02:54:20.367701 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:20 crc kubenswrapper[4922]: I1122 02:54:20.367728 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:20 crc kubenswrapper[4922]: I1122 02:54:20.367747 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:20Z","lastTransitionTime":"2025-11-22T02:54:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:20 crc kubenswrapper[4922]: I1122 02:54:20.470777 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:20 crc kubenswrapper[4922]: I1122 02:54:20.470878 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:20 crc kubenswrapper[4922]: I1122 02:54:20.470898 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:20 crc kubenswrapper[4922]: I1122 02:54:20.470925 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:20 crc kubenswrapper[4922]: I1122 02:54:20.470948 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:20Z","lastTransitionTime":"2025-11-22T02:54:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:20 crc kubenswrapper[4922]: I1122 02:54:20.575016 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:20 crc kubenswrapper[4922]: I1122 02:54:20.575073 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:20 crc kubenswrapper[4922]: I1122 02:54:20.575093 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:20 crc kubenswrapper[4922]: I1122 02:54:20.575117 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:20 crc kubenswrapper[4922]: I1122 02:54:20.575136 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:20Z","lastTransitionTime":"2025-11-22T02:54:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:20 crc kubenswrapper[4922]: I1122 02:54:20.677812 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:20 crc kubenswrapper[4922]: I1122 02:54:20.677891 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:20 crc kubenswrapper[4922]: I1122 02:54:20.677909 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:20 crc kubenswrapper[4922]: I1122 02:54:20.677933 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:20 crc kubenswrapper[4922]: I1122 02:54:20.677952 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:20Z","lastTransitionTime":"2025-11-22T02:54:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:20 crc kubenswrapper[4922]: I1122 02:54:20.782214 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:20 crc kubenswrapper[4922]: I1122 02:54:20.782276 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:20 crc kubenswrapper[4922]: I1122 02:54:20.782295 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:20 crc kubenswrapper[4922]: I1122 02:54:20.782324 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:20 crc kubenswrapper[4922]: I1122 02:54:20.782344 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:20Z","lastTransitionTime":"2025-11-22T02:54:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:20 crc kubenswrapper[4922]: I1122 02:54:20.886479 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:20 crc kubenswrapper[4922]: I1122 02:54:20.886541 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:20 crc kubenswrapper[4922]: I1122 02:54:20.886553 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:20 crc kubenswrapper[4922]: I1122 02:54:20.886575 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:20 crc kubenswrapper[4922]: I1122 02:54:20.886593 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:20Z","lastTransitionTime":"2025-11-22T02:54:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:20 crc kubenswrapper[4922]: I1122 02:54:20.989959 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:20 crc kubenswrapper[4922]: I1122 02:54:20.990031 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:20 crc kubenswrapper[4922]: I1122 02:54:20.990049 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:20 crc kubenswrapper[4922]: I1122 02:54:20.990071 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:20 crc kubenswrapper[4922]: I1122 02:54:20.990091 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:20Z","lastTransitionTime":"2025-11-22T02:54:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:21 crc kubenswrapper[4922]: I1122 02:54:21.094788 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:21 crc kubenswrapper[4922]: I1122 02:54:21.094921 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:21 crc kubenswrapper[4922]: I1122 02:54:21.094951 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:21 crc kubenswrapper[4922]: I1122 02:54:21.094993 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:21 crc kubenswrapper[4922]: I1122 02:54:21.095024 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:21Z","lastTransitionTime":"2025-11-22T02:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:21 crc kubenswrapper[4922]: I1122 02:54:21.199042 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:21 crc kubenswrapper[4922]: I1122 02:54:21.199112 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:21 crc kubenswrapper[4922]: I1122 02:54:21.199136 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:21 crc kubenswrapper[4922]: I1122 02:54:21.199172 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:21 crc kubenswrapper[4922]: I1122 02:54:21.199198 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:21Z","lastTransitionTime":"2025-11-22T02:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:21 crc kubenswrapper[4922]: I1122 02:54:21.299496 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:54:21 crc kubenswrapper[4922]: I1122 02:54:21.299610 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:21 crc kubenswrapper[4922]: E1122 02:54:21.299806 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:54:21 crc kubenswrapper[4922]: I1122 02:54:21.299823 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2gmkj" Nov 22 02:54:21 crc kubenswrapper[4922]: E1122 02:54:21.300121 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:54:21 crc kubenswrapper[4922]: E1122 02:54:21.301234 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2gmkj" podUID="d5c8000a-a783-474f-a73a-55814c257a02" Nov 22 02:54:21 crc kubenswrapper[4922]: I1122 02:54:21.301915 4922 scope.go:117] "RemoveContainer" containerID="31c2ef05aecb9ad4cf8ad1f7c2ddc7726e5b94201af8425c07707fedea3d20aa" Nov 22 02:54:21 crc kubenswrapper[4922]: I1122 02:54:21.302535 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:21 crc kubenswrapper[4922]: I1122 02:54:21.302595 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:21 crc kubenswrapper[4922]: I1122 02:54:21.302722 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:21 crc kubenswrapper[4922]: I1122 02:54:21.302750 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:21 crc kubenswrapper[4922]: I1122 02:54:21.302770 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:21Z","lastTransitionTime":"2025-11-22T02:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:21 crc kubenswrapper[4922]: E1122 02:54:21.303256 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-c7wvg_openshift-ovn-kubernetes(c2a6bcd8-bb13-463b-b112-0df3cf90b5f7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" podUID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" Nov 22 02:54:21 crc kubenswrapper[4922]: I1122 02:54:21.406220 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:21 crc kubenswrapper[4922]: I1122 02:54:21.406292 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:21 crc kubenswrapper[4922]: I1122 02:54:21.406311 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:21 crc kubenswrapper[4922]: I1122 02:54:21.406335 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:21 crc kubenswrapper[4922]: I1122 02:54:21.406354 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:21Z","lastTransitionTime":"2025-11-22T02:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:21 crc kubenswrapper[4922]: I1122 02:54:21.510308 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:21 crc kubenswrapper[4922]: I1122 02:54:21.510399 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:21 crc kubenswrapper[4922]: I1122 02:54:21.510424 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:21 crc kubenswrapper[4922]: I1122 02:54:21.510455 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:21 crc kubenswrapper[4922]: I1122 02:54:21.510481 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:21Z","lastTransitionTime":"2025-11-22T02:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:21 crc kubenswrapper[4922]: I1122 02:54:21.618596 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:21 crc kubenswrapper[4922]: I1122 02:54:21.618697 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:21 crc kubenswrapper[4922]: I1122 02:54:21.618723 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:21 crc kubenswrapper[4922]: I1122 02:54:21.618763 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:21 crc kubenswrapper[4922]: I1122 02:54:21.618799 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:21Z","lastTransitionTime":"2025-11-22T02:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:21 crc kubenswrapper[4922]: I1122 02:54:21.722977 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:21 crc kubenswrapper[4922]: I1122 02:54:21.723044 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:21 crc kubenswrapper[4922]: I1122 02:54:21.723055 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:21 crc kubenswrapper[4922]: I1122 02:54:21.723076 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:21 crc kubenswrapper[4922]: I1122 02:54:21.723089 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:21Z","lastTransitionTime":"2025-11-22T02:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:21 crc kubenswrapper[4922]: I1122 02:54:21.826787 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:21 crc kubenswrapper[4922]: I1122 02:54:21.826895 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:21 crc kubenswrapper[4922]: I1122 02:54:21.826917 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:21 crc kubenswrapper[4922]: I1122 02:54:21.826942 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:21 crc kubenswrapper[4922]: I1122 02:54:21.826962 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:21Z","lastTransitionTime":"2025-11-22T02:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:21 crc kubenswrapper[4922]: I1122 02:54:21.930687 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:21 crc kubenswrapper[4922]: I1122 02:54:21.930773 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:21 crc kubenswrapper[4922]: I1122 02:54:21.930796 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:21 crc kubenswrapper[4922]: I1122 02:54:21.930824 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:21 crc kubenswrapper[4922]: I1122 02:54:21.930882 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:21Z","lastTransitionTime":"2025-11-22T02:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:22 crc kubenswrapper[4922]: I1122 02:54:22.033825 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:22 crc kubenswrapper[4922]: I1122 02:54:22.033911 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:22 crc kubenswrapper[4922]: I1122 02:54:22.033928 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:22 crc kubenswrapper[4922]: I1122 02:54:22.033952 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:22 crc kubenswrapper[4922]: I1122 02:54:22.033971 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:22Z","lastTransitionTime":"2025-11-22T02:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:22 crc kubenswrapper[4922]: I1122 02:54:22.137280 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:22 crc kubenswrapper[4922]: I1122 02:54:22.137351 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:22 crc kubenswrapper[4922]: I1122 02:54:22.137373 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:22 crc kubenswrapper[4922]: I1122 02:54:22.137398 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:22 crc kubenswrapper[4922]: I1122 02:54:22.137417 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:22Z","lastTransitionTime":"2025-11-22T02:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:22 crc kubenswrapper[4922]: I1122 02:54:22.241300 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:22 crc kubenswrapper[4922]: I1122 02:54:22.241767 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:22 crc kubenswrapper[4922]: I1122 02:54:22.241788 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:22 crc kubenswrapper[4922]: I1122 02:54:22.241815 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:22 crc kubenswrapper[4922]: I1122 02:54:22.241833 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:22Z","lastTransitionTime":"2025-11-22T02:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:22 crc kubenswrapper[4922]: I1122 02:54:22.300601 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:54:22 crc kubenswrapper[4922]: E1122 02:54:22.300900 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:54:22 crc kubenswrapper[4922]: I1122 02:54:22.345303 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:22 crc kubenswrapper[4922]: I1122 02:54:22.345370 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:22 crc kubenswrapper[4922]: I1122 02:54:22.345383 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:22 crc kubenswrapper[4922]: I1122 02:54:22.345406 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:22 crc kubenswrapper[4922]: I1122 02:54:22.345425 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:22Z","lastTransitionTime":"2025-11-22T02:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:22 crc kubenswrapper[4922]: I1122 02:54:22.448317 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:22 crc kubenswrapper[4922]: I1122 02:54:22.448380 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:22 crc kubenswrapper[4922]: I1122 02:54:22.448400 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:22 crc kubenswrapper[4922]: I1122 02:54:22.448424 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:22 crc kubenswrapper[4922]: I1122 02:54:22.448442 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:22Z","lastTransitionTime":"2025-11-22T02:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:22 crc kubenswrapper[4922]: I1122 02:54:22.552251 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:22 crc kubenswrapper[4922]: I1122 02:54:22.552309 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:22 crc kubenswrapper[4922]: I1122 02:54:22.552323 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:22 crc kubenswrapper[4922]: I1122 02:54:22.552344 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:22 crc kubenswrapper[4922]: I1122 02:54:22.552357 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:22Z","lastTransitionTime":"2025-11-22T02:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:22 crc kubenswrapper[4922]: I1122 02:54:22.656004 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:22 crc kubenswrapper[4922]: I1122 02:54:22.656102 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:22 crc kubenswrapper[4922]: I1122 02:54:22.656126 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:22 crc kubenswrapper[4922]: I1122 02:54:22.656163 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:22 crc kubenswrapper[4922]: I1122 02:54:22.656190 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:22Z","lastTransitionTime":"2025-11-22T02:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:22 crc kubenswrapper[4922]: I1122 02:54:22.760677 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:22 crc kubenswrapper[4922]: I1122 02:54:22.760769 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:22 crc kubenswrapper[4922]: I1122 02:54:22.760792 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:22 crc kubenswrapper[4922]: I1122 02:54:22.760828 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:22 crc kubenswrapper[4922]: I1122 02:54:22.760887 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:22Z","lastTransitionTime":"2025-11-22T02:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:22 crc kubenswrapper[4922]: I1122 02:54:22.865203 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:22 crc kubenswrapper[4922]: I1122 02:54:22.865264 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:22 crc kubenswrapper[4922]: I1122 02:54:22.865284 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:22 crc kubenswrapper[4922]: I1122 02:54:22.865308 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:22 crc kubenswrapper[4922]: I1122 02:54:22.865330 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:22Z","lastTransitionTime":"2025-11-22T02:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:22 crc kubenswrapper[4922]: I1122 02:54:22.969135 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:22 crc kubenswrapper[4922]: I1122 02:54:22.969194 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:22 crc kubenswrapper[4922]: I1122 02:54:22.969216 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:22 crc kubenswrapper[4922]: I1122 02:54:22.969247 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:22 crc kubenswrapper[4922]: I1122 02:54:22.969271 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:22Z","lastTransitionTime":"2025-11-22T02:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:23 crc kubenswrapper[4922]: I1122 02:54:23.072760 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:23 crc kubenswrapper[4922]: I1122 02:54:23.072834 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:23 crc kubenswrapper[4922]: I1122 02:54:23.072882 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:23 crc kubenswrapper[4922]: I1122 02:54:23.072909 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:23 crc kubenswrapper[4922]: I1122 02:54:23.072931 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:23Z","lastTransitionTime":"2025-11-22T02:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:23 crc kubenswrapper[4922]: I1122 02:54:23.176251 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:23 crc kubenswrapper[4922]: I1122 02:54:23.176316 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:23 crc kubenswrapper[4922]: I1122 02:54:23.176340 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:23 crc kubenswrapper[4922]: I1122 02:54:23.176370 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:23 crc kubenswrapper[4922]: I1122 02:54:23.176390 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:23Z","lastTransitionTime":"2025-11-22T02:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:23 crc kubenswrapper[4922]: I1122 02:54:23.279041 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:23 crc kubenswrapper[4922]: I1122 02:54:23.279094 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:23 crc kubenswrapper[4922]: I1122 02:54:23.279109 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:23 crc kubenswrapper[4922]: I1122 02:54:23.279133 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:23 crc kubenswrapper[4922]: I1122 02:54:23.279147 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:23Z","lastTransitionTime":"2025-11-22T02:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:23 crc kubenswrapper[4922]: I1122 02:54:23.299794 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:23 crc kubenswrapper[4922]: I1122 02:54:23.300113 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2gmkj" Nov 22 02:54:23 crc kubenswrapper[4922]: I1122 02:54:23.299961 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:54:23 crc kubenswrapper[4922]: E1122 02:54:23.300656 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:54:23 crc kubenswrapper[4922]: E1122 02:54:23.300939 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2gmkj" podUID="d5c8000a-a783-474f-a73a-55814c257a02" Nov 22 02:54:23 crc kubenswrapper[4922]: E1122 02:54:23.301115 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:54:23 crc kubenswrapper[4922]: I1122 02:54:23.382016 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:23 crc kubenswrapper[4922]: I1122 02:54:23.382102 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:23 crc kubenswrapper[4922]: I1122 02:54:23.382125 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:23 crc kubenswrapper[4922]: I1122 02:54:23.382160 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:23 crc kubenswrapper[4922]: I1122 02:54:23.382183 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:23Z","lastTransitionTime":"2025-11-22T02:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:23 crc kubenswrapper[4922]: I1122 02:54:23.486965 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:23 crc kubenswrapper[4922]: I1122 02:54:23.487045 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:23 crc kubenswrapper[4922]: I1122 02:54:23.487064 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:23 crc kubenswrapper[4922]: I1122 02:54:23.487097 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:23 crc kubenswrapper[4922]: I1122 02:54:23.487120 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:23Z","lastTransitionTime":"2025-11-22T02:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:23 crc kubenswrapper[4922]: I1122 02:54:23.590463 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:23 crc kubenswrapper[4922]: I1122 02:54:23.590518 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:23 crc kubenswrapper[4922]: I1122 02:54:23.590532 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:23 crc kubenswrapper[4922]: I1122 02:54:23.590557 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:23 crc kubenswrapper[4922]: I1122 02:54:23.590571 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:23Z","lastTransitionTime":"2025-11-22T02:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:23 crc kubenswrapper[4922]: I1122 02:54:23.694344 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:23 crc kubenswrapper[4922]: I1122 02:54:23.694432 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:23 crc kubenswrapper[4922]: I1122 02:54:23.694455 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:23 crc kubenswrapper[4922]: I1122 02:54:23.694487 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:23 crc kubenswrapper[4922]: I1122 02:54:23.694511 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:23Z","lastTransitionTime":"2025-11-22T02:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:23 crc kubenswrapper[4922]: I1122 02:54:23.798301 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:23 crc kubenswrapper[4922]: I1122 02:54:23.798369 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:23 crc kubenswrapper[4922]: I1122 02:54:23.798388 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:23 crc kubenswrapper[4922]: I1122 02:54:23.798416 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:23 crc kubenswrapper[4922]: I1122 02:54:23.798434 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:23Z","lastTransitionTime":"2025-11-22T02:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:23 crc kubenswrapper[4922]: I1122 02:54:23.901908 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:23 crc kubenswrapper[4922]: I1122 02:54:23.901959 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:23 crc kubenswrapper[4922]: I1122 02:54:23.901974 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:23 crc kubenswrapper[4922]: I1122 02:54:23.901993 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:23 crc kubenswrapper[4922]: I1122 02:54:23.902007 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:23Z","lastTransitionTime":"2025-11-22T02:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:24 crc kubenswrapper[4922]: I1122 02:54:24.005537 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:24 crc kubenswrapper[4922]: I1122 02:54:24.005591 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:24 crc kubenswrapper[4922]: I1122 02:54:24.005604 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:24 crc kubenswrapper[4922]: I1122 02:54:24.005626 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:24 crc kubenswrapper[4922]: I1122 02:54:24.005638 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:24Z","lastTransitionTime":"2025-11-22T02:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:24 crc kubenswrapper[4922]: I1122 02:54:24.108476 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:24 crc kubenswrapper[4922]: I1122 02:54:24.108542 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:24 crc kubenswrapper[4922]: I1122 02:54:24.108559 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:24 crc kubenswrapper[4922]: I1122 02:54:24.108582 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:24 crc kubenswrapper[4922]: I1122 02:54:24.108599 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:24Z","lastTransitionTime":"2025-11-22T02:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:24 crc kubenswrapper[4922]: I1122 02:54:24.212496 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:24 crc kubenswrapper[4922]: I1122 02:54:24.212564 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:24 crc kubenswrapper[4922]: I1122 02:54:24.212583 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:24 crc kubenswrapper[4922]: I1122 02:54:24.212611 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:24 crc kubenswrapper[4922]: I1122 02:54:24.212631 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:24Z","lastTransitionTime":"2025-11-22T02:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:24 crc kubenswrapper[4922]: I1122 02:54:24.299805 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:54:24 crc kubenswrapper[4922]: E1122 02:54:24.300273 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:54:24 crc kubenswrapper[4922]: I1122 02:54:24.317166 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:24 crc kubenswrapper[4922]: I1122 02:54:24.317280 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:24 crc kubenswrapper[4922]: I1122 02:54:24.317313 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:24 crc kubenswrapper[4922]: I1122 02:54:24.317351 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:24 crc kubenswrapper[4922]: I1122 02:54:24.317379 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:24Z","lastTransitionTime":"2025-11-22T02:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:24 crc kubenswrapper[4922]: I1122 02:54:24.421724 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:24 crc kubenswrapper[4922]: I1122 02:54:24.422507 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:24 crc kubenswrapper[4922]: I1122 02:54:24.422658 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:24 crc kubenswrapper[4922]: I1122 02:54:24.422893 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:24 crc kubenswrapper[4922]: I1122 02:54:24.423089 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:24Z","lastTransitionTime":"2025-11-22T02:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:24 crc kubenswrapper[4922]: I1122 02:54:24.526628 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:24 crc kubenswrapper[4922]: I1122 02:54:24.527026 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:24 crc kubenswrapper[4922]: I1122 02:54:24.527188 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:24 crc kubenswrapper[4922]: I1122 02:54:24.527343 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:24 crc kubenswrapper[4922]: I1122 02:54:24.527509 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:24Z","lastTransitionTime":"2025-11-22T02:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:24 crc kubenswrapper[4922]: I1122 02:54:24.631758 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:24 crc kubenswrapper[4922]: I1122 02:54:24.632716 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:24 crc kubenswrapper[4922]: I1122 02:54:24.632913 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:24 crc kubenswrapper[4922]: I1122 02:54:24.633088 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:24 crc kubenswrapper[4922]: I1122 02:54:24.633222 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:24Z","lastTransitionTime":"2025-11-22T02:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:24 crc kubenswrapper[4922]: I1122 02:54:24.737218 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:24 crc kubenswrapper[4922]: I1122 02:54:24.737290 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:24 crc kubenswrapper[4922]: I1122 02:54:24.737311 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:24 crc kubenswrapper[4922]: I1122 02:54:24.737375 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:24 crc kubenswrapper[4922]: I1122 02:54:24.737396 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:24Z","lastTransitionTime":"2025-11-22T02:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:24 crc kubenswrapper[4922]: I1122 02:54:24.841094 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:24 crc kubenswrapper[4922]: I1122 02:54:24.841572 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:24 crc kubenswrapper[4922]: I1122 02:54:24.841800 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:24 crc kubenswrapper[4922]: I1122 02:54:24.842092 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:24 crc kubenswrapper[4922]: I1122 02:54:24.842311 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:24Z","lastTransitionTime":"2025-11-22T02:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:24 crc kubenswrapper[4922]: I1122 02:54:24.945599 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:24 crc kubenswrapper[4922]: I1122 02:54:24.945660 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:24 crc kubenswrapper[4922]: I1122 02:54:24.945684 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:24 crc kubenswrapper[4922]: I1122 02:54:24.945712 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:24 crc kubenswrapper[4922]: I1122 02:54:24.945737 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:24Z","lastTransitionTime":"2025-11-22T02:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.048693 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.048758 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.048775 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.048800 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.048819 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:25Z","lastTransitionTime":"2025-11-22T02:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.152833 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.152942 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.152964 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.152995 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.153020 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:25Z","lastTransitionTime":"2025-11-22T02:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.257644 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.257731 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.257750 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.257784 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.257804 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:25Z","lastTransitionTime":"2025-11-22T02:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.299626 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.299715 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:25 crc kubenswrapper[4922]: E1122 02:54:25.299914 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.300063 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2gmkj" Nov 22 02:54:25 crc kubenswrapper[4922]: E1122 02:54:25.300305 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2gmkj" podUID="d5c8000a-a783-474f-a73a-55814c257a02" Nov 22 02:54:25 crc kubenswrapper[4922]: E1122 02:54:25.300469 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.334215 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=77.334186026 podStartE2EDuration="1m17.334186026s" podCreationTimestamp="2025-11-22 02:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:54:25.334164435 +0000 UTC m=+101.372686387" watchObservedRunningTime="2025-11-22 02:54:25.334186026 +0000 UTC m=+101.372707948" Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.354902 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podStartSLOduration=77.354829025 podStartE2EDuration="1m17.354829025s" podCreationTimestamp="2025-11-22 02:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:54:25.354144688 +0000 UTC m=+101.392666620" watchObservedRunningTime="2025-11-22 02:54:25.354829025 +0000 UTC m=+101.393350957" Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.360590 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.360838 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.361508 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.361770 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.361984 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:25Z","lastTransitionTime":"2025-11-22T02:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.437124 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-d4gbc" podStartSLOduration=77.437093615 podStartE2EDuration="1m17.437093615s" podCreationTimestamp="2025-11-22 02:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:54:25.436800928 +0000 UTC m=+101.475322820" watchObservedRunningTime="2025-11-22 02:54:25.437093615 +0000 UTC m=+101.475615527" Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.454472 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-df5ll" podStartSLOduration=77.454449554 podStartE2EDuration="1m17.454449554s" podCreationTimestamp="2025-11-22 02:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:54:25.454153718 +0000 UTC m=+101.492675610" watchObservedRunningTime="2025-11-22 02:54:25.454449554 +0000 UTC m=+101.492971456" Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.464148 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.464386 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.464541 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.464683 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.464818 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:25Z","lastTransitionTime":"2025-11-22T02:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.501071 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=44.50100409 podStartE2EDuration="44.50100409s" podCreationTimestamp="2025-11-22 02:53:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:54:25.498800878 +0000 UTC m=+101.537322810" watchObservedRunningTime="2025-11-22 02:54:25.50100409 +0000 UTC m=+101.539526022" Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.568490 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.568569 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.568590 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.568618 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.568637 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:25Z","lastTransitionTime":"2025-11-22T02:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.592325 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-n664h" podStartSLOduration=77.592296749 podStartE2EDuration="1m17.592296749s" podCreationTimestamp="2025-11-22 02:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:54:25.591027078 +0000 UTC m=+101.629549030" watchObservedRunningTime="2025-11-22 02:54:25.592296749 +0000 UTC m=+101.630818661" Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.650199 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-v8z92" podStartSLOduration=77.650178879 podStartE2EDuration="1m17.650178879s" podCreationTimestamp="2025-11-22 02:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:54:25.612581859 +0000 UTC m=+101.651103761" watchObservedRunningTime="2025-11-22 02:54:25.650178879 +0000 UTC m=+101.688700771" Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.650369 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=76.650366123 podStartE2EDuration="1m16.650366123s" podCreationTimestamp="2025-11-22 02:53:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:54:25.648446627 +0000 UTC m=+101.686968529" watchObservedRunningTime="2025-11-22 02:54:25.650366123 +0000 UTC m=+101.688888015" Nov 22 
02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.671365 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=78.67132707 podStartE2EDuration="1m18.67132707s" podCreationTimestamp="2025-11-22 02:53:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:54:25.66967872 +0000 UTC m=+101.708200622" watchObservedRunningTime="2025-11-22 02:54:25.67132707 +0000 UTC m=+101.709849002" Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.671452 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.671706 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.671734 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.671763 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.671787 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:25Z","lastTransitionTime":"2025-11-22T02:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.704332 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-ntq2p" podStartSLOduration=77.704303898 podStartE2EDuration="1m17.704303898s" podCreationTimestamp="2025-11-22 02:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:54:25.70274466 +0000 UTC m=+101.741266562" watchObservedRunningTime="2025-11-22 02:54:25.704303898 +0000 UTC m=+101.742825800" Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.715271 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=13.715240552000001 podStartE2EDuration="13.715240552s" podCreationTimestamp="2025-11-22 02:54:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:54:25.715032727 +0000 UTC m=+101.753554669" watchObservedRunningTime="2025-11-22 02:54:25.715240552 +0000 UTC m=+101.753762474" Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.775076 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.775137 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.775150 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.775176 4922 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeNotReady" Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.775192 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:25Z","lastTransitionTime":"2025-11-22T02:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.877797 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.877886 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.877908 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.877974 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.877995 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:25Z","lastTransitionTime":"2025-11-22T02:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.981573 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.981630 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.981645 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.981666 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:25 crc kubenswrapper[4922]: I1122 02:54:25.981680 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:25Z","lastTransitionTime":"2025-11-22T02:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 02:54:26 crc kubenswrapper[4922]: I1122 02:54:26.085172 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:26 crc kubenswrapper[4922]: I1122 02:54:26.085249 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:26 crc kubenswrapper[4922]: I1122 02:54:26.085273 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:26 crc kubenswrapper[4922]: I1122 02:54:26.085300 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:26 crc kubenswrapper[4922]: I1122 02:54:26.085319 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:26Z","lastTransitionTime":"2025-11-22T02:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:26 crc kubenswrapper[4922]: I1122 02:54:26.189603 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:26 crc kubenswrapper[4922]: I1122 02:54:26.189677 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:26 crc kubenswrapper[4922]: I1122 02:54:26.189697 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:26 crc kubenswrapper[4922]: I1122 02:54:26.189730 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:26 crc kubenswrapper[4922]: I1122 02:54:26.189754 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:26Z","lastTransitionTime":"2025-11-22T02:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 02:54:26 crc kubenswrapper[4922]: I1122 02:54:26.292194 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 02:54:26 crc kubenswrapper[4922]: I1122 02:54:26.292495 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 02:54:26 crc kubenswrapper[4922]: I1122 02:54:26.292589 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 02:54:26 crc kubenswrapper[4922]: I1122 02:54:26.292681 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 02:54:26 crc kubenswrapper[4922]: I1122 02:54:26.292766 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:26Z","lastTransitionTime":"2025-11-22T02:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Nov 22 02:54:26 crc kubenswrapper[4922]: I1122 02:54:26.299721 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 22 02:54:26 crc kubenswrapper[4922]: E1122 02:54:26.299951 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 22 02:54:26 crc kubenswrapper[4922]: I1122 02:54:26.396204 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:54:26 crc kubenswrapper[4922]: I1122 02:54:26.396275 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:54:26 crc kubenswrapper[4922]: I1122 02:54:26.396292 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:54:26 crc kubenswrapper[4922]: I1122 02:54:26.396320 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:54:26 crc kubenswrapper[4922]: I1122 02:54:26.396339 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:26Z","lastTransitionTime":"2025-11-22T02:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:54:26 crc kubenswrapper[4922]: I1122 02:54:26.500119 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:54:26 crc kubenswrapper[4922]: I1122 02:54:26.500192 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:54:26 crc kubenswrapper[4922]: I1122 02:54:26.500210 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:54:26 crc kubenswrapper[4922]: I1122 02:54:26.500242 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:54:26 crc kubenswrapper[4922]: I1122 02:54:26.500266 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:26Z","lastTransitionTime":"2025-11-22T02:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
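Every pod sync failure in this section repeats the same root cause: the kubelet finds no CNI configuration file in /etc/kubernetes/cni/net.d/. A minimal probe in the same spirit, assuming libcni's conventional .conf/.conflist/.json extensions (an assumption about the loader, not something read from this log):

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	dir := "/etc/kubernetes/cni/net.d" // directory named in the log
	entries, err := os.ReadDir(dir)
	if err != nil {
		fmt.Println("cannot read CNI conf dir:", err)
		return
	}
	var confs []string
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json": // assumed accepted extensions
			confs = append(confs, e.Name())
		}
	}
	if len(confs) == 0 {
		fmt.Println("no CNI configuration file found; network plugin not ready")
		return
	}
	fmt.Println("CNI configs:", confs)
}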
Nov 22 02:54:26 crc kubenswrapper[4922]: I1122 02:54:26.510558 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 02:54:26 crc kubenswrapper[4922]: I1122 02:54:26.510637 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 02:54:26 crc kubenswrapper[4922]: I1122 02:54:26.510656 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 02:54:26 crc kubenswrapper[4922]: I1122 02:54:26.510687 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 02:54:26 crc kubenswrapper[4922]: I1122 02:54:26.510705 4922 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T02:54:26Z","lastTransitionTime":"2025-11-22T02:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 02:54:26 crc kubenswrapper[4922]: I1122 02:54:26.592603 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbkc4"]
Nov 22 02:54:26 crc kubenswrapper[4922]: I1122 02:54:26.593383 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbkc4"
Nov 22 02:54:26 crc kubenswrapper[4922]: I1122 02:54:26.598389 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Nov 22 02:54:26 crc kubenswrapper[4922]: I1122 02:54:26.598529 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Nov 22 02:54:26 crc kubenswrapper[4922]: I1122 02:54:26.598913 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Nov 22 02:54:26 crc kubenswrapper[4922]: I1122 02:54:26.599529 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Nov 22 02:54:26 crc kubenswrapper[4922]: I1122 02:54:26.771315 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8e532ce3-fba4-4b1e-adbf-3b261f2da2c3-service-ca\") pod \"cluster-version-operator-5c965bbfc6-cbkc4\" (UID: \"8e532ce3-fba4-4b1e-adbf-3b261f2da2c3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbkc4"
Nov 22 02:54:26 crc kubenswrapper[4922]: I1122 02:54:26.771389 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e532ce3-fba4-4b1e-adbf-3b261f2da2c3-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-cbkc4\" (UID: \"8e532ce3-fba4-4b1e-adbf-3b261f2da2c3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbkc4"
Nov 22 02:54:26 crc kubenswrapper[4922]: I1122 02:54:26.771473 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e532ce3-fba4-4b1e-adbf-3b261f2da2c3-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-cbkc4\" (UID: \"8e532ce3-fba4-4b1e-adbf-3b261f2da2c3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbkc4"
Nov 22 02:54:26 crc kubenswrapper[4922]: I1122 02:54:26.771570 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8e532ce3-fba4-4b1e-adbf-3b261f2da2c3-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-cbkc4\" (UID: \"8e532ce3-fba4-4b1e-adbf-3b261f2da2c3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbkc4"
Nov 22 02:54:26 crc kubenswrapper[4922]: I1122 02:54:26.771603 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8e532ce3-fba4-4b1e-adbf-3b261f2da2c3-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-cbkc4\" (UID: \"8e532ce3-fba4-4b1e-adbf-3b261f2da2c3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbkc4"
Nov 22 02:54:26 crc kubenswrapper[4922]: I1122 02:54:26.872174 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8e532ce3-fba4-4b1e-adbf-3b261f2da2c3-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-cbkc4\" (UID: \"8e532ce3-fba4-4b1e-adbf-3b261f2da2c3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbkc4"
Nov 22 02:54:26 crc kubenswrapper[4922]: I1122 02:54:26.872267 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8e532ce3-fba4-4b1e-adbf-3b261f2da2c3-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-cbkc4\" (UID: \"8e532ce3-fba4-4b1e-adbf-3b261f2da2c3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbkc4"
Nov 22 02:54:26 crc kubenswrapper[4922]: I1122 02:54:26.872331 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8e532ce3-fba4-4b1e-adbf-3b261f2da2c3-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-cbkc4\" (UID: \"8e532ce3-fba4-4b1e-adbf-3b261f2da2c3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbkc4"
Nov 22 02:54:26 crc kubenswrapper[4922]: I1122 02:54:26.872398 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8e532ce3-fba4-4b1e-adbf-3b261f2da2c3-service-ca\") pod \"cluster-version-operator-5c965bbfc6-cbkc4\" (UID: \"8e532ce3-fba4-4b1e-adbf-3b261f2da2c3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbkc4"
Nov 22 02:54:26 crc kubenswrapper[4922]: I1122 02:54:26.872363 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8e532ce3-fba4-4b1e-adbf-3b261f2da2c3-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-cbkc4\" (UID: \"8e532ce3-fba4-4b1e-adbf-3b261f2da2c3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbkc4"
Nov 22 02:54:26 crc kubenswrapper[4922]: I1122 02:54:26.872428 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e532ce3-fba4-4b1e-adbf-3b261f2da2c3-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-cbkc4\" (UID: \"8e532ce3-fba4-4b1e-adbf-3b261f2da2c3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbkc4"
(UID: \"8e532ce3-fba4-4b1e-adbf-3b261f2da2c3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbkc4" Nov 22 02:54:26 crc kubenswrapper[4922]: I1122 02:54:26.872790 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e532ce3-fba4-4b1e-adbf-3b261f2da2c3-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-cbkc4\" (UID: \"8e532ce3-fba4-4b1e-adbf-3b261f2da2c3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbkc4" Nov 22 02:54:26 crc kubenswrapper[4922]: I1122 02:54:26.873929 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8e532ce3-fba4-4b1e-adbf-3b261f2da2c3-service-ca\") pod \"cluster-version-operator-5c965bbfc6-cbkc4\" (UID: \"8e532ce3-fba4-4b1e-adbf-3b261f2da2c3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbkc4" Nov 22 02:54:26 crc kubenswrapper[4922]: I1122 02:54:26.882244 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e532ce3-fba4-4b1e-adbf-3b261f2da2c3-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-cbkc4\" (UID: \"8e532ce3-fba4-4b1e-adbf-3b261f2da2c3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbkc4" Nov 22 02:54:26 crc kubenswrapper[4922]: I1122 02:54:26.897552 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e532ce3-fba4-4b1e-adbf-3b261f2da2c3-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-cbkc4\" (UID: \"8e532ce3-fba4-4b1e-adbf-3b261f2da2c3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbkc4" Nov 22 02:54:26 crc kubenswrapper[4922]: I1122 02:54:26.921638 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbkc4" Nov 22 02:54:26 crc kubenswrapper[4922]: I1122 02:54:26.973561 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d5c8000a-a783-474f-a73a-55814c257a02-metrics-certs\") pod \"network-metrics-daemon-2gmkj\" (UID: \"d5c8000a-a783-474f-a73a-55814c257a02\") " pod="openshift-multus/network-metrics-daemon-2gmkj" Nov 22 02:54:26 crc kubenswrapper[4922]: E1122 02:54:26.973926 4922 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 02:54:26 crc kubenswrapper[4922]: E1122 02:54:26.974034 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5c8000a-a783-474f-a73a-55814c257a02-metrics-certs podName:d5c8000a-a783-474f-a73a-55814c257a02 nodeName:}" failed. No retries permitted until 2025-11-22 02:55:30.974006607 +0000 UTC m=+167.012528539 (durationBeforeRetry 1m4s). 
Nov 22 02:54:27 crc kubenswrapper[4922]: I1122 02:54:27.034464 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbkc4" event={"ID":"8e532ce3-fba4-4b1e-adbf-3b261f2da2c3","Type":"ContainerStarted","Data":"f5556f722a71aaa63e66c774ab5e2cfbecc97b44eb96583b3d61e85e35341af7"}
Nov 22 02:54:27 crc kubenswrapper[4922]: I1122 02:54:27.299928 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 22 02:54:27 crc kubenswrapper[4922]: E1122 02:54:27.300165 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 22 02:54:27 crc kubenswrapper[4922]: I1122 02:54:27.300260 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2gmkj"
Nov 22 02:54:27 crc kubenswrapper[4922]: E1122 02:54:27.300343 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2gmkj" podUID="d5c8000a-a783-474f-a73a-55814c257a02"
Nov 22 02:54:27 crc kubenswrapper[4922]: I1122 02:54:27.300525 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 22 02:54:27 crc kubenswrapper[4922]: E1122 02:54:27.300739 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 22 02:54:28 crc kubenswrapper[4922]: I1122 02:54:28.040216 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbkc4" event={"ID":"8e532ce3-fba4-4b1e-adbf-3b261f2da2c3","Type":"ContainerStarted","Data":"fa4518bc73c809e236db14ba8eb54c56459161adeae176ae84aaeac20296963e"}
Nov 22 02:54:28 crc kubenswrapper[4922]: I1122 02:54:28.061597 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbkc4" podStartSLOduration=80.061577162 podStartE2EDuration="1m20.061577162s" podCreationTimestamp="2025-11-22 02:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:54:28.061513601 +0000 UTC m=+104.100035533" watchObservedRunningTime="2025-11-22 02:54:28.061577162 +0000 UTC m=+104.100099054"
Nov 22 02:54:28 crc kubenswrapper[4922]: I1122 02:54:28.300547 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 22 02:54:28 crc kubenswrapper[4922]: E1122 02:54:28.300727 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 22 02:54:29 crc kubenswrapper[4922]: I1122 02:54:29.299683 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 22 02:54:29 crc kubenswrapper[4922]: I1122 02:54:29.299687 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 22 02:54:29 crc kubenswrapper[4922]: E1122 02:54:29.300042 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 22 02:54:29 crc kubenswrapper[4922]: E1122 02:54:29.300203 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 22 02:54:29 crc kubenswrapper[4922]: I1122 02:54:29.300265 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2gmkj"
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2gmkj" Nov 22 02:54:29 crc kubenswrapper[4922]: E1122 02:54:29.300387 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2gmkj" podUID="d5c8000a-a783-474f-a73a-55814c257a02" Nov 22 02:54:30 crc kubenswrapper[4922]: I1122 02:54:30.300225 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:54:30 crc kubenswrapper[4922]: E1122 02:54:30.300461 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:54:31 crc kubenswrapper[4922]: I1122 02:54:31.299828 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:31 crc kubenswrapper[4922]: I1122 02:54:31.299828 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:54:31 crc kubenswrapper[4922]: E1122 02:54:31.300043 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:54:31 crc kubenswrapper[4922]: I1122 02:54:31.300183 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2gmkj" Nov 22 02:54:31 crc kubenswrapper[4922]: E1122 02:54:31.300323 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:54:31 crc kubenswrapper[4922]: E1122 02:54:31.300614 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2gmkj" podUID="d5c8000a-a783-474f-a73a-55814c257a02" Nov 22 02:54:32 crc kubenswrapper[4922]: I1122 02:54:32.300103 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:54:32 crc kubenswrapper[4922]: E1122 02:54:32.300729 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:54:33 crc kubenswrapper[4922]: I1122 02:54:33.532530 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:54:33 crc kubenswrapper[4922]: E1122 02:54:33.532669 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:54:33 crc kubenswrapper[4922]: I1122 02:54:33.532820 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:33 crc kubenswrapper[4922]: E1122 02:54:33.532929 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:54:33 crc kubenswrapper[4922]: I1122 02:54:33.533156 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2gmkj" Nov 22 02:54:33 crc kubenswrapper[4922]: E1122 02:54:33.533485 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2gmkj" podUID="d5c8000a-a783-474f-a73a-55814c257a02" Nov 22 02:54:33 crc kubenswrapper[4922]: I1122 02:54:33.533765 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:54:33 crc kubenswrapper[4922]: E1122 02:54:33.534049 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:54:34 crc kubenswrapper[4922]: I1122 02:54:34.301222 4922 scope.go:117] "RemoveContainer" containerID="31c2ef05aecb9ad4cf8ad1f7c2ddc7726e5b94201af8425c07707fedea3d20aa" Nov 22 02:54:34 crc kubenswrapper[4922]: E1122 02:54:34.301387 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-c7wvg_openshift-ovn-kubernetes(c2a6bcd8-bb13-463b-b112-0df3cf90b5f7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" podUID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" Nov 22 02:54:35 crc kubenswrapper[4922]: I1122 02:54:35.299602 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:35 crc kubenswrapper[4922]: I1122 02:54:35.299667 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:54:35 crc kubenswrapper[4922]: I1122 02:54:35.299743 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2gmkj" Nov 22 02:54:35 crc kubenswrapper[4922]: I1122 02:54:35.299743 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:54:35 crc kubenswrapper[4922]: E1122 02:54:35.301914 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:54:35 crc kubenswrapper[4922]: E1122 02:54:35.302056 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:54:35 crc kubenswrapper[4922]: E1122 02:54:35.302277 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:54:35 crc kubenswrapper[4922]: E1122 02:54:35.302418 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2gmkj" podUID="d5c8000a-a783-474f-a73a-55814c257a02" Nov 22 02:54:37 crc kubenswrapper[4922]: I1122 02:54:37.300315 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:54:37 crc kubenswrapper[4922]: I1122 02:54:37.301062 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2gmkj" Nov 22 02:54:37 crc kubenswrapper[4922]: E1122 02:54:37.301259 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:54:37 crc kubenswrapper[4922]: I1122 02:54:37.301356 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:37 crc kubenswrapper[4922]: I1122 02:54:37.301407 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:54:37 crc kubenswrapper[4922]: E1122 02:54:37.301549 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2gmkj" podUID="d5c8000a-a783-474f-a73a-55814c257a02" Nov 22 02:54:37 crc kubenswrapper[4922]: E1122 02:54:37.301714 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:54:37 crc kubenswrapper[4922]: E1122 02:54:37.301899 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:54:39 crc kubenswrapper[4922]: I1122 02:54:39.300397 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2gmkj" Nov 22 02:54:39 crc kubenswrapper[4922]: I1122 02:54:39.300444 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:54:39 crc kubenswrapper[4922]: I1122 02:54:39.300522 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:54:39 crc kubenswrapper[4922]: E1122 02:54:39.300786 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2gmkj" podUID="d5c8000a-a783-474f-a73a-55814c257a02" Nov 22 02:54:39 crc kubenswrapper[4922]: I1122 02:54:39.300824 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:39 crc kubenswrapper[4922]: E1122 02:54:39.301022 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:54:39 crc kubenswrapper[4922]: E1122 02:54:39.301164 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:54:39 crc kubenswrapper[4922]: E1122 02:54:39.301258 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:54:41 crc kubenswrapper[4922]: I1122 02:54:41.300556 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:54:41 crc kubenswrapper[4922]: I1122 02:54:41.300735 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:41 crc kubenswrapper[4922]: I1122 02:54:41.300801 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:54:41 crc kubenswrapper[4922]: I1122 02:54:41.301157 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2gmkj" Nov 22 02:54:41 crc kubenswrapper[4922]: E1122 02:54:41.301116 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:54:41 crc kubenswrapper[4922]: E1122 02:54:41.301356 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:54:41 crc kubenswrapper[4922]: E1122 02:54:41.301589 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2gmkj" podUID="d5c8000a-a783-474f-a73a-55814c257a02" Nov 22 02:54:41 crc kubenswrapper[4922]: E1122 02:54:41.301972 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:54:43 crc kubenswrapper[4922]: I1122 02:54:43.099064 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d4gbc_954bb7b8-d710-4e1a-973e-78c04e685f30/kube-multus/1.log" Nov 22 02:54:43 crc kubenswrapper[4922]: I1122 02:54:43.100408 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d4gbc_954bb7b8-d710-4e1a-973e-78c04e685f30/kube-multus/0.log" Nov 22 02:54:43 crc kubenswrapper[4922]: I1122 02:54:43.100525 4922 generic.go:334] "Generic (PLEG): container finished" podID="954bb7b8-d710-4e1a-973e-78c04e685f30" containerID="00bf6c08a8a7b113c82bd6c0f6de3dde945667c2f025d7dc3a84aaff43b989d5" exitCode=1 Nov 22 02:54:43 crc kubenswrapper[4922]: I1122 02:54:43.100598 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d4gbc" event={"ID":"954bb7b8-d710-4e1a-973e-78c04e685f30","Type":"ContainerDied","Data":"00bf6c08a8a7b113c82bd6c0f6de3dde945667c2f025d7dc3a84aaff43b989d5"} Nov 22 02:54:43 crc kubenswrapper[4922]: I1122 02:54:43.100668 4922 scope.go:117] "RemoveContainer" containerID="f8705bc3652c9a3f838ffdc44bb26f87689d9eca09105427a3dc25b730967ccd" Nov 22 02:54:43 crc kubenswrapper[4922]: I1122 02:54:43.101494 4922 scope.go:117] "RemoveContainer" containerID="00bf6c08a8a7b113c82bd6c0f6de3dde945667c2f025d7dc3a84aaff43b989d5" Nov 22 02:54:43 crc kubenswrapper[4922]: E1122 02:54:43.101928 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-d4gbc_openshift-multus(954bb7b8-d710-4e1a-973e-78c04e685f30)\"" pod="openshift-multus/multus-d4gbc" podUID="954bb7b8-d710-4e1a-973e-78c04e685f30" Nov 22 02:54:43 crc kubenswrapper[4922]: I1122 02:54:43.299655 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:54:43 crc kubenswrapper[4922]: I1122 02:54:43.299729 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:43 crc kubenswrapper[4922]: I1122 02:54:43.299655 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:54:43 crc kubenswrapper[4922]: E1122 02:54:43.299896 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:54:43 crc kubenswrapper[4922]: E1122 02:54:43.299974 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:54:43 crc kubenswrapper[4922]: E1122 02:54:43.300062 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:54:43 crc kubenswrapper[4922]: I1122 02:54:43.300086 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2gmkj" Nov 22 02:54:43 crc kubenswrapper[4922]: E1122 02:54:43.300189 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2gmkj" podUID="d5c8000a-a783-474f-a73a-55814c257a02" Nov 22 02:54:44 crc kubenswrapper[4922]: I1122 02:54:44.107277 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d4gbc_954bb7b8-d710-4e1a-973e-78c04e685f30/kube-multus/1.log" Nov 22 02:54:45 crc kubenswrapper[4922]: E1122 02:54:45.261278 4922 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Nov 22 02:54:45 crc kubenswrapper[4922]: I1122 02:54:45.299478 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2gmkj" Nov 22 02:54:45 crc kubenswrapper[4922]: I1122 02:54:45.300882 4922 util.go:30] "No sandbox for pod can be found. 
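kubelet_node_status.go:497 above fires once the node has failed to reach Ready within the kubelet's post-startup grace window. An external watcher with the same shape, polling the Ready condition under a hard timeout (the 5s/2m values are illustrative choices, and the kubeconfig path is assumed, as in the first sketch):

package main

import (
	"context"
	"fmt"
	"time"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/apimachinery/pkg/util/wait"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig") // assumed path
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	err = wait.PollImmediate(5*time.Second, 2*time.Minute, func() (bool, error) {
		node, err := cs.CoreV1().Nodes().Get(context.TODO(), "crc", metav1.GetOptions{})
		if err != nil {
			return false, nil // transient API error; keep polling
		}
		for _, c := range node.Status.Conditions {
			if c.Type == corev1.NodeReady {
				return c.Status == corev1.ConditionTrue, nil
			}
		}
		return false, nil
	})
	if err != nil {
		fmt.Println("node not becoming ready in time:", err)
	}
}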
Nov 22 02:54:45 crc kubenswrapper[4922]: E1122 02:54:45.300886 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2gmkj" podUID="d5c8000a-a783-474f-a73a-55814c257a02"
Nov 22 02:54:45 crc kubenswrapper[4922]: I1122 02:54:45.300925 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 22 02:54:45 crc kubenswrapper[4922]: I1122 02:54:45.301007 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 22 02:54:45 crc kubenswrapper[4922]: E1122 02:54:45.301070 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 22 02:54:45 crc kubenswrapper[4922]: E1122 02:54:45.301016 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 22 02:54:45 crc kubenswrapper[4922]: E1122 02:54:45.301242 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 22 02:54:45 crc kubenswrapper[4922]: E1122 02:54:45.461390 4922 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Nov 22 02:54:46 crc kubenswrapper[4922]: I1122 02:54:46.301783 4922 scope.go:117] "RemoveContainer" containerID="31c2ef05aecb9ad4cf8ad1f7c2ddc7726e5b94201af8425c07707fedea3d20aa"
Nov 22 02:54:46 crc kubenswrapper[4922]: E1122 02:54:46.302156 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-c7wvg_openshift-ovn-kubernetes(c2a6bcd8-bb13-463b-b112-0df3cf90b5f7)\"" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" podUID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7"
Nov 22 02:54:47 crc kubenswrapper[4922]: I1122 02:54:47.300165 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 22 02:54:47 crc kubenswrapper[4922]: I1122 02:54:47.300196 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 22 02:54:47 crc kubenswrapper[4922]: I1122 02:54:47.300229 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2gmkj"
Nov 22 02:54:47 crc kubenswrapper[4922]: E1122 02:54:47.300437 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 22 02:54:47 crc kubenswrapper[4922]: E1122 02:54:47.300509 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 22 02:54:47 crc kubenswrapper[4922]: I1122 02:54:47.300239 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 22 02:54:47 crc kubenswrapper[4922]: E1122 02:54:47.300934 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 22 02:54:47 crc kubenswrapper[4922]: E1122 02:54:47.301063 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2gmkj" podUID="d5c8000a-a783-474f-a73a-55814c257a02"
Nov 22 02:54:49 crc kubenswrapper[4922]: I1122 02:54:49.300621 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 22 02:54:49 crc kubenswrapper[4922]: I1122 02:54:49.300756 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 22 02:54:49 crc kubenswrapper[4922]: I1122 02:54:49.300832 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 22 02:54:49 crc kubenswrapper[4922]: E1122 02:54:49.300820 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 22 02:54:49 crc kubenswrapper[4922]: I1122 02:54:49.300756 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2gmkj"
Nov 22 02:54:49 crc kubenswrapper[4922]: E1122 02:54:49.301057 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 22 02:54:49 crc kubenswrapper[4922]: E1122 02:54:49.301286 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2gmkj" podUID="d5c8000a-a783-474f-a73a-55814c257a02"
Nov 22 02:54:49 crc kubenswrapper[4922]: E1122 02:54:49.301479 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 22 02:54:50 crc kubenswrapper[4922]: E1122 02:54:50.463270 4922 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Nov 22 02:54:51 crc kubenswrapper[4922]: I1122 02:54:51.300567 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 22 02:54:51 crc kubenswrapper[4922]: I1122 02:54:51.300698 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2gmkj"
Nov 22 02:54:51 crc kubenswrapper[4922]: I1122 02:54:51.300764 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 22 02:54:51 crc kubenswrapper[4922]: E1122 02:54:51.300787 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 22 02:54:51 crc kubenswrapper[4922]: I1122 02:54:51.300586 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 22 02:54:51 crc kubenswrapper[4922]: E1122 02:54:51.300957 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2gmkj" podUID="d5c8000a-a783-474f-a73a-55814c257a02"
Nov 22 02:54:51 crc kubenswrapper[4922]: E1122 02:54:51.301089 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 22 02:54:51 crc kubenswrapper[4922]: E1122 02:54:51.301147 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 22 02:54:53 crc kubenswrapper[4922]: I1122 02:54:53.300424 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2gmkj"
Nov 22 02:54:53 crc kubenswrapper[4922]: I1122 02:54:53.300459 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 22 02:54:53 crc kubenswrapper[4922]: I1122 02:54:53.300529 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 22 02:54:53 crc kubenswrapper[4922]: E1122 02:54:53.300611 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2gmkj" podUID="d5c8000a-a783-474f-a73a-55814c257a02"
Nov 22 02:54:53 crc kubenswrapper[4922]: E1122 02:54:53.300808 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 22 02:54:53 crc kubenswrapper[4922]: I1122 02:54:53.300892 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 22 02:54:53 crc kubenswrapper[4922]: E1122 02:54:53.301083 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 22 02:54:53 crc kubenswrapper[4922]: E1122 02:54:53.301199 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 22 02:54:55 crc kubenswrapper[4922]: I1122 02:54:55.300693 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2gmkj"
Nov 22 02:54:55 crc kubenswrapper[4922]: I1122 02:54:55.300716 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 22 02:54:55 crc kubenswrapper[4922]: I1122 02:54:55.300756 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 22 02:54:55 crc kubenswrapper[4922]: I1122 02:54:55.302575 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 22 02:54:55 crc kubenswrapper[4922]: E1122 02:54:55.302785 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2gmkj" podUID="d5c8000a-a783-474f-a73a-55814c257a02"
Nov 22 02:54:55 crc kubenswrapper[4922]: E1122 02:54:55.302860 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 22 02:54:55 crc kubenswrapper[4922]: E1122 02:54:55.302944 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 22 02:54:55 crc kubenswrapper[4922]: E1122 02:54:55.303023 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 22 02:54:55 crc kubenswrapper[4922]: I1122 02:54:55.303144 4922 scope.go:117] "RemoveContainer" containerID="00bf6c08a8a7b113c82bd6c0f6de3dde945667c2f025d7dc3a84aaff43b989d5"
Nov 22 02:54:55 crc kubenswrapper[4922]: E1122 02:54:55.464589 4922 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Nov 22 02:54:56 crc kubenswrapper[4922]: I1122 02:54:56.159428 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d4gbc_954bb7b8-d710-4e1a-973e-78c04e685f30/kube-multus/1.log"
Nov 22 02:54:56 crc kubenswrapper[4922]: I1122 02:54:56.159507 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d4gbc" event={"ID":"954bb7b8-d710-4e1a-973e-78c04e685f30","Type":"ContainerStarted","Data":"48c067afb41b86f3c7a21f1573ef0c9a87523ea9d2bc5bd76723c492a588b7a7"}
Nov 22 02:54:57 crc kubenswrapper[4922]: I1122 02:54:57.300292 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 22 02:54:57 crc kubenswrapper[4922]: I1122 02:54:57.300292 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2gmkj"
Nov 22 02:54:57 crc kubenswrapper[4922]: I1122 02:54:57.300444 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 22 02:54:57 crc kubenswrapper[4922]: E1122 02:54:57.301348 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 22 02:54:57 crc kubenswrapper[4922]: E1122 02:54:57.301466 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2gmkj" podUID="d5c8000a-a783-474f-a73a-55814c257a02"
Nov 22 02:54:57 crc kubenswrapper[4922]: I1122 02:54:57.300641 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:57 crc kubenswrapper[4922]: E1122 02:54:57.301607 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:54:57 crc kubenswrapper[4922]: E1122 02:54:57.301890 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:54:59 crc kubenswrapper[4922]: I1122 02:54:59.299880 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:54:59 crc kubenswrapper[4922]: I1122 02:54:59.299958 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:54:59 crc kubenswrapper[4922]: I1122 02:54:59.300061 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2gmkj" Nov 22 02:54:59 crc kubenswrapper[4922]: I1122 02:54:59.300164 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:54:59 crc kubenswrapper[4922]: E1122 02:54:59.300256 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:54:59 crc kubenswrapper[4922]: E1122 02:54:59.300384 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2gmkj" podUID="d5c8000a-a783-474f-a73a-55814c257a02" Nov 22 02:54:59 crc kubenswrapper[4922]: E1122 02:54:59.300515 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:54:59 crc kubenswrapper[4922]: E1122 02:54:59.301049 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:54:59 crc kubenswrapper[4922]: I1122 02:54:59.301275 4922 scope.go:117] "RemoveContainer" containerID="31c2ef05aecb9ad4cf8ad1f7c2ddc7726e5b94201af8425c07707fedea3d20aa" Nov 22 02:55:00 crc kubenswrapper[4922]: I1122 02:55:00.178075 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c7wvg_c2a6bcd8-bb13-463b-b112-0df3cf90b5f7/ovnkube-controller/3.log" Nov 22 02:55:00 crc kubenswrapper[4922]: I1122 02:55:00.182325 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" event={"ID":"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7","Type":"ContainerStarted","Data":"d98e05ccaa5607d33059defd7c078c35da084cd8b2bcfc6c3decc87d3023fbba"} Nov 22 02:55:00 crc kubenswrapper[4922]: I1122 02:55:00.182925 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:55:00 crc kubenswrapper[4922]: I1122 02:55:00.220321 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" podStartSLOduration=112.220297854 podStartE2EDuration="1m52.220297854s" podCreationTimestamp="2025-11-22 02:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:55:00.219294471 +0000 UTC m=+136.257816373" watchObservedRunningTime="2025-11-22 02:55:00.220297854 +0000 UTC m=+136.258819746" Nov 22 02:55:00 crc kubenswrapper[4922]: I1122 02:55:00.230224 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2gmkj"] Nov 22 02:55:00 crc kubenswrapper[4922]: I1122 02:55:00.230343 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2gmkj" Nov 22 02:55:00 crc kubenswrapper[4922]: E1122 02:55:00.230450 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2gmkj" podUID="d5c8000a-a783-474f-a73a-55814c257a02" Nov 22 02:55:00 crc kubenswrapper[4922]: E1122 02:55:00.467810 4922 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 22 02:55:01 crc kubenswrapper[4922]: I1122 02:55:01.299567 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:55:01 crc kubenswrapper[4922]: I1122 02:55:01.299743 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:55:01 crc kubenswrapper[4922]: I1122 02:55:01.299883 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:55:01 crc kubenswrapper[4922]: E1122 02:55:01.299882 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:55:01 crc kubenswrapper[4922]: E1122 02:55:01.300022 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:55:01 crc kubenswrapper[4922]: I1122 02:55:01.300121 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2gmkj" Nov 22 02:55:01 crc kubenswrapper[4922]: E1122 02:55:01.300181 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:55:01 crc kubenswrapper[4922]: E1122 02:55:01.300249 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2gmkj" podUID="d5c8000a-a783-474f-a73a-55814c257a02" Nov 22 02:55:03 crc kubenswrapper[4922]: I1122 02:55:03.299541 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:55:03 crc kubenswrapper[4922]: E1122 02:55:03.299697 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:55:03 crc kubenswrapper[4922]: I1122 02:55:03.299931 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2gmkj" Nov 22 02:55:03 crc kubenswrapper[4922]: E1122 02:55:03.299985 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2gmkj" podUID="d5c8000a-a783-474f-a73a-55814c257a02" Nov 22 02:55:03 crc kubenswrapper[4922]: I1122 02:55:03.300089 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:55:03 crc kubenswrapper[4922]: E1122 02:55:03.300129 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:55:03 crc kubenswrapper[4922]: I1122 02:55:03.300255 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:55:03 crc kubenswrapper[4922]: E1122 02:55:03.300328 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:55:05 crc kubenswrapper[4922]: I1122 02:55:05.300621 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:55:05 crc kubenswrapper[4922]: I1122 02:55:05.300751 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:55:05 crc kubenswrapper[4922]: I1122 02:55:05.301501 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2gmkj" Nov 22 02:55:05 crc kubenswrapper[4922]: I1122 02:55:05.302518 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:55:05 crc kubenswrapper[4922]: E1122 02:55:05.302520 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 02:55:05 crc kubenswrapper[4922]: E1122 02:55:05.302600 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2gmkj" podUID="d5c8000a-a783-474f-a73a-55814c257a02" Nov 22 02:55:05 crc kubenswrapper[4922]: E1122 02:55:05.302710 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 02:55:05 crc kubenswrapper[4922]: E1122 02:55:05.302431 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.300249 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.300382 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2gmkj" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.300564 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.300782 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.302904 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.303288 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.303476 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.303564 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.303617 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.303696 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.339372 4922 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.386294 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-mmrpx"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.387071 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-mmrpx" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.388224 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-gsfk7"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.388967 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-gsfk7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.389149 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.390005 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-qqgzd"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.390645 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-qqgzd" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.402813 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.403143 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.403285 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.403377 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.403517 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.405016 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.405099 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.405457 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.406616 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7zzh7"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.406811 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.406820 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.408016 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-7zzh7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.412753 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.413401 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.413898 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.415165 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.416955 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.417943 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-k5v9d"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.448234 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-k5v9d" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.449143 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-ndkq8"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.449967 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.450047 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ndkq8" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.450146 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.450158 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.450298 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p6ktw"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.450305 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.451356 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p6ktw" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.457687 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.460127 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.460208 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.461676 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.462006 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.462015 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.462461 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.462578 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.465552 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.465672 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.465810 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.465907 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.466062 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.466120 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.466170 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.466309 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.466386 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.466413 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.466561 4922 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.466318 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.466936 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.466983 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.467009 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.467214 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.466355 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.465558 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.467374 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.467426 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.467491 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.467895 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.468512 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-mwb28"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.469282 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-mwb28" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.469683 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-skmr7"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.470301 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-skmr7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.472465 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.473183 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.474264 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.482533 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.482945 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.483193 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.483275 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.483344 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.483410 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.483427 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.483306 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.483497 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.483915 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.484022 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-56bkw"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.489733 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-56bkw" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.491573 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.494770 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5gzxz"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.495614 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5gzxz" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.497040 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ksllc"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.497713 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d25rh"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.500952 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ksllc" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.502768 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.505050 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d25rh" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.511253 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.511597 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.512285 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.543925 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.544154 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-bfqj8"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.544942 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bfqj8" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.544962 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.545085 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.545206 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.545279 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.545507 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.545552 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.546298 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.547225 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.547349 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.547878 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.548163 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.548441 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.548542 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.548649 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.548829 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.548999 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.549700 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.549837 4922 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-lns87"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.550433 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-n88n6"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.550685 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-5hhlh"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.551634 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.551776 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-7zzh7\" (UID: \"f267ec6b-64da-4065-b3aa-2e66ac957118\") " pod="openshift-authentication/oauth-openshift-558db77b4-7zzh7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.551831 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a96d295d-cf93-4a96-ac7d-cc85ed8221da-node-pullsecrets\") pod \"apiserver-76f77b778f-mmrpx\" (UID: \"a96d295d-cf93-4a96-ac7d-cc85ed8221da\") " pod="openshift-apiserver/apiserver-76f77b778f-mmrpx" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.551882 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-7zzh7\" (UID: \"f267ec6b-64da-4065-b3aa-2e66ac957118\") " pod="openshift-authentication/oauth-openshift-558db77b4-7zzh7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.551906 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lns87" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.551915 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-7zzh7\" (UID: \"f267ec6b-64da-4065-b3aa-2e66ac957118\") " pod="openshift-authentication/oauth-openshift-558db77b4-7zzh7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.552616 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1844c05f-b2a6-4abc-b4db-89223a5e6d60-service-ca-bundle\") pod \"authentication-operator-69f744f599-k5v9d\" (UID: \"1844c05f-b2a6-4abc-b4db-89223a5e6d60\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5v9d" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.552661 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v849m\" (UniqueName: \"kubernetes.io/projected/9109383b-40f8-49d7-a601-1d048c4d8686-kube-api-access-v849m\") pod \"machine-api-operator-5694c8668f-qqgzd\" (UID: \"9109383b-40f8-49d7-a601-1d048c4d8686\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qqgzd" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.552699 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a96d295d-cf93-4a96-ac7d-cc85ed8221da-audit-dir\") pod \"apiserver-76f77b778f-mmrpx\" (UID: \"a96d295d-cf93-4a96-ac7d-cc85ed8221da\") " pod="openshift-apiserver/apiserver-76f77b778f-mmrpx" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.552716 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-n88n6" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.552717 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-7zzh7\" (UID: \"f267ec6b-64da-4065-b3aa-2e66ac957118\") " pod="openshift-authentication/oauth-openshift-558db77b4-7zzh7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.553086 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-7zzh7\" (UID: \"f267ec6b-64da-4065-b3aa-2e66ac957118\") " pod="openshift-authentication/oauth-openshift-558db77b4-7zzh7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.553121 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crppb\" (UniqueName: \"kubernetes.io/projected/3a67fd92-1760-46af-a0b4-0a52c965c63e-kube-api-access-crppb\") pod \"console-operator-58897d9998-mwb28\" (UID: \"3a67fd92-1760-46af-a0b4-0a52c965c63e\") " pod="openshift-console-operator/console-operator-58897d9998-mwb28" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.553147 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9507632c-3232-44dc-a75f-1275a2f57145-serving-cert\") pod \"controller-manager-879f6c89f-gsfk7\" (UID: \"9507632c-3232-44dc-a75f-1275a2f57145\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gsfk7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.553174 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1844c05f-b2a6-4abc-b4db-89223a5e6d60-config\") pod \"authentication-operator-69f744f599-k5v9d\" (UID: \"1844c05f-b2a6-4abc-b4db-89223a5e6d60\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5v9d" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.553209 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a96d295d-cf93-4a96-ac7d-cc85ed8221da-trusted-ca-bundle\") pod \"apiserver-76f77b778f-mmrpx\" (UID: \"a96d295d-cf93-4a96-ac7d-cc85ed8221da\") " pod="openshift-apiserver/apiserver-76f77b778f-mmrpx" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.553244 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5z6q\" (UniqueName: \"kubernetes.io/projected/b6ef9de0-f249-438a-94c0-0a359bd88889-kube-api-access-t5z6q\") pod \"openshift-apiserver-operator-796bbdcf4f-skmr7\" (UID: \"b6ef9de0-f249-438a-94c0-0a359bd88889\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-skmr7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.553302 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4cd7913-ec11-4aa6-9351-23821b3cfbcd-config\") pod \"machine-approver-56656f9798-ndkq8\" (UID: 
\"e4cd7913-ec11-4aa6-9351-23821b3cfbcd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ndkq8" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.553322 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a96d295d-cf93-4a96-ac7d-cc85ed8221da-config\") pod \"apiserver-76f77b778f-mmrpx\" (UID: \"a96d295d-cf93-4a96-ac7d-cc85ed8221da\") " pod="openshift-apiserver/apiserver-76f77b778f-mmrpx" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.553346 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9507632c-3232-44dc-a75f-1275a2f57145-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-gsfk7\" (UID: \"9507632c-3232-44dc-a75f-1275a2f57145\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gsfk7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.553363 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a96d295d-cf93-4a96-ac7d-cc85ed8221da-etcd-client\") pod \"apiserver-76f77b778f-mmrpx\" (UID: \"a96d295d-cf93-4a96-ac7d-cc85ed8221da\") " pod="openshift-apiserver/apiserver-76f77b778f-mmrpx" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.553384 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-7zzh7\" (UID: \"f267ec6b-64da-4065-b3aa-2e66ac957118\") " pod="openshift-authentication/oauth-openshift-558db77b4-7zzh7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.553442 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg8wj\" (UniqueName: \"kubernetes.io/projected/5289b0a6-6c1f-4fb4-972f-552899994896-kube-api-access-cg8wj\") pod \"cluster-samples-operator-665b6dd947-p6ktw\" (UID: \"5289b0a6-6c1f-4fb4-972f-552899994896\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p6ktw" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.553484 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5289b0a6-6c1f-4fb4-972f-552899994896-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-p6ktw\" (UID: \"5289b0a6-6c1f-4fb4-972f-552899994896\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p6ktw" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.553513 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-7zzh7\" (UID: \"f267ec6b-64da-4065-b3aa-2e66ac957118\") " pod="openshift-authentication/oauth-openshift-558db77b4-7zzh7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.553543 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9109383b-40f8-49d7-a601-1d048c4d8686-config\") pod \"machine-api-operator-5694c8668f-qqgzd\" (UID: 
\"9109383b-40f8-49d7-a601-1d048c4d8686\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qqgzd" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.553572 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a96d295d-cf93-4a96-ac7d-cc85ed8221da-serving-cert\") pod \"apiserver-76f77b778f-mmrpx\" (UID: \"a96d295d-cf93-4a96-ac7d-cc85ed8221da\") " pod="openshift-apiserver/apiserver-76f77b778f-mmrpx" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.553592 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6ef9de0-f249-438a-94c0-0a359bd88889-config\") pod \"openshift-apiserver-operator-796bbdcf4f-skmr7\" (UID: \"b6ef9de0-f249-438a-94c0-0a359bd88889\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-skmr7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.553611 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3a67fd92-1760-46af-a0b4-0a52c965c63e-trusted-ca\") pod \"console-operator-58897d9998-mwb28\" (UID: \"3a67fd92-1760-46af-a0b4-0a52c965c63e\") " pod="openshift-console-operator/console-operator-58897d9998-mwb28" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.553635 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9507632c-3232-44dc-a75f-1275a2f57145-client-ca\") pod \"controller-manager-879f6c89f-gsfk7\" (UID: \"9507632c-3232-44dc-a75f-1275a2f57145\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gsfk7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.553654 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f267ec6b-64da-4065-b3aa-2e66ac957118-audit-dir\") pod \"oauth-openshift-558db77b4-7zzh7\" (UID: \"f267ec6b-64da-4065-b3aa-2e66ac957118\") " pod="openshift-authentication/oauth-openshift-558db77b4-7zzh7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.553681 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e4cd7913-ec11-4aa6-9351-23821b3cfbcd-auth-proxy-config\") pod \"machine-approver-56656f9798-ndkq8\" (UID: \"e4cd7913-ec11-4aa6-9351-23821b3cfbcd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ndkq8" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.553699 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1844c05f-b2a6-4abc-b4db-89223a5e6d60-serving-cert\") pod \"authentication-operator-69f744f599-k5v9d\" (UID: \"1844c05f-b2a6-4abc-b4db-89223a5e6d60\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5v9d" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.553719 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f267ec6b-64da-4065-b3aa-2e66ac957118-audit-policies\") pod \"oauth-openshift-558db77b4-7zzh7\" (UID: \"f267ec6b-64da-4065-b3aa-2e66ac957118\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-7zzh7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.553741 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/a96d295d-cf93-4a96-ac7d-cc85ed8221da-image-import-ca\") pod \"apiserver-76f77b778f-mmrpx\" (UID: \"a96d295d-cf93-4a96-ac7d-cc85ed8221da\") " pod="openshift-apiserver/apiserver-76f77b778f-mmrpx" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.553767 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/9109383b-40f8-49d7-a601-1d048c4d8686-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-qqgzd\" (UID: \"9109383b-40f8-49d7-a601-1d048c4d8686\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qqgzd" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.553791 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-7zzh7\" (UID: \"f267ec6b-64da-4065-b3aa-2e66ac957118\") " pod="openshift-authentication/oauth-openshift-558db77b4-7zzh7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.553517 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.553906 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/e4cd7913-ec11-4aa6-9351-23821b3cfbcd-machine-approver-tls\") pod \"machine-approver-56656f9798-ndkq8\" (UID: \"e4cd7913-ec11-4aa6-9351-23821b3cfbcd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ndkq8" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.553664 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.553942 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a96d295d-cf93-4a96-ac7d-cc85ed8221da-encryption-config\") pod \"apiserver-76f77b778f-mmrpx\" (UID: \"a96d295d-cf93-4a96-ac7d-cc85ed8221da\") " pod="openshift-apiserver/apiserver-76f77b778f-mmrpx" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.553697 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.553981 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a67fd92-1760-46af-a0b4-0a52c965c63e-config\") pod \"console-operator-58897d9998-mwb28\" (UID: \"3a67fd92-1760-46af-a0b4-0a52c965c63e\") " pod="openshift-console-operator/console-operator-58897d9998-mwb28" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.553786 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.554051 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1844c05f-b2a6-4abc-b4db-89223a5e6d60-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-k5v9d\" (UID: \"1844c05f-b2a6-4abc-b4db-89223a5e6d60\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5v9d" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.554083 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sl6c\" (UniqueName: \"kubernetes.io/projected/e4cd7913-ec11-4aa6-9351-23821b3cfbcd-kube-api-access-2sl6c\") pod \"machine-approver-56656f9798-ndkq8\" (UID: \"e4cd7913-ec11-4aa6-9351-23821b3cfbcd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ndkq8" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.554101 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-7zzh7\" (UID: \"f267ec6b-64da-4065-b3aa-2e66ac957118\") " pod="openshift-authentication/oauth-openshift-558db77b4-7zzh7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.554125 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9109383b-40f8-49d7-a601-1d048c4d8686-images\") pod \"machine-api-operator-5694c8668f-qqgzd\" (UID: \"9109383b-40f8-49d7-a601-1d048c4d8686\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qqgzd" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.554143 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6ef9de0-f249-438a-94c0-0a359bd88889-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-skmr7\" (UID: \"b6ef9de0-f249-438a-94c0-0a359bd88889\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-skmr7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.554164 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a67fd92-1760-46af-a0b4-0a52c965c63e-serving-cert\") pod \"console-operator-58897d9998-mwb28\" (UID: \"3a67fd92-1760-46af-a0b4-0a52c965c63e\") " pod="openshift-console-operator/console-operator-58897d9998-mwb28" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.554202 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxxl2\" (UniqueName: \"kubernetes.io/projected/f267ec6b-64da-4065-b3aa-2e66ac957118-kube-api-access-pxxl2\") pod \"oauth-openshift-558db77b4-7zzh7\" (UID: \"f267ec6b-64da-4065-b3aa-2e66ac957118\") " pod="openshift-authentication/oauth-openshift-558db77b4-7zzh7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.554226 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbj65\" (UniqueName: \"kubernetes.io/projected/9507632c-3232-44dc-a75f-1275a2f57145-kube-api-access-fbj65\") pod \"controller-manager-879f6c89f-gsfk7\" (UID: \"9507632c-3232-44dc-a75f-1275a2f57145\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gsfk7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.554246 4922 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.554265 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfzch\" (UniqueName: \"kubernetes.io/projected/a96d295d-cf93-4a96-ac7d-cc85ed8221da-kube-api-access-mfzch\") pod \"apiserver-76f77b778f-mmrpx\" (UID: \"a96d295d-cf93-4a96-ac7d-cc85ed8221da\") " pod="openshift-apiserver/apiserver-76f77b778f-mmrpx" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.554297 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-7zzh7\" (UID: \"f267ec6b-64da-4065-b3aa-2e66ac957118\") " pod="openshift-authentication/oauth-openshift-558db77b4-7zzh7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.554338 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9507632c-3232-44dc-a75f-1275a2f57145-config\") pod \"controller-manager-879f6c89f-gsfk7\" (UID: \"9507632c-3232-44dc-a75f-1275a2f57145\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gsfk7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.554358 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-7zzh7\" (UID: \"f267ec6b-64da-4065-b3aa-2e66ac957118\") " pod="openshift-authentication/oauth-openshift-558db77b4-7zzh7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.554409 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlq74\" (UniqueName: \"kubernetes.io/projected/1844c05f-b2a6-4abc-b4db-89223a5e6d60-kube-api-access-zlq74\") pod \"authentication-operator-69f744f599-k5v9d\" (UID: \"1844c05f-b2a6-4abc-b4db-89223a5e6d60\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5v9d" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.554435 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/a96d295d-cf93-4a96-ac7d-cc85ed8221da-audit\") pod \"apiserver-76f77b778f-mmrpx\" (UID: \"a96d295d-cf93-4a96-ac7d-cc85ed8221da\") " pod="openshift-apiserver/apiserver-76f77b778f-mmrpx" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.554455 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a96d295d-cf93-4a96-ac7d-cc85ed8221da-etcd-serving-ca\") pod \"apiserver-76f77b778f-mmrpx\" (UID: \"a96d295d-cf93-4a96-ac7d-cc85ed8221da\") " pod="openshift-apiserver/apiserver-76f77b778f-mmrpx" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.556415 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.556660 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.556826 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.557305 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.557424 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.557650 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.558370 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.559315 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-dj4jp"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.559998 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-dj4jp" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.560457 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-lvltf"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.560860 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lvltf" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.563260 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7fgw4"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.564143 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7fgw4" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.565052 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-j2nbv"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.565737 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-j2nbv" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.567418 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7lpgh"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.568280 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-6rz2q"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.568424 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7lpgh" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.569304 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vmg24"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.569420 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-6rz2q" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.569752 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vmg24" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.570391 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.570474 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.570774 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.570780 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.573536 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.576456 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.578805 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4vw74"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.580021 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-86q9x"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.581537 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4vw74" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.584538 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.585159 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-86q9x" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.593186 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.594072 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-jb5md"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.597375 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-9jfnp"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.597757 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-jb5md" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.600973 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-n5wk8"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.601259 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9jfnp" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.604871 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.605430 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n5wk8" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.610271 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n89mb"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.610991 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n89mb" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.611439 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-hjlwb"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.612470 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-hjlwb" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.616694 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cm895"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.617416 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-cm895" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.617724 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s7kz"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.618658 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-nxqd9"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.619106 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nxqd9" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.619429 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s7kz" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.620992 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396325-vl7tv"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.622047 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396325-vl7tv" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.622573 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.622716 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-gjqfl"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.623730 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-gjqfl" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.625523 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xlb7z"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.626541 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-wxkjb"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.626651 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xlb7z" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.628112 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4hrnn"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.628249 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-wxkjb" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.629267 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-gsfk7"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.629374 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4hrnn" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.629468 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-k5v9d"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.630647 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7zzh7"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.631671 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-qqgzd"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.632895 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-skmr7"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.633889 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-mwb28"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.635563 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-lns87"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.636667 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5gzxz"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.637649 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7fgw4"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.638876 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ksllc"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.639716 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-5hhlh"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.640708 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-lvltf"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.642728 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-dj4jp"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.645140 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-mmrpx"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.646438 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-n88n6"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.648321 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-56bkw"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.650386 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d25rh"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.650719 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-hjlwb"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.651729 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s7kz"] Nov 22 02:55:07 crc 
kubenswrapper[4922]: I1122 02:55:07.652815 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-86q9x"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.653772 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7lpgh"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.654976 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfzch\" (UniqueName: \"kubernetes.io/projected/a96d295d-cf93-4a96-ac7d-cc85ed8221da-kube-api-access-mfzch\") pod \"apiserver-76f77b778f-mmrpx\" (UID: \"a96d295d-cf93-4a96-ac7d-cc85ed8221da\") " pod="openshift-apiserver/apiserver-76f77b778f-mmrpx" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.655002 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-7zzh7\" (UID: \"f267ec6b-64da-4065-b3aa-2e66ac957118\") " pod="openshift-authentication/oauth-openshift-558db77b4-7zzh7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.655029 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9507632c-3232-44dc-a75f-1275a2f57145-config\") pod \"controller-manager-879f6c89f-gsfk7\" (UID: \"9507632c-3232-44dc-a75f-1275a2f57145\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gsfk7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.655047 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-7zzh7\" (UID: \"f267ec6b-64da-4065-b3aa-2e66ac957118\") " pod="openshift-authentication/oauth-openshift-558db77b4-7zzh7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.655071 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c93e87aa-7ece-4e0d-9c17-e4bc0b5bf0a3-bound-sa-token\") pod \"ingress-operator-5b745b69d9-lns87\" (UID: \"c93e87aa-7ece-4e0d-9c17-e4bc0b5bf0a3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lns87" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.655089 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9vct\" (UniqueName: \"kubernetes.io/projected/c93e87aa-7ece-4e0d-9c17-e4bc0b5bf0a3-kube-api-access-c9vct\") pod \"ingress-operator-5b745b69d9-lns87\" (UID: \"c93e87aa-7ece-4e0d-9c17-e4bc0b5bf0a3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lns87" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.655112 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlq74\" (UniqueName: \"kubernetes.io/projected/1844c05f-b2a6-4abc-b4db-89223a5e6d60-kube-api-access-zlq74\") pod \"authentication-operator-69f744f599-k5v9d\" (UID: \"1844c05f-b2a6-4abc-b4db-89223a5e6d60\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5v9d" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.655133 4922 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/a96d295d-cf93-4a96-ac7d-cc85ed8221da-audit\") pod \"apiserver-76f77b778f-mmrpx\" (UID: \"a96d295d-cf93-4a96-ac7d-cc85ed8221da\") " pod="openshift-apiserver/apiserver-76f77b778f-mmrpx" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.655151 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a96d295d-cf93-4a96-ac7d-cc85ed8221da-etcd-serving-ca\") pod \"apiserver-76f77b778f-mmrpx\" (UID: \"a96d295d-cf93-4a96-ac7d-cc85ed8221da\") " pod="openshift-apiserver/apiserver-76f77b778f-mmrpx" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.655170 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-7zzh7\" (UID: \"f267ec6b-64da-4065-b3aa-2e66ac957118\") " pod="openshift-authentication/oauth-openshift-558db77b4-7zzh7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.655190 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a96d295d-cf93-4a96-ac7d-cc85ed8221da-node-pullsecrets\") pod \"apiserver-76f77b778f-mmrpx\" (UID: \"a96d295d-cf93-4a96-ac7d-cc85ed8221da\") " pod="openshift-apiserver/apiserver-76f77b778f-mmrpx" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.655211 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-7zzh7\" (UID: \"f267ec6b-64da-4065-b3aa-2e66ac957118\") " pod="openshift-authentication/oauth-openshift-558db77b4-7zzh7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.655355 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-7zzh7\" (UID: \"f267ec6b-64da-4065-b3aa-2e66ac957118\") " pod="openshift-authentication/oauth-openshift-558db77b4-7zzh7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.655453 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1844c05f-b2a6-4abc-b4db-89223a5e6d60-service-ca-bundle\") pod \"authentication-operator-69f744f599-k5v9d\" (UID: \"1844c05f-b2a6-4abc-b4db-89223a5e6d60\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5v9d" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.655512 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v849m\" (UniqueName: \"kubernetes.io/projected/9109383b-40f8-49d7-a601-1d048c4d8686-kube-api-access-v849m\") pod \"machine-api-operator-5694c8668f-qqgzd\" (UID: \"9109383b-40f8-49d7-a601-1d048c4d8686\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qqgzd" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.655584 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a96d295d-cf93-4a96-ac7d-cc85ed8221da-audit-dir\") pod \"apiserver-76f77b778f-mmrpx\" 
(UID: \"a96d295d-cf93-4a96-ac7d-cc85ed8221da\") " pod="openshift-apiserver/apiserver-76f77b778f-mmrpx" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.655670 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-7zzh7\" (UID: \"f267ec6b-64da-4065-b3aa-2e66ac957118\") " pod="openshift-authentication/oauth-openshift-558db77b4-7zzh7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.655772 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-7zzh7\" (UID: \"f267ec6b-64da-4065-b3aa-2e66ac957118\") " pod="openshift-authentication/oauth-openshift-558db77b4-7zzh7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.656493 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crppb\" (UniqueName: \"kubernetes.io/projected/3a67fd92-1760-46af-a0b4-0a52c965c63e-kube-api-access-crppb\") pod \"console-operator-58897d9998-mwb28\" (UID: \"3a67fd92-1760-46af-a0b4-0a52c965c63e\") " pod="openshift-console-operator/console-operator-58897d9998-mwb28" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.656525 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9507632c-3232-44dc-a75f-1275a2f57145-serving-cert\") pod \"controller-manager-879f6c89f-gsfk7\" (UID: \"9507632c-3232-44dc-a75f-1275a2f57145\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gsfk7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.656556 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1844c05f-b2a6-4abc-b4db-89223a5e6d60-config\") pod \"authentication-operator-69f744f599-k5v9d\" (UID: \"1844c05f-b2a6-4abc-b4db-89223a5e6d60\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5v9d" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.656575 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/a96d295d-cf93-4a96-ac7d-cc85ed8221da-audit\") pod \"apiserver-76f77b778f-mmrpx\" (UID: \"a96d295d-cf93-4a96-ac7d-cc85ed8221da\") " pod="openshift-apiserver/apiserver-76f77b778f-mmrpx" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.656590 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a96d295d-cf93-4a96-ac7d-cc85ed8221da-trusted-ca-bundle\") pod \"apiserver-76f77b778f-mmrpx\" (UID: \"a96d295d-cf93-4a96-ac7d-cc85ed8221da\") " pod="openshift-apiserver/apiserver-76f77b778f-mmrpx" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.656697 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5z6q\" (UniqueName: \"kubernetes.io/projected/b6ef9de0-f249-438a-94c0-0a359bd88889-kube-api-access-t5z6q\") pod \"openshift-apiserver-operator-796bbdcf4f-skmr7\" (UID: \"b6ef9de0-f249-438a-94c0-0a359bd88889\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-skmr7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.656735 
4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c93e87aa-7ece-4e0d-9c17-e4bc0b5bf0a3-metrics-tls\") pod \"ingress-operator-5b745b69d9-lns87\" (UID: \"c93e87aa-7ece-4e0d-9c17-e4bc0b5bf0a3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lns87" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.656737 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-7zzh7\" (UID: \"f267ec6b-64da-4065-b3aa-2e66ac957118\") " pod="openshift-authentication/oauth-openshift-558db77b4-7zzh7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.656780 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4cd7913-ec11-4aa6-9351-23821b3cfbcd-config\") pod \"machine-approver-56656f9798-ndkq8\" (UID: \"e4cd7913-ec11-4aa6-9351-23821b3cfbcd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ndkq8" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.656804 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a96d295d-cf93-4a96-ac7d-cc85ed8221da-config\") pod \"apiserver-76f77b778f-mmrpx\" (UID: \"a96d295d-cf93-4a96-ac7d-cc85ed8221da\") " pod="openshift-apiserver/apiserver-76f77b778f-mmrpx" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.656855 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9507632c-3232-44dc-a75f-1275a2f57145-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-gsfk7\" (UID: \"9507632c-3232-44dc-a75f-1275a2f57145\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gsfk7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.656880 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a96d295d-cf93-4a96-ac7d-cc85ed8221da-etcd-client\") pod \"apiserver-76f77b778f-mmrpx\" (UID: \"a96d295d-cf93-4a96-ac7d-cc85ed8221da\") " pod="openshift-apiserver/apiserver-76f77b778f-mmrpx" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.656887 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a96d295d-cf93-4a96-ac7d-cc85ed8221da-etcd-serving-ca\") pod \"apiserver-76f77b778f-mmrpx\" (UID: \"a96d295d-cf93-4a96-ac7d-cc85ed8221da\") " pod="openshift-apiserver/apiserver-76f77b778f-mmrpx" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.656907 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-7zzh7\" (UID: \"f267ec6b-64da-4065-b3aa-2e66ac957118\") " pod="openshift-authentication/oauth-openshift-558db77b4-7zzh7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.657018 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cg8wj\" (UniqueName: \"kubernetes.io/projected/5289b0a6-6c1f-4fb4-972f-552899994896-kube-api-access-cg8wj\") pod \"cluster-samples-operator-665b6dd947-p6ktw\" (UID: 
\"5289b0a6-6c1f-4fb4-972f-552899994896\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p6ktw" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.657074 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5289b0a6-6c1f-4fb4-972f-552899994896-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-p6ktw\" (UID: \"5289b0a6-6c1f-4fb4-972f-552899994896\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p6ktw" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.657096 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-7zzh7\" (UID: \"f267ec6b-64da-4065-b3aa-2e66ac957118\") " pod="openshift-authentication/oauth-openshift-558db77b4-7zzh7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.657119 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9109383b-40f8-49d7-a601-1d048c4d8686-config\") pod \"machine-api-operator-5694c8668f-qqgzd\" (UID: \"9109383b-40f8-49d7-a601-1d048c4d8686\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qqgzd" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.657136 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a96d295d-cf93-4a96-ac7d-cc85ed8221da-serving-cert\") pod \"apiserver-76f77b778f-mmrpx\" (UID: \"a96d295d-cf93-4a96-ac7d-cc85ed8221da\") " pod="openshift-apiserver/apiserver-76f77b778f-mmrpx" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.657154 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6ef9de0-f249-438a-94c0-0a359bd88889-config\") pod \"openshift-apiserver-operator-796bbdcf4f-skmr7\" (UID: \"b6ef9de0-f249-438a-94c0-0a359bd88889\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-skmr7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.657174 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3a67fd92-1760-46af-a0b4-0a52c965c63e-trusted-ca\") pod \"console-operator-58897d9998-mwb28\" (UID: \"3a67fd92-1760-46af-a0b4-0a52c965c63e\") " pod="openshift-console-operator/console-operator-58897d9998-mwb28" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.657204 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c93e87aa-7ece-4e0d-9c17-e4bc0b5bf0a3-trusted-ca\") pod \"ingress-operator-5b745b69d9-lns87\" (UID: \"c93e87aa-7ece-4e0d-9c17-e4bc0b5bf0a3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lns87" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.657164 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1844c05f-b2a6-4abc-b4db-89223a5e6d60-service-ca-bundle\") pod \"authentication-operator-69f744f599-k5v9d\" (UID: \"1844c05f-b2a6-4abc-b4db-89223a5e6d60\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5v9d" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 
02:55:07.657229 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9507632c-3232-44dc-a75f-1275a2f57145-client-ca\") pod \"controller-manager-879f6c89f-gsfk7\" (UID: \"9507632c-3232-44dc-a75f-1275a2f57145\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gsfk7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.657253 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f267ec6b-64da-4065-b3aa-2e66ac957118-audit-dir\") pod \"oauth-openshift-558db77b4-7zzh7\" (UID: \"f267ec6b-64da-4065-b3aa-2e66ac957118\") " pod="openshift-authentication/oauth-openshift-558db77b4-7zzh7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.657288 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e4cd7913-ec11-4aa6-9351-23821b3cfbcd-auth-proxy-config\") pod \"machine-approver-56656f9798-ndkq8\" (UID: \"e4cd7913-ec11-4aa6-9351-23821b3cfbcd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ndkq8" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.657312 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1844c05f-b2a6-4abc-b4db-89223a5e6d60-serving-cert\") pod \"authentication-operator-69f744f599-k5v9d\" (UID: \"1844c05f-b2a6-4abc-b4db-89223a5e6d60\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5v9d" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.657330 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f267ec6b-64da-4065-b3aa-2e66ac957118-audit-policies\") pod \"oauth-openshift-558db77b4-7zzh7\" (UID: \"f267ec6b-64da-4065-b3aa-2e66ac957118\") " pod="openshift-authentication/oauth-openshift-558db77b4-7zzh7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.657348 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/a96d295d-cf93-4a96-ac7d-cc85ed8221da-image-import-ca\") pod \"apiserver-76f77b778f-mmrpx\" (UID: \"a96d295d-cf93-4a96-ac7d-cc85ed8221da\") " pod="openshift-apiserver/apiserver-76f77b778f-mmrpx" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.657366 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/9109383b-40f8-49d7-a601-1d048c4d8686-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-qqgzd\" (UID: \"9109383b-40f8-49d7-a601-1d048c4d8686\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qqgzd" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.657389 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-7zzh7\" (UID: \"f267ec6b-64da-4065-b3aa-2e66ac957118\") " pod="openshift-authentication/oauth-openshift-558db77b4-7zzh7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.657422 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/e4cd7913-ec11-4aa6-9351-23821b3cfbcd-machine-approver-tls\") pod \"machine-approver-56656f9798-ndkq8\" (UID: \"e4cd7913-ec11-4aa6-9351-23821b3cfbcd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ndkq8" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.657440 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a96d295d-cf93-4a96-ac7d-cc85ed8221da-encryption-config\") pod \"apiserver-76f77b778f-mmrpx\" (UID: \"a96d295d-cf93-4a96-ac7d-cc85ed8221da\") " pod="openshift-apiserver/apiserver-76f77b778f-mmrpx" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.657459 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a67fd92-1760-46af-a0b4-0a52c965c63e-config\") pod \"console-operator-58897d9998-mwb28\" (UID: \"3a67fd92-1760-46af-a0b4-0a52c965c63e\") " pod="openshift-console-operator/console-operator-58897d9998-mwb28" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.657493 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1844c05f-b2a6-4abc-b4db-89223a5e6d60-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-k5v9d\" (UID: \"1844c05f-b2a6-4abc-b4db-89223a5e6d60\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5v9d" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.657514 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sl6c\" (UniqueName: \"kubernetes.io/projected/e4cd7913-ec11-4aa6-9351-23821b3cfbcd-kube-api-access-2sl6c\") pod \"machine-approver-56656f9798-ndkq8\" (UID: \"e4cd7913-ec11-4aa6-9351-23821b3cfbcd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ndkq8" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.657515 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9507632c-3232-44dc-a75f-1275a2f57145-config\") pod \"controller-manager-879f6c89f-gsfk7\" (UID: \"9507632c-3232-44dc-a75f-1275a2f57145\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gsfk7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.657530 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-7zzh7\" (UID: \"f267ec6b-64da-4065-b3aa-2e66ac957118\") " pod="openshift-authentication/oauth-openshift-558db77b4-7zzh7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.657610 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9109383b-40f8-49d7-a601-1d048c4d8686-images\") pod \"machine-api-operator-5694c8668f-qqgzd\" (UID: \"9109383b-40f8-49d7-a601-1d048c4d8686\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qqgzd" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.657646 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6ef9de0-f249-438a-94c0-0a359bd88889-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-skmr7\" (UID: \"b6ef9de0-f249-438a-94c0-0a359bd88889\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-skmr7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.657674 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a67fd92-1760-46af-a0b4-0a52c965c63e-serving-cert\") pod \"console-operator-58897d9998-mwb28\" (UID: \"3a67fd92-1760-46af-a0b4-0a52c965c63e\") " pod="openshift-console-operator/console-operator-58897d9998-mwb28" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.657703 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxxl2\" (UniqueName: \"kubernetes.io/projected/f267ec6b-64da-4065-b3aa-2e66ac957118-kube-api-access-pxxl2\") pod \"oauth-openshift-558db77b4-7zzh7\" (UID: \"f267ec6b-64da-4065-b3aa-2e66ac957118\") " pod="openshift-authentication/oauth-openshift-558db77b4-7zzh7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.657731 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbj65\" (UniqueName: \"kubernetes.io/projected/9507632c-3232-44dc-a75f-1275a2f57145-kube-api-access-fbj65\") pod \"controller-manager-879f6c89f-gsfk7\" (UID: \"9507632c-3232-44dc-a75f-1275a2f57145\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gsfk7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.658058 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a96d295d-cf93-4a96-ac7d-cc85ed8221da-trusted-ca-bundle\") pod \"apiserver-76f77b778f-mmrpx\" (UID: \"a96d295d-cf93-4a96-ac7d-cc85ed8221da\") " pod="openshift-apiserver/apiserver-76f77b778f-mmrpx" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.658154 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a96d295d-cf93-4a96-ac7d-cc85ed8221da-audit-dir\") pod \"apiserver-76f77b778f-mmrpx\" (UID: \"a96d295d-cf93-4a96-ac7d-cc85ed8221da\") " pod="openshift-apiserver/apiserver-76f77b778f-mmrpx" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.658496 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f267ec6b-64da-4065-b3aa-2e66ac957118-audit-policies\") pod \"oauth-openshift-558db77b4-7zzh7\" (UID: \"f267ec6b-64da-4065-b3aa-2e66ac957118\") " pod="openshift-authentication/oauth-openshift-558db77b4-7zzh7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.659047 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9109383b-40f8-49d7-a601-1d048c4d8686-images\") pod \"machine-api-operator-5694c8668f-qqgzd\" (UID: \"9109383b-40f8-49d7-a601-1d048c4d8686\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qqgzd" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.659892 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6ef9de0-f249-438a-94c0-0a359bd88889-config\") pod \"openshift-apiserver-operator-796bbdcf4f-skmr7\" (UID: \"b6ef9de0-f249-438a-94c0-0a359bd88889\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-skmr7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.660287 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-wxkjb"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 
02:55:07.660382 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4vw74"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.660405 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-bfqj8"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.661137 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f267ec6b-64da-4065-b3aa-2e66ac957118-audit-dir\") pod \"oauth-openshift-558db77b4-7zzh7\" (UID: \"f267ec6b-64da-4065-b3aa-2e66ac957118\") " pod="openshift-authentication/oauth-openshift-558db77b4-7zzh7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.662031 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4cd7913-ec11-4aa6-9351-23821b3cfbcd-config\") pod \"machine-approver-56656f9798-ndkq8\" (UID: \"e4cd7913-ec11-4aa6-9351-23821b3cfbcd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ndkq8" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.662367 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/a96d295d-cf93-4a96-ac7d-cc85ed8221da-image-import-ca\") pod \"apiserver-76f77b778f-mmrpx\" (UID: \"a96d295d-cf93-4a96-ac7d-cc85ed8221da\") " pod="openshift-apiserver/apiserver-76f77b778f-mmrpx" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.662603 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a96d295d-cf93-4a96-ac7d-cc85ed8221da-serving-cert\") pod \"apiserver-76f77b778f-mmrpx\" (UID: \"a96d295d-cf93-4a96-ac7d-cc85ed8221da\") " pod="openshift-apiserver/apiserver-76f77b778f-mmrpx" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.662636 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9507632c-3232-44dc-a75f-1275a2f57145-client-ca\") pod \"controller-manager-879f6c89f-gsfk7\" (UID: \"9507632c-3232-44dc-a75f-1275a2f57145\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gsfk7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.662685 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e4cd7913-ec11-4aa6-9351-23821b3cfbcd-auth-proxy-config\") pod \"machine-approver-56656f9798-ndkq8\" (UID: \"e4cd7913-ec11-4aa6-9351-23821b3cfbcd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ndkq8" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.663084 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a96d295d-cf93-4a96-ac7d-cc85ed8221da-config\") pod \"apiserver-76f77b778f-mmrpx\" (UID: \"a96d295d-cf93-4a96-ac7d-cc85ed8221da\") " pod="openshift-apiserver/apiserver-76f77b778f-mmrpx" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.663134 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-7zzh7\" (UID: \"f267ec6b-64da-4065-b3aa-2e66ac957118\") " pod="openshift-authentication/oauth-openshift-558db77b4-7zzh7" Nov 22 
02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.664965 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-7zzh7\" (UID: \"f267ec6b-64da-4065-b3aa-2e66ac957118\") " pod="openshift-authentication/oauth-openshift-558db77b4-7zzh7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.665235 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6ef9de0-f249-438a-94c0-0a359bd88889-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-skmr7\" (UID: \"b6ef9de0-f249-438a-94c0-0a359bd88889\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-skmr7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.665434 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9109383b-40f8-49d7-a601-1d048c4d8686-config\") pod \"machine-api-operator-5694c8668f-qqgzd\" (UID: \"9109383b-40f8-49d7-a601-1d048c4d8686\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qqgzd" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.666151 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-7zzh7\" (UID: \"f267ec6b-64da-4065-b3aa-2e66ac957118\") " pod="openshift-authentication/oauth-openshift-558db77b4-7zzh7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.666396 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-7zzh7\" (UID: \"f267ec6b-64da-4065-b3aa-2e66ac957118\") " pod="openshift-authentication/oauth-openshift-558db77b4-7zzh7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.666433 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3a67fd92-1760-46af-a0b4-0a52c965c63e-trusted-ca\") pod \"console-operator-58897d9998-mwb28\" (UID: \"3a67fd92-1760-46af-a0b4-0a52c965c63e\") " pod="openshift-console-operator/console-operator-58897d9998-mwb28" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.666486 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1844c05f-b2a6-4abc-b4db-89223a5e6d60-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-k5v9d\" (UID: \"1844c05f-b2a6-4abc-b4db-89223a5e6d60\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5v9d" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.666514 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a96d295d-cf93-4a96-ac7d-cc85ed8221da-encryption-config\") pod \"apiserver-76f77b778f-mmrpx\" (UID: \"a96d295d-cf93-4a96-ac7d-cc85ed8221da\") " pod="openshift-apiserver/apiserver-76f77b778f-mmrpx" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.666964 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-7zzh7\" (UID: \"f267ec6b-64da-4065-b3aa-2e66ac957118\") " pod="openshift-authentication/oauth-openshift-558db77b4-7zzh7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.667246 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9507632c-3232-44dc-a75f-1275a2f57145-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-gsfk7\" (UID: \"9507632c-3232-44dc-a75f-1275a2f57145\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gsfk7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.667401 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5289b0a6-6c1f-4fb4-972f-552899994896-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-p6ktw\" (UID: \"5289b0a6-6c1f-4fb4-972f-552899994896\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p6ktw" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.667919 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a67fd92-1760-46af-a0b4-0a52c965c63e-config\") pod \"console-operator-58897d9998-mwb28\" (UID: \"3a67fd92-1760-46af-a0b4-0a52c965c63e\") " pod="openshift-console-operator/console-operator-58897d9998-mwb28" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.668001 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1844c05f-b2a6-4abc-b4db-89223a5e6d60-serving-cert\") pod \"authentication-operator-69f744f599-k5v9d\" (UID: \"1844c05f-b2a6-4abc-b4db-89223a5e6d60\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5v9d" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.656261 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-7zzh7\" (UID: \"f267ec6b-64da-4065-b3aa-2e66ac957118\") " pod="openshift-authentication/oauth-openshift-558db77b4-7zzh7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.656228 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a96d295d-cf93-4a96-ac7d-cc85ed8221da-node-pullsecrets\") pod \"apiserver-76f77b778f-mmrpx\" (UID: \"a96d295d-cf93-4a96-ac7d-cc85ed8221da\") " pod="openshift-apiserver/apiserver-76f77b778f-mmrpx" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.668150 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.668970 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1844c05f-b2a6-4abc-b4db-89223a5e6d60-config\") pod \"authentication-operator-69f744f599-k5v9d\" (UID: \"1844c05f-b2a6-4abc-b4db-89223a5e6d60\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5v9d" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.669128 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3a67fd92-1760-46af-a0b4-0a52c965c63e-serving-cert\") pod \"console-operator-58897d9998-mwb28\" (UID: \"3a67fd92-1760-46af-a0b4-0a52c965c63e\") " pod="openshift-console-operator/console-operator-58897d9998-mwb28" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.669570 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-7zzh7\" (UID: \"f267ec6b-64da-4065-b3aa-2e66ac957118\") " pod="openshift-authentication/oauth-openshift-558db77b4-7zzh7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.669977 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-7zzh7\" (UID: \"f267ec6b-64da-4065-b3aa-2e66ac957118\") " pod="openshift-authentication/oauth-openshift-558db77b4-7zzh7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.670177 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-7zzh7\" (UID: \"f267ec6b-64da-4065-b3aa-2e66ac957118\") " pod="openshift-authentication/oauth-openshift-558db77b4-7zzh7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.671976 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9507632c-3232-44dc-a75f-1275a2f57145-serving-cert\") pod \"controller-manager-879f6c89f-gsfk7\" (UID: \"9507632c-3232-44dc-a75f-1275a2f57145\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gsfk7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.672363 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/e4cd7913-ec11-4aa6-9351-23821b3cfbcd-machine-approver-tls\") pod \"machine-approver-56656f9798-ndkq8\" (UID: \"e4cd7913-ec11-4aa6-9351-23821b3cfbcd\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ndkq8" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.675058 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p6ktw"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.675512 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-7zzh7\" (UID: \"f267ec6b-64da-4065-b3aa-2e66ac957118\") " pod="openshift-authentication/oauth-openshift-558db77b4-7zzh7" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.675629 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a96d295d-cf93-4a96-ac7d-cc85ed8221da-etcd-client\") pod \"apiserver-76f77b778f-mmrpx\" (UID: \"a96d295d-cf93-4a96-ac7d-cc85ed8221da\") " pod="openshift-apiserver/apiserver-76f77b778f-mmrpx" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.677931 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/9109383b-40f8-49d7-a601-1d048c4d8686-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-qqgzd\" (UID: \"9109383b-40f8-49d7-a601-1d048c4d8686\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qqgzd" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.678383 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xlb7z"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.679869 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n89mb"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.681050 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-j2nbv"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.682278 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-s8lrv"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.682521 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.683317 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-s8lrv" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.684870 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4hrnn"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.686146 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-4kqxp"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.687805 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4kqxp" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.687857 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-n5wk8"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.689831 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-9jfnp"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.692165 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-nxqd9"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.693298 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cm895"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.694460 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vmg24"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.695798 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4kqxp"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.696979 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-6rz2q"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.698254 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396325-vl7tv"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.699592 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-gjqfl"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.701010 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-7bmdv"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.701778 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.701863 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-7bmdv" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.702061 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7bmdv"] Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.721855 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.742260 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.759017 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c93e87aa-7ece-4e0d-9c17-e4bc0b5bf0a3-metrics-tls\") pod \"ingress-operator-5b745b69d9-lns87\" (UID: \"c93e87aa-7ece-4e0d-9c17-e4bc0b5bf0a3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lns87" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.759177 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c93e87aa-7ece-4e0d-9c17-e4bc0b5bf0a3-trusted-ca\") pod \"ingress-operator-5b745b69d9-lns87\" (UID: \"c93e87aa-7ece-4e0d-9c17-e4bc0b5bf0a3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lns87" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.759313 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c93e87aa-7ece-4e0d-9c17-e4bc0b5bf0a3-bound-sa-token\") pod \"ingress-operator-5b745b69d9-lns87\" (UID: \"c93e87aa-7ece-4e0d-9c17-e4bc0b5bf0a3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lns87" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.759354 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9vct\" (UniqueName: \"kubernetes.io/projected/c93e87aa-7ece-4e0d-9c17-e4bc0b5bf0a3-kube-api-access-c9vct\") pod \"ingress-operator-5b745b69d9-lns87\" (UID: \"c93e87aa-7ece-4e0d-9c17-e4bc0b5bf0a3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lns87" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.761461 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c93e87aa-7ece-4e0d-9c17-e4bc0b5bf0a3-trusted-ca\") pod \"ingress-operator-5b745b69d9-lns87\" (UID: \"c93e87aa-7ece-4e0d-9c17-e4bc0b5bf0a3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lns87" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.762016 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c93e87aa-7ece-4e0d-9c17-e4bc0b5bf0a3-metrics-tls\") pod \"ingress-operator-5b745b69d9-lns87\" (UID: \"c93e87aa-7ece-4e0d-9c17-e4bc0b5bf0a3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lns87" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.766831 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.781993 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.802899 4922 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.822984 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.844158 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.864037 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.884256 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.904673 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.923601 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.944459 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.963654 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Nov 22 02:55:07 crc kubenswrapper[4922]: I1122 02:55:07.983501 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Nov 22 02:55:08 crc kubenswrapper[4922]: I1122 02:55:08.003136 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Nov 22 02:55:08 crc kubenswrapper[4922]: I1122 02:55:08.023808 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Nov 22 02:55:08 crc kubenswrapper[4922]: I1122 02:55:08.042872 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Nov 22 02:55:08 crc kubenswrapper[4922]: I1122 02:55:08.063450 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Nov 22 02:55:08 crc kubenswrapper[4922]: I1122 02:55:08.082393 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Nov 22 02:55:08 crc kubenswrapper[4922]: I1122 02:55:08.105935 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Nov 22 02:55:08 crc kubenswrapper[4922]: I1122 02:55:08.123508 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Nov 22 02:55:08 crc kubenswrapper[4922]: I1122 02:55:08.143585 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Nov 22 02:55:08 crc kubenswrapper[4922]: I1122 02:55:08.165208 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Nov 22 02:55:08 crc kubenswrapper[4922]: 
I1122 02:55:08.183137 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Nov 22 02:55:08 crc kubenswrapper[4922]: I1122 02:55:08.202662 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Nov 22 02:55:08 crc kubenswrapper[4922]: I1122 02:55:08.222381 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Nov 22 02:55:08 crc kubenswrapper[4922]: I1122 02:55:08.243221 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Nov 22 02:55:08 crc kubenswrapper[4922]: I1122 02:55:08.262607 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Nov 22 02:55:08 crc kubenswrapper[4922]: I1122 02:55:08.283057 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Nov 22 02:55:08 crc kubenswrapper[4922]: I1122 02:55:08.302662 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Nov 22 02:55:08 crc kubenswrapper[4922]: I1122 02:55:08.343024 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Nov 22 02:55:08 crc kubenswrapper[4922]: I1122 02:55:08.362865 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Nov 22 02:55:08 crc kubenswrapper[4922]: I1122 02:55:08.383572 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Nov 22 02:55:08 crc kubenswrapper[4922]: I1122 02:55:08.402521 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Nov 22 02:55:08 crc kubenswrapper[4922]: I1122 02:55:08.422769 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Nov 22 02:55:08 crc kubenswrapper[4922]: I1122 02:55:08.443280 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Nov 22 02:55:08 crc kubenswrapper[4922]: I1122 02:55:08.463324 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Nov 22 02:55:08 crc kubenswrapper[4922]: I1122 02:55:08.482568 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Nov 22 02:55:08 crc kubenswrapper[4922]: I1122 02:55:08.502613 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Nov 22 02:55:08 crc kubenswrapper[4922]: I1122 02:55:08.522098 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Nov 22 02:55:08 crc kubenswrapper[4922]: I1122 02:55:08.542664 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Nov 22 
02:55:08 crc kubenswrapper[4922]: I1122 02:55:08.562695 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Nov 22 02:55:08 crc kubenswrapper[4922]: I1122 02:55:08.581833 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Nov 22 02:55:08 crc kubenswrapper[4922]: I1122 02:55:08.600783 4922 request.go:700] Waited for 1.002375004s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress/secrets?fieldSelector=metadata.name%3Drouter-metrics-certs-default&limit=500&resourceVersion=0 Nov 22 02:55:08 crc kubenswrapper[4922]: I1122 02:55:08.602712 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Nov 22 02:55:08 crc kubenswrapper[4922]: I1122 02:55:08.622717 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Nov 22 02:55:08 crc kubenswrapper[4922]: I1122 02:55:08.642590 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Nov 22 02:55:08 crc kubenswrapper[4922]: I1122 02:55:08.663662 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Nov 22 02:55:08 crc kubenswrapper[4922]: I1122 02:55:08.683828 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Nov 22 02:55:08 crc kubenswrapper[4922]: I1122 02:55:08.703305 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Nov 22 02:55:08 crc kubenswrapper[4922]: I1122 02:55:08.722205 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Nov 22 02:55:08 crc kubenswrapper[4922]: I1122 02:55:08.742314 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Nov 22 02:55:08 crc kubenswrapper[4922]: I1122 02:55:08.763435 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Nov 22 02:55:08 crc kubenswrapper[4922]: I1122 02:55:08.782826 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Nov 22 02:55:08 crc kubenswrapper[4922]: I1122 02:55:08.803007 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Nov 22 02:55:08 crc kubenswrapper[4922]: I1122 02:55:08.824834 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Nov 22 02:55:08 crc kubenswrapper[4922]: I1122 02:55:08.844012 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Nov 22 02:55:08 crc kubenswrapper[4922]: I1122 02:55:08.877161 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Nov 22 02:55:08 crc kubenswrapper[4922]: I1122 02:55:08.883323 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Nov 22 02:55:08 crc kubenswrapper[4922]: I1122 
02:55:08.903464 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Nov 22 02:55:08 crc kubenswrapper[4922]: I1122 02:55:08.923768 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Nov 22 02:55:08 crc kubenswrapper[4922]: I1122 02:55:08.944076 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Nov 22 02:55:08 crc kubenswrapper[4922]: I1122 02:55:08.963747 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Nov 22 02:55:08 crc kubenswrapper[4922]: I1122 02:55:08.983487 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.003518 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.023175 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.043289 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.063297 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.082969 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.103532 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.123031 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.143249 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.163576 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.183013 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.203725 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.223396 4922 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.243759 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.263250 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Nov 22 02:55:09 crc 
kubenswrapper[4922]: I1122 02:55:09.282811 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.338330 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfzch\" (UniqueName: \"kubernetes.io/projected/a96d295d-cf93-4a96-ac7d-cc85ed8221da-kube-api-access-mfzch\") pod \"apiserver-76f77b778f-mmrpx\" (UID: \"a96d295d-cf93-4a96-ac7d-cc85ed8221da\") " pod="openshift-apiserver/apiserver-76f77b778f-mmrpx" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.368868 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlq74\" (UniqueName: \"kubernetes.io/projected/1844c05f-b2a6-4abc-b4db-89223a5e6d60-kube-api-access-zlq74\") pod \"authentication-operator-69f744f599-k5v9d\" (UID: \"1844c05f-b2a6-4abc-b4db-89223a5e6d60\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5v9d" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.380936 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v849m\" (UniqueName: \"kubernetes.io/projected/9109383b-40f8-49d7-a601-1d048c4d8686-kube-api-access-v849m\") pod \"machine-api-operator-5694c8668f-qqgzd\" (UID: \"9109383b-40f8-49d7-a601-1d048c4d8686\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qqgzd" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.403701 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbj65\" (UniqueName: \"kubernetes.io/projected/9507632c-3232-44dc-a75f-1275a2f57145-kube-api-access-fbj65\") pod \"controller-manager-879f6c89f-gsfk7\" (UID: \"9507632c-3232-44dc-a75f-1275a2f57145\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gsfk7" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.424759 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxxl2\" (UniqueName: \"kubernetes.io/projected/f267ec6b-64da-4065-b3aa-2e66ac957118-kube-api-access-pxxl2\") pod \"oauth-openshift-558db77b4-7zzh7\" (UID: \"f267ec6b-64da-4065-b3aa-2e66ac957118\") " pod="openshift-authentication/oauth-openshift-558db77b4-7zzh7" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.444555 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5z6q\" (UniqueName: \"kubernetes.io/projected/b6ef9de0-f249-438a-94c0-0a359bd88889-kube-api-access-t5z6q\") pod \"openshift-apiserver-operator-796bbdcf4f-skmr7\" (UID: \"b6ef9de0-f249-438a-94c0-0a359bd88889\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-skmr7" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.463383 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg8wj\" (UniqueName: \"kubernetes.io/projected/5289b0a6-6c1f-4fb4-972f-552899994896-kube-api-access-cg8wj\") pod \"cluster-samples-operator-665b6dd947-p6ktw\" (UID: \"5289b0a6-6c1f-4fb4-972f-552899994896\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p6ktw" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.477418 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sl6c\" (UniqueName: \"kubernetes.io/projected/e4cd7913-ec11-4aa6-9351-23821b3cfbcd-kube-api-access-2sl6c\") pod \"machine-approver-56656f9798-ndkq8\" (UID: \"e4cd7913-ec11-4aa6-9351-23821b3cfbcd\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ndkq8" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.497154 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crppb\" (UniqueName: \"kubernetes.io/projected/3a67fd92-1760-46af-a0b4-0a52c965c63e-kube-api-access-crppb\") pod \"console-operator-58897d9998-mwb28\" (UID: \"3a67fd92-1760-46af-a0b4-0a52c965c63e\") " pod="openshift-console-operator/console-operator-58897d9998-mwb28" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.502884 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.503908 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-mmrpx" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.517319 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-gsfk7" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.523819 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.543256 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.553348 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-qqgzd" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.563318 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.574632 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-7zzh7" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.583179 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.593636 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-k5v9d" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.601298 4922 request.go:700] Waited for 1.913137418s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.603666 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.606158 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ndkq8" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.624005 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.644029 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.646495 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p6ktw" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.661618 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-mwb28" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.663087 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.674908 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-skmr7" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.684483 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.727433 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c93e87aa-7ece-4e0d-9c17-e4bc0b5bf0a3-bound-sa-token\") pod \"ingress-operator-5b745b69d9-lns87\" (UID: \"c93e87aa-7ece-4e0d-9c17-e4bc0b5bf0a3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lns87" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.738369 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9vct\" (UniqueName: \"kubernetes.io/projected/c93e87aa-7ece-4e0d-9c17-e4bc0b5bf0a3-kube-api-access-c9vct\") pod \"ingress-operator-5b745b69d9-lns87\" (UID: \"c93e87aa-7ece-4e0d-9c17-e4bc0b5bf0a3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lns87" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.785428 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/14638a05-727a-441a-88f2-f9750aa17a39-registry-certificates\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.785468 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q4l8\" (UniqueName: \"kubernetes.io/projected/cf220517-af5c-4186-9439-585c793c64e3-kube-api-access-2q4l8\") pod \"openshift-controller-manager-operator-756b6f6bc6-d25rh\" (UID: \"cf220517-af5c-4186-9439-585c793c64e3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d25rh" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.785488 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bc859e82-4e51-4cef-844a-17ada349c77a-metrics-tls\") pod \"dns-operator-744455d44c-6rz2q\" (UID: 
\"bc859e82-4e51-4cef-844a-17ada349c77a\") " pod="openshift-dns-operator/dns-operator-744455d44c-6rz2q" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.785505 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9twp5\" (UniqueName: \"kubernetes.io/projected/c6aceb43-ebd5-4a06-b702-e8236bc7de2d-kube-api-access-9twp5\") pod \"kube-storage-version-migrator-operator-b67b599dd-vmg24\" (UID: \"c6aceb43-ebd5-4a06-b702-e8236bc7de2d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vmg24" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.785543 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf220517-af5c-4186-9439-585c793c64e3-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-d25rh\" (UID: \"cf220517-af5c-4186-9439-585c793c64e3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d25rh" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.785560 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/564a21fe-1256-4854-b5c1-3d407a4e9aaf-serving-cert\") pod \"openshift-config-operator-7777fb866f-56bkw\" (UID: \"564a21fe-1256-4854-b5c1-3d407a4e9aaf\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-56bkw" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.785729 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfrpx\" (UniqueName: \"kubernetes.io/projected/89932301-e045-432f-9b1e-d88d5c420fdf-kube-api-access-kfrpx\") pod \"etcd-operator-b45778765-j2nbv\" (UID: \"89932301-e045-432f-9b1e-d88d5c420fdf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-j2nbv" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.785863 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/14638a05-727a-441a-88f2-f9750aa17a39-trusted-ca\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.785887 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/151b3d80-db1e-4af2-aa89-b38d9cfe8bea-config\") pod \"route-controller-manager-6576b87f9c-ksllc\" (UID: \"151b3d80-db1e-4af2-aa89-b38d9cfe8bea\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ksllc" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.785920 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89932301-e045-432f-9b1e-d88d5c420fdf-serving-cert\") pod \"etcd-operator-b45778765-j2nbv\" (UID: \"89932301-e045-432f-9b1e-d88d5c420fdf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-j2nbv" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.786352 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/14638a05-727a-441a-88f2-f9750aa17a39-installation-pull-secrets\") pod 
\"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.786371 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-mmrpx"] Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.786419 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dc1629dd-242e-43e0-9aaa-8ed4e24cb8c7-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-7lpgh\" (UID: \"dc1629dd-242e-43e0-9aaa-8ed4e24cb8c7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7lpgh" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.786452 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e84c0c00-7fd0-44eb-b719-57fca6572497-encryption-config\") pod \"apiserver-7bbb656c7d-bfqj8\" (UID: \"e84c0c00-7fd0-44eb-b719-57fca6572497\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bfqj8" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.786476 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/136fdcc5-9b23-442a-85e0-96129d4aed8a-console-serving-cert\") pod \"console-f9d7485db-dj4jp\" (UID: \"136fdcc5-9b23-442a-85e0-96129d4aed8a\") " pod="openshift-console/console-f9d7485db-dj4jp" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.786507 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/151b3d80-db1e-4af2-aa89-b38d9cfe8bea-serving-cert\") pod \"route-controller-manager-6576b87f9c-ksllc\" (UID: \"151b3d80-db1e-4af2-aa89-b38d9cfe8bea\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ksllc" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.786525 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0bd09df5-3f39-4aec-8ff9-35050ec873bd-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5gzxz\" (UID: \"0bd09df5-3f39-4aec-8ff9-35050ec873bd\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5gzxz" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.786546 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89932301-e045-432f-9b1e-d88d5c420fdf-config\") pod \"etcd-operator-b45778765-j2nbv\" (UID: \"89932301-e045-432f-9b1e-d88d5c420fdf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-j2nbv" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.786568 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c6aceb43-ebd5-4a06-b702-e8236bc7de2d-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-vmg24\" (UID: \"c6aceb43-ebd5-4a06-b702-e8236bc7de2d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vmg24" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.786636 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gztw5\" (UniqueName: \"kubernetes.io/projected/0f5e24eb-19ec-4a6e-9b72-ded8f180b673-kube-api-access-gztw5\") pod \"downloads-7954f5f757-n88n6\" (UID: \"0f5e24eb-19ec-4a6e-9b72-ded8f180b673\") " pod="openshift-console/downloads-7954f5f757-n88n6" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.786653 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bd09df5-3f39-4aec-8ff9-35050ec873bd-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5gzxz\" (UID: \"0bd09df5-3f39-4aec-8ff9-35050ec873bd\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5gzxz" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.786674 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/89932301-e045-432f-9b1e-d88d5c420fdf-etcd-client\") pod \"etcd-operator-b45778765-j2nbv\" (UID: \"89932301-e045-432f-9b1e-d88d5c420fdf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-j2nbv" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.786711 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/151b3d80-db1e-4af2-aa89-b38d9cfe8bea-client-ca\") pod \"route-controller-manager-6576b87f9c-ksllc\" (UID: \"151b3d80-db1e-4af2-aa89-b38d9cfe8bea\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ksllc" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.786731 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf220517-af5c-4186-9439-585c793c64e3-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-d25rh\" (UID: \"cf220517-af5c-4186-9439-585c793c64e3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d25rh" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.786778 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e84c0c00-7fd0-44eb-b719-57fca6572497-serving-cert\") pod \"apiserver-7bbb656c7d-bfqj8\" (UID: \"e84c0c00-7fd0-44eb-b719-57fca6572497\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bfqj8" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.786797 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mqkh\" (UniqueName: \"kubernetes.io/projected/151b3d80-db1e-4af2-aa89-b38d9cfe8bea-kube-api-access-2mqkh\") pod \"route-controller-manager-6576b87f9c-ksllc\" (UID: \"151b3d80-db1e-4af2-aa89-b38d9cfe8bea\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ksllc" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.786815 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/136fdcc5-9b23-442a-85e0-96129d4aed8a-oauth-serving-cert\") pod \"console-f9d7485db-dj4jp\" (UID: \"136fdcc5-9b23-442a-85e0-96129d4aed8a\") " pod="openshift-console/console-f9d7485db-dj4jp" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.786834 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e84c0c00-7fd0-44eb-b719-57fca6572497-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-bfqj8\" (UID: \"e84c0c00-7fd0-44eb-b719-57fca6572497\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bfqj8" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.786889 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.786920 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e84c0c00-7fd0-44eb-b719-57fca6572497-audit-policies\") pod \"apiserver-7bbb656c7d-bfqj8\" (UID: \"e84c0c00-7fd0-44eb-b719-57fca6572497\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bfqj8" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.786939 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e84c0c00-7fd0-44eb-b719-57fca6572497-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-bfqj8\" (UID: \"e84c0c00-7fd0-44eb-b719-57fca6572497\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bfqj8" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.786977 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/771457a9-fee5-496f-88be-b6cea42cc92a-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-7fgw4\" (UID: \"771457a9-fee5-496f-88be-b6cea42cc92a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7fgw4" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.786994 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/136fdcc5-9b23-442a-85e0-96129d4aed8a-console-oauth-config\") pod \"console-f9d7485db-dj4jp\" (UID: \"136fdcc5-9b23-442a-85e0-96129d4aed8a\") " pod="openshift-console/console-f9d7485db-dj4jp" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.787012 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mvvp\" (UniqueName: \"kubernetes.io/projected/e5063ff0-adff-488b-9c11-595be817e952-kube-api-access-8mvvp\") pod \"migrator-59844c95c7-lvltf\" (UID: \"e5063ff0-adff-488b-9c11-595be817e952\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lvltf" Nov 22 02:55:09 crc kubenswrapper[4922]: E1122 02:55:09.787355 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:55:10.287340679 +0000 UTC m=+146.325862571 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5hhlh" (UID: "14638a05-727a-441a-88f2-f9750aa17a39") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.787800 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/14638a05-727a-441a-88f2-f9750aa17a39-ca-trust-extracted\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.787884 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dc1629dd-242e-43e0-9aaa-8ed4e24cb8c7-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-7lpgh\" (UID: \"dc1629dd-242e-43e0-9aaa-8ed4e24cb8c7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7lpgh" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.787910 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlw89\" (UniqueName: \"kubernetes.io/projected/564a21fe-1256-4854-b5c1-3d407a4e9aaf-kube-api-access-qlw89\") pod \"openshift-config-operator-7777fb866f-56bkw\" (UID: \"564a21fe-1256-4854-b5c1-3d407a4e9aaf\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-56bkw" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.787949 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/771457a9-fee5-496f-88be-b6cea42cc92a-config\") pod \"kube-apiserver-operator-766d6c64bb-7fgw4\" (UID: \"771457a9-fee5-496f-88be-b6cea42cc92a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7fgw4" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.787965 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/136fdcc5-9b23-442a-85e0-96129d4aed8a-service-ca\") pod \"console-f9d7485db-dj4jp\" (UID: \"136fdcc5-9b23-442a-85e0-96129d4aed8a\") " pod="openshift-console/console-f9d7485db-dj4jp" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.787994 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2r6j\" (UniqueName: \"kubernetes.io/projected/14638a05-727a-441a-88f2-f9750aa17a39-kube-api-access-x2r6j\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.788129 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlcnm\" (UniqueName: \"kubernetes.io/projected/e84c0c00-7fd0-44eb-b719-57fca6572497-kube-api-access-qlcnm\") pod \"apiserver-7bbb656c7d-bfqj8\" (UID: \"e84c0c00-7fd0-44eb-b719-57fca6572497\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bfqj8" Nov 22 02:55:09 crc 
kubenswrapper[4922]: I1122 02:55:09.791584 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/89932301-e045-432f-9b1e-d88d5c420fdf-etcd-ca\") pod \"etcd-operator-b45778765-j2nbv\" (UID: \"89932301-e045-432f-9b1e-d88d5c420fdf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-j2nbv" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.791627 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvw7h\" (UniqueName: \"kubernetes.io/projected/bc859e82-4e51-4cef-844a-17ada349c77a-kube-api-access-jvw7h\") pod \"dns-operator-744455d44c-6rz2q\" (UID: \"bc859e82-4e51-4cef-844a-17ada349c77a\") " pod="openshift-dns-operator/dns-operator-744455d44c-6rz2q" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.791651 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6aceb43-ebd5-4a06-b702-e8236bc7de2d-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-vmg24\" (UID: \"c6aceb43-ebd5-4a06-b702-e8236bc7de2d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vmg24" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.792173 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e84c0c00-7fd0-44eb-b719-57fca6572497-etcd-client\") pod \"apiserver-7bbb656c7d-bfqj8\" (UID: \"e84c0c00-7fd0-44eb-b719-57fca6572497\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bfqj8" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.792527 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/dc1629dd-242e-43e0-9aaa-8ed4e24cb8c7-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-7lpgh\" (UID: \"dc1629dd-242e-43e0-9aaa-8ed4e24cb8c7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7lpgh" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.792558 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e84c0c00-7fd0-44eb-b719-57fca6572497-audit-dir\") pod \"apiserver-7bbb656c7d-bfqj8\" (UID: \"e84c0c00-7fd0-44eb-b719-57fca6572497\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bfqj8" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.792603 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14638a05-727a-441a-88f2-f9750aa17a39-registry-tls\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.792643 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bd09df5-3f39-4aec-8ff9-35050ec873bd-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5gzxz\" (UID: \"0bd09df5-3f39-4aec-8ff9-35050ec873bd\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5gzxz" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 
02:55:09.792668 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/89932301-e045-432f-9b1e-d88d5c420fdf-etcd-service-ca\") pod \"etcd-operator-b45778765-j2nbv\" (UID: \"89932301-e045-432f-9b1e-d88d5c420fdf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-j2nbv" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.792685 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/564a21fe-1256-4854-b5c1-3d407a4e9aaf-available-featuregates\") pod \"openshift-config-operator-7777fb866f-56bkw\" (UID: \"564a21fe-1256-4854-b5c1-3d407a4e9aaf\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-56bkw" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.792735 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/136fdcc5-9b23-442a-85e0-96129d4aed8a-console-config\") pod \"console-f9d7485db-dj4jp\" (UID: \"136fdcc5-9b23-442a-85e0-96129d4aed8a\") " pod="openshift-console/console-f9d7485db-dj4jp" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.792760 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/14638a05-727a-441a-88f2-f9750aa17a39-bound-sa-token\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.792826 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/136fdcc5-9b23-442a-85e0-96129d4aed8a-trusted-ca-bundle\") pod \"console-f9d7485db-dj4jp\" (UID: \"136fdcc5-9b23-442a-85e0-96129d4aed8a\") " pod="openshift-console/console-f9d7485db-dj4jp" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.792869 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tctps\" (UniqueName: \"kubernetes.io/projected/dc1629dd-242e-43e0-9aaa-8ed4e24cb8c7-kube-api-access-tctps\") pod \"cluster-image-registry-operator-dc59b4c8b-7lpgh\" (UID: \"dc1629dd-242e-43e0-9aaa-8ed4e24cb8c7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7lpgh" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.792887 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/771457a9-fee5-496f-88be-b6cea42cc92a-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-7fgw4\" (UID: \"771457a9-fee5-496f-88be-b6cea42cc92a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7fgw4" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.792906 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkvk9\" (UniqueName: \"kubernetes.io/projected/136fdcc5-9b23-442a-85e0-96129d4aed8a-kube-api-access-jkvk9\") pod \"console-f9d7485db-dj4jp\" (UID: \"136fdcc5-9b23-442a-85e0-96129d4aed8a\") " pod="openshift-console/console-f9d7485db-dj4jp" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.805757 4922 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lns87" Nov 22 02:55:09 crc kubenswrapper[4922]: W1122 02:55:09.813354 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda96d295d_cf93_4a96_ac7d_cc85ed8221da.slice/crio-d392b0ea25754ef9861b8fde34c91d953c80b7ad0ea4be3f037b012696cd24bb WatchSource:0}: Error finding container d392b0ea25754ef9861b8fde34c91d953c80b7ad0ea4be3f037b012696cd24bb: Status 404 returned error can't find the container with id d392b0ea25754ef9861b8fde34c91d953c80b7ad0ea4be3f037b012696cd24bb Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.894539 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.895053 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dc1629dd-242e-43e0-9aaa-8ed4e24cb8c7-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-7lpgh\" (UID: \"dc1629dd-242e-43e0-9aaa-8ed4e24cb8c7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7lpgh" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.895121 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4zkk\" (UniqueName: \"kubernetes.io/projected/870b8608-cc44-4883-bc2f-a889dddf467e-kube-api-access-s4zkk\") pod \"service-ca-operator-777779d784-nxqd9\" (UID: \"870b8608-cc44-4883-bc2f-a889dddf467e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nxqd9" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.895150 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/2c02715b-5f00-464a-85e3-4df3043304d6-stats-auth\") pod \"router-default-5444994796-jb5md\" (UID: \"2c02715b-5f00-464a-85e3-4df3043304d6\") " pod="openshift-ingress/router-default-5444994796-jb5md" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.895203 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd5c7d8c-ecfb-40af-990a-7bb86b045533-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-86q9x\" (UID: \"fd5c7d8c-ecfb-40af-990a-7bb86b045533\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-86q9x" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.895230 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89932301-e045-432f-9b1e-d88d5c420fdf-config\") pod \"etcd-operator-b45778765-j2nbv\" (UID: \"89932301-e045-432f-9b1e-d88d5c420fdf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-j2nbv" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.895278 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmfbv\" (UniqueName: \"kubernetes.io/projected/70f825ae-ce20-4eb1-a7dc-1468a552f2d0-kube-api-access-wmfbv\") pod \"machine-config-operator-74547568cd-9jfnp\" (UID: 
\"70f825ae-ce20-4eb1-a7dc-1468a552f2d0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9jfnp" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.895307 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gztw5\" (UniqueName: \"kubernetes.io/projected/0f5e24eb-19ec-4a6e-9b72-ded8f180b673-kube-api-access-gztw5\") pod \"downloads-7954f5f757-n88n6\" (UID: \"0f5e24eb-19ec-4a6e-9b72-ded8f180b673\") " pod="openshift-console/downloads-7954f5f757-n88n6" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.895353 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c6aceb43-ebd5-4a06-b702-e8236bc7de2d-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-vmg24\" (UID: \"c6aceb43-ebd5-4a06-b702-e8236bc7de2d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vmg24" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.895379 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bd09df5-3f39-4aec-8ff9-35050ec873bd-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5gzxz\" (UID: \"0bd09df5-3f39-4aec-8ff9-35050ec873bd\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5gzxz" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.895448 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/1a9735e4-9fac-4075-afce-68bb38057c12-certs\") pod \"machine-config-server-s8lrv\" (UID: \"1a9735e4-9fac-4075-afce-68bb38057c12\") " pod="openshift-machine-config-operator/machine-config-server-s8lrv" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.895475 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/151b3d80-db1e-4af2-aa89-b38d9cfe8bea-client-ca\") pod \"route-controller-manager-6576b87f9c-ksllc\" (UID: \"151b3d80-db1e-4af2-aa89-b38d9cfe8bea\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ksllc" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.895526 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/136fdcc5-9b23-442a-85e0-96129d4aed8a-oauth-serving-cert\") pod \"console-f9d7485db-dj4jp\" (UID: \"136fdcc5-9b23-442a-85e0-96129d4aed8a\") " pod="openshift-console/console-f9d7485db-dj4jp" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.895551 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a907e262-3b15-4019-a715-c45b0e12ac27-registration-dir\") pod \"csi-hostpathplugin-wxkjb\" (UID: \"a907e262-3b15-4019-a715-c45b0e12ac27\") " pod="hostpath-provisioner/csi-hostpathplugin-wxkjb" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.895599 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e84c0c00-7fd0-44eb-b719-57fca6572497-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-bfqj8\" (UID: \"e84c0c00-7fd0-44eb-b719-57fca6572497\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bfqj8" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 
02:55:09.895627 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mvvp\" (UniqueName: \"kubernetes.io/projected/e5063ff0-adff-488b-9c11-595be817e952-kube-api-access-8mvvp\") pod \"migrator-59844c95c7-lvltf\" (UID: \"e5063ff0-adff-488b-9c11-595be817e952\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lvltf" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.895681 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxt6z\" (UniqueName: \"kubernetes.io/projected/1a9735e4-9fac-4075-afce-68bb38057c12-kube-api-access-pxt6z\") pod \"machine-config-server-s8lrv\" (UID: \"1a9735e4-9fac-4075-afce-68bb38057c12\") " pod="openshift-machine-config-operator/machine-config-server-s8lrv" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.895710 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bbpq\" (UniqueName: \"kubernetes.io/projected/156920b9-f91f-4053-be05-3be9c55f09b1-kube-api-access-2bbpq\") pod \"marketplace-operator-79b997595-cm895\" (UID: \"156920b9-f91f-4053-be05-3be9c55f09b1\") " pod="openshift-marketplace/marketplace-operator-79b997595-cm895" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.895766 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/14638a05-727a-441a-88f2-f9750aa17a39-ca-trust-extracted\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.895792 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/771457a9-fee5-496f-88be-b6cea42cc92a-config\") pod \"kube-apiserver-operator-766d6c64bb-7fgw4\" (UID: \"771457a9-fee5-496f-88be-b6cea42cc92a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7fgw4" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.895835 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/136fdcc5-9b23-442a-85e0-96129d4aed8a-service-ca\") pod \"console-f9d7485db-dj4jp\" (UID: \"136fdcc5-9b23-442a-85e0-96129d4aed8a\") " pod="openshift-console/console-f9d7485db-dj4jp" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.895876 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/513c0c01-5777-4fb2-b36d-d35251f0c33e-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-hjlwb\" (UID: \"513c0c01-5777-4fb2-b36d-d35251f0c33e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hjlwb" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.895923 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2r6j\" (UniqueName: \"kubernetes.io/projected/14638a05-727a-441a-88f2-f9750aa17a39-kube-api-access-x2r6j\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.895947 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/70f825ae-ce20-4eb1-a7dc-1468a552f2d0-proxy-tls\") pod \"machine-config-operator-74547568cd-9jfnp\" (UID: \"70f825ae-ce20-4eb1-a7dc-1468a552f2d0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9jfnp" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.895968 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwrlx\" (UniqueName: \"kubernetes.io/projected/936d6833-0d47-4fdd-957a-b85f6dcd186c-kube-api-access-kwrlx\") pod \"packageserver-d55dfcdfc-4hrnn\" (UID: \"936d6833-0d47-4fdd-957a-b85f6dcd186c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4hrnn" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.896015 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/25a50f57-bc62-4fc0-8410-fc80535adb55-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-n5wk8\" (UID: \"25a50f57-bc62-4fc0-8410-fc80535adb55\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n5wk8" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.896037 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c02715b-5f00-464a-85e3-4df3043304d6-metrics-certs\") pod \"router-default-5444994796-jb5md\" (UID: \"2c02715b-5f00-464a-85e3-4df3043304d6\") " pod="openshift-ingress/router-default-5444994796-jb5md" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.896083 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/936d6833-0d47-4fdd-957a-b85f6dcd186c-apiservice-cert\") pod \"packageserver-d55dfcdfc-4hrnn\" (UID: \"936d6833-0d47-4fdd-957a-b85f6dcd186c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4hrnn" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.896109 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g44sg\" (UniqueName: \"kubernetes.io/projected/a907e262-3b15-4019-a715-c45b0e12ac27-kube-api-access-g44sg\") pod \"csi-hostpathplugin-wxkjb\" (UID: \"a907e262-3b15-4019-a715-c45b0e12ac27\") " pod="hostpath-provisioner/csi-hostpathplugin-wxkjb" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.896156 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvw7h\" (UniqueName: \"kubernetes.io/projected/bc859e82-4e51-4cef-844a-17ada349c77a-kube-api-access-jvw7h\") pod \"dns-operator-744455d44c-6rz2q\" (UID: \"bc859e82-4e51-4cef-844a-17ada349c77a\") " pod="openshift-dns-operator/dns-operator-744455d44c-6rz2q" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.896184 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/1a9735e4-9fac-4075-afce-68bb38057c12-node-bootstrap-token\") pod \"machine-config-server-s8lrv\" (UID: \"1a9735e4-9fac-4075-afce-68bb38057c12\") " pod="openshift-machine-config-operator/machine-config-server-s8lrv" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.896207 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5gxb\" (UniqueName: 
\"kubernetes.io/projected/abb64422-1cc8-4858-a3fa-ca843be8687c-kube-api-access-d5gxb\") pod \"ingress-canary-4kqxp\" (UID: \"abb64422-1cc8-4858-a3fa-ca843be8687c\") " pod="openshift-ingress-canary/ingress-canary-4kqxp" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.896254 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/936d6833-0d47-4fdd-957a-b85f6dcd186c-webhook-cert\") pod \"packageserver-d55dfcdfc-4hrnn\" (UID: \"936d6833-0d47-4fdd-957a-b85f6dcd186c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4hrnn" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.896281 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e84c0c00-7fd0-44eb-b719-57fca6572497-audit-dir\") pod \"apiserver-7bbb656c7d-bfqj8\" (UID: \"e84c0c00-7fd0-44eb-b719-57fca6572497\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bfqj8" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.896331 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/2c02715b-5f00-464a-85e3-4df3043304d6-default-certificate\") pod \"router-default-5444994796-jb5md\" (UID: \"2c02715b-5f00-464a-85e3-4df3043304d6\") " pod="openshift-ingress/router-default-5444994796-jb5md" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.896357 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhvx2\" (UniqueName: \"kubernetes.io/projected/4238f8cf-8e49-464b-9b6b-3f93b015747b-kube-api-access-xhvx2\") pod \"dns-default-7bmdv\" (UID: \"4238f8cf-8e49-464b-9b6b-3f93b015747b\") " pod="openshift-dns/dns-default-7bmdv" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.896407 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/abb64422-1cc8-4858-a3fa-ca843be8687c-cert\") pod \"ingress-canary-4kqxp\" (UID: \"abb64422-1cc8-4858-a3fa-ca843be8687c\") " pod="openshift-ingress-canary/ingress-canary-4kqxp" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.896432 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c9739869-8d63-4aaf-8f03-1605eed08ceb-srv-cert\") pod \"olm-operator-6b444d44fb-n89mb\" (UID: \"c9739869-8d63-4aaf-8f03-1605eed08ceb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n89mb" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.896450 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrpk7\" (UniqueName: \"kubernetes.io/projected/25a50f57-bc62-4fc0-8410-fc80535adb55-kube-api-access-jrpk7\") pod \"machine-config-controller-84d6567774-n5wk8\" (UID: \"25a50f57-bc62-4fc0-8410-fc80535adb55\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n5wk8" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.896470 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/156920b9-f91f-4053-be05-3be9c55f09b1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-cm895\" (UID: \"156920b9-f91f-4053-be05-3be9c55f09b1\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-cm895" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.896521 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bd09df5-3f39-4aec-8ff9-35050ec873bd-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5gzxz\" (UID: \"0bd09df5-3f39-4aec-8ff9-35050ec873bd\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5gzxz" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.896548 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfrnc\" (UniqueName: \"kubernetes.io/projected/62cd80a8-6faf-48b7-bf44-3b181afd66c6-kube-api-access-gfrnc\") pod \"control-plane-machine-set-operator-78cbb6b69f-xlb7z\" (UID: \"62cd80a8-6faf-48b7-bf44-3b181afd66c6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xlb7z" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.896571 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mvx2\" (UniqueName: \"kubernetes.io/projected/a5e7f9ae-8050-4602-b4f3-53a73ec0b60f-kube-api-access-2mvx2\") pod \"collect-profiles-29396325-vl7tv\" (UID: \"a5e7f9ae-8050-4602-b4f3-53a73ec0b60f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396325-vl7tv" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.896595 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/564a21fe-1256-4854-b5c1-3d407a4e9aaf-available-featuregates\") pod \"openshift-config-operator-7777fb866f-56bkw\" (UID: \"564a21fe-1256-4854-b5c1-3d407a4e9aaf\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-56bkw" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.896617 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/136fdcc5-9b23-442a-85e0-96129d4aed8a-console-config\") pod \"console-f9d7485db-dj4jp\" (UID: \"136fdcc5-9b23-442a-85e0-96129d4aed8a\") " pod="openshift-console/console-f9d7485db-dj4jp" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.896670 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkvk9\" (UniqueName: \"kubernetes.io/projected/136fdcc5-9b23-442a-85e0-96129d4aed8a-kube-api-access-jkvk9\") pod \"console-f9d7485db-dj4jp\" (UID: \"136fdcc5-9b23-442a-85e0-96129d4aed8a\") " pod="openshift-console/console-f9d7485db-dj4jp" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.896696 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/859cd128-2a8a-48e3-b3af-eb0147864b94-srv-cert\") pod \"catalog-operator-68c6474976-9s7kz\" (UID: \"859cd128-2a8a-48e3-b3af-eb0147864b94\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s7kz" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.896901 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a5e7f9ae-8050-4602-b4f3-53a73ec0b60f-secret-volume\") pod \"collect-profiles-29396325-vl7tv\" (UID: \"a5e7f9ae-8050-4602-b4f3-53a73ec0b60f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396325-vl7tv" Nov 22 
02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.896933 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/156920b9-f91f-4053-be05-3be9c55f09b1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-cm895\" (UID: \"156920b9-f91f-4053-be05-3be9c55f09b1\") " pod="openshift-marketplace/marketplace-operator-79b997595-cm895" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.896963 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tctps\" (UniqueName: \"kubernetes.io/projected/dc1629dd-242e-43e0-9aaa-8ed4e24cb8c7-kube-api-access-tctps\") pod \"cluster-image-registry-operator-dc59b4c8b-7lpgh\" (UID: \"dc1629dd-242e-43e0-9aaa-8ed4e24cb8c7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7lpgh" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.896988 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/771457a9-fee5-496f-88be-b6cea42cc92a-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-7fgw4\" (UID: \"771457a9-fee5-496f-88be-b6cea42cc92a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7fgw4" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.897014 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd5c7d8c-ecfb-40af-990a-7bb86b045533-config\") pod \"kube-controller-manager-operator-78b949d7b-86q9x\" (UID: \"fd5c7d8c-ecfb-40af-990a-7bb86b045533\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-86q9x" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.897042 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a5e7f9ae-8050-4602-b4f3-53a73ec0b60f-config-volume\") pod \"collect-profiles-29396325-vl7tv\" (UID: \"a5e7f9ae-8050-4602-b4f3-53a73ec0b60f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396325-vl7tv" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.897071 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9twp5\" (UniqueName: \"kubernetes.io/projected/c6aceb43-ebd5-4a06-b702-e8236bc7de2d-kube-api-access-9twp5\") pod \"kube-storage-version-migrator-operator-b67b599dd-vmg24\" (UID: \"c6aceb43-ebd5-4a06-b702-e8236bc7de2d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vmg24" Nov 22 02:55:09 crc kubenswrapper[4922]: E1122 02:55:09.897265 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:55:10.397235011 +0000 UTC m=+146.435756903 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.897947 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89932301-e045-432f-9b1e-d88d5c420fdf-config\") pod \"etcd-operator-b45778765-j2nbv\" (UID: \"89932301-e045-432f-9b1e-d88d5c420fdf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-j2nbv" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.898002 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/564a21fe-1256-4854-b5c1-3d407a4e9aaf-available-featuregates\") pod \"openshift-config-operator-7777fb866f-56bkw\" (UID: \"564a21fe-1256-4854-b5c1-3d407a4e9aaf\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-56bkw" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.898358 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e84c0c00-7fd0-44eb-b719-57fca6572497-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-bfqj8\" (UID: \"e84c0c00-7fd0-44eb-b719-57fca6572497\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bfqj8" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.898464 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/14638a05-727a-441a-88f2-f9750aa17a39-ca-trust-extracted\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.898993 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/859cd128-2a8a-48e3-b3af-eb0147864b94-profile-collector-cert\") pod \"catalog-operator-68c6474976-9s7kz\" (UID: \"859cd128-2a8a-48e3-b3af-eb0147864b94\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s7kz" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.899055 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bc859e82-4e51-4cef-844a-17ada349c77a-metrics-tls\") pod \"dns-operator-744455d44c-6rz2q\" (UID: \"bc859e82-4e51-4cef-844a-17ada349c77a\") " pod="openshift-dns-operator/dns-operator-744455d44c-6rz2q" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.899140 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2dc01606-1661-41db-9a31-949d68a1548e-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-4vw74\" (UID: \"2dc01606-1661-41db-9a31-949d68a1548e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4vw74" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.899178 4922 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf220517-af5c-4186-9439-585c793c64e3-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-d25rh\" (UID: \"cf220517-af5c-4186-9439-585c793c64e3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d25rh" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.899204 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/771457a9-fee5-496f-88be-b6cea42cc92a-config\") pod \"kube-apiserver-operator-766d6c64bb-7fgw4\" (UID: \"771457a9-fee5-496f-88be-b6cea42cc92a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7fgw4" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.899213 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/564a21fe-1256-4854-b5c1-3d407a4e9aaf-serving-cert\") pod \"openshift-config-operator-7777fb866f-56bkw\" (UID: \"564a21fe-1256-4854-b5c1-3d407a4e9aaf\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-56bkw" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.899286 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlrzb\" (UniqueName: \"kubernetes.io/projected/ca284dd6-f710-44d3-aa30-a493af7ff84f-kube-api-access-wlrzb\") pod \"service-ca-9c57cc56f-gjqfl\" (UID: \"ca284dd6-f710-44d3-aa30-a493af7ff84f\") " pod="openshift-service-ca/service-ca-9c57cc56f-gjqfl" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.899320 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a907e262-3b15-4019-a715-c45b0e12ac27-socket-dir\") pod \"csi-hostpathplugin-wxkjb\" (UID: \"a907e262-3b15-4019-a715-c45b0e12ac27\") " pod="hostpath-provisioner/csi-hostpathplugin-wxkjb" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.899344 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x76qk\" (UniqueName: \"kubernetes.io/projected/c9739869-8d63-4aaf-8f03-1605eed08ceb-kube-api-access-x76qk\") pod \"olm-operator-6b444d44fb-n89mb\" (UID: \"c9739869-8d63-4aaf-8f03-1605eed08ceb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n89mb" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.899378 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/151b3d80-db1e-4af2-aa89-b38d9cfe8bea-config\") pod \"route-controller-manager-6576b87f9c-ksllc\" (UID: \"151b3d80-db1e-4af2-aa89-b38d9cfe8bea\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ksllc" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.899402 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c02715b-5f00-464a-85e3-4df3043304d6-service-ca-bundle\") pod \"router-default-5444994796-jb5md\" (UID: \"2c02715b-5f00-464a-85e3-4df3043304d6\") " pod="openshift-ingress/router-default-5444994796-jb5md" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.899405 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c6aceb43-ebd5-4a06-b702-e8236bc7de2d-serving-cert\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-vmg24\" (UID: \"c6aceb43-ebd5-4a06-b702-e8236bc7de2d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vmg24" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.899422 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/a907e262-3b15-4019-a715-c45b0e12ac27-mountpoint-dir\") pod \"csi-hostpathplugin-wxkjb\" (UID: \"a907e262-3b15-4019-a715-c45b0e12ac27\") " pod="hostpath-provisioner/csi-hostpathplugin-wxkjb" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.899443 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/870b8608-cc44-4883-bc2f-a889dddf467e-serving-cert\") pod \"service-ca-operator-777779d784-nxqd9\" (UID: \"870b8608-cc44-4883-bc2f-a889dddf467e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nxqd9" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.899482 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/14638a05-727a-441a-88f2-f9750aa17a39-installation-pull-secrets\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.899506 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4238f8cf-8e49-464b-9b6b-3f93b015747b-metrics-tls\") pod \"dns-default-7bmdv\" (UID: \"4238f8cf-8e49-464b-9b6b-3f93b015747b\") " pod="openshift-dns/dns-default-7bmdv" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.899530 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e84c0c00-7fd0-44eb-b719-57fca6572497-encryption-config\") pod \"apiserver-7bbb656c7d-bfqj8\" (UID: \"e84c0c00-7fd0-44eb-b719-57fca6572497\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bfqj8" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.899553 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/136fdcc5-9b23-442a-85e0-96129d4aed8a-console-serving-cert\") pod \"console-f9d7485db-dj4jp\" (UID: \"136fdcc5-9b23-442a-85e0-96129d4aed8a\") " pod="openshift-console/console-f9d7485db-dj4jp" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.899575 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0bd09df5-3f39-4aec-8ff9-35050ec873bd-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5gzxz\" (UID: \"0bd09df5-3f39-4aec-8ff9-35050ec873bd\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5gzxz" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.899604 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/151b3d80-db1e-4af2-aa89-b38d9cfe8bea-serving-cert\") pod \"route-controller-manager-6576b87f9c-ksllc\" (UID: \"151b3d80-db1e-4af2-aa89-b38d9cfe8bea\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ksllc" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.899624 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/89932301-e045-432f-9b1e-d88d5c420fdf-etcd-client\") pod \"etcd-operator-b45778765-j2nbv\" (UID: \"89932301-e045-432f-9b1e-d88d5c420fdf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-j2nbv" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.899645 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/70f825ae-ce20-4eb1-a7dc-1468a552f2d0-images\") pod \"machine-config-operator-74547568cd-9jfnp\" (UID: \"70f825ae-ce20-4eb1-a7dc-1468a552f2d0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9jfnp" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.899704 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e84c0c00-7fd0-44eb-b719-57fca6572497-serving-cert\") pod \"apiserver-7bbb656c7d-bfqj8\" (UID: \"e84c0c00-7fd0-44eb-b719-57fca6572497\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bfqj8" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.899727 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mqkh\" (UniqueName: \"kubernetes.io/projected/151b3d80-db1e-4af2-aa89-b38d9cfe8bea-kube-api-access-2mqkh\") pod \"route-controller-manager-6576b87f9c-ksllc\" (UID: \"151b3d80-db1e-4af2-aa89-b38d9cfe8bea\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ksllc" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.899735 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/136fdcc5-9b23-442a-85e0-96129d4aed8a-service-ca\") pod \"console-f9d7485db-dj4jp\" (UID: \"136fdcc5-9b23-442a-85e0-96129d4aed8a\") " pod="openshift-console/console-f9d7485db-dj4jp" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.899749 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/936d6833-0d47-4fdd-957a-b85f6dcd186c-tmpfs\") pod \"packageserver-d55dfcdfc-4hrnn\" (UID: \"936d6833-0d47-4fdd-957a-b85f6dcd186c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4hrnn" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.899774 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/25a50f57-bc62-4fc0-8410-fc80535adb55-proxy-tls\") pod \"machine-config-controller-84d6567774-n5wk8\" (UID: \"25a50f57-bc62-4fc0-8410-fc80535adb55\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n5wk8" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.899824 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf220517-af5c-4186-9439-585c793c64e3-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-d25rh\" (UID: \"cf220517-af5c-4186-9439-585c793c64e3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d25rh" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.899869 4922 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e84c0c00-7fd0-44eb-b719-57fca6572497-audit-policies\") pod \"apiserver-7bbb656c7d-bfqj8\" (UID: \"e84c0c00-7fd0-44eb-b719-57fca6572497\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bfqj8" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.899893 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e84c0c00-7fd0-44eb-b719-57fca6572497-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-bfqj8\" (UID: \"e84c0c00-7fd0-44eb-b719-57fca6572497\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bfqj8" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.899914 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/771457a9-fee5-496f-88be-b6cea42cc92a-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-7fgw4\" (UID: \"771457a9-fee5-496f-88be-b6cea42cc92a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7fgw4" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.899936 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/136fdcc5-9b23-442a-85e0-96129d4aed8a-console-oauth-config\") pod \"console-f9d7485db-dj4jp\" (UID: \"136fdcc5-9b23-442a-85e0-96129d4aed8a\") " pod="openshift-console/console-f9d7485db-dj4jp" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.899958 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/a907e262-3b15-4019-a715-c45b0e12ac27-plugins-dir\") pod \"csi-hostpathplugin-wxkjb\" (UID: \"a907e262-3b15-4019-a715-c45b0e12ac27\") " pod="hostpath-provisioner/csi-hostpathplugin-wxkjb" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.899987 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.900013 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/a907e262-3b15-4019-a715-c45b0e12ac27-csi-data-dir\") pod \"csi-hostpathplugin-wxkjb\" (UID: \"a907e262-3b15-4019-a715-c45b0e12ac27\") " pod="hostpath-provisioner/csi-hostpathplugin-wxkjb" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.900037 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dc1629dd-242e-43e0-9aaa-8ed4e24cb8c7-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-7lpgh\" (UID: \"dc1629dd-242e-43e0-9aaa-8ed4e24cb8c7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7lpgh" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.900062 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlw89\" (UniqueName: \"kubernetes.io/projected/564a21fe-1256-4854-b5c1-3d407a4e9aaf-kube-api-access-qlw89\") pod 
\"openshift-config-operator-7777fb866f-56bkw\" (UID: \"564a21fe-1256-4854-b5c1-3d407a4e9aaf\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-56bkw" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.900094 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vj5rb\" (UniqueName: \"kubernetes.io/projected/859cd128-2a8a-48e3-b3af-eb0147864b94-kube-api-access-vj5rb\") pod \"catalog-operator-68c6474976-9s7kz\" (UID: \"859cd128-2a8a-48e3-b3af-eb0147864b94\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s7kz" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.900123 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlcnm\" (UniqueName: \"kubernetes.io/projected/e84c0c00-7fd0-44eb-b719-57fca6572497-kube-api-access-qlcnm\") pod \"apiserver-7bbb656c7d-bfqj8\" (UID: \"e84c0c00-7fd0-44eb-b719-57fca6572497\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bfqj8" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.900154 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/89932301-e045-432f-9b1e-d88d5c420fdf-etcd-ca\") pod \"etcd-operator-b45778765-j2nbv\" (UID: \"89932301-e045-432f-9b1e-d88d5c420fdf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-j2nbv" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.900188 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6aceb43-ebd5-4a06-b702-e8236bc7de2d-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-vmg24\" (UID: \"c6aceb43-ebd5-4a06-b702-e8236bc7de2d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vmg24" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.900196 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bd09df5-3f39-4aec-8ff9-35050ec873bd-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5gzxz\" (UID: \"0bd09df5-3f39-4aec-8ff9-35050ec873bd\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5gzxz" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.900212 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt25t\" (UniqueName: \"kubernetes.io/projected/2c02715b-5f00-464a-85e3-4df3043304d6-kube-api-access-pt25t\") pod \"router-default-5444994796-jb5md\" (UID: \"2c02715b-5f00-464a-85e3-4df3043304d6\") " pod="openshift-ingress/router-default-5444994796-jb5md" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.900226 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e84c0c00-7fd0-44eb-b719-57fca6572497-audit-dir\") pod \"apiserver-7bbb656c7d-bfqj8\" (UID: \"e84c0c00-7fd0-44eb-b719-57fca6572497\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bfqj8" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.900258 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e84c0c00-7fd0-44eb-b719-57fca6572497-etcd-client\") pod \"apiserver-7bbb656c7d-bfqj8\" (UID: \"e84c0c00-7fd0-44eb-b719-57fca6572497\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bfqj8" Nov 22 
02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.900333 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/dc1629dd-242e-43e0-9aaa-8ed4e24cb8c7-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-7lpgh\" (UID: \"dc1629dd-242e-43e0-9aaa-8ed4e24cb8c7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7lpgh" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.900392 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ca284dd6-f710-44d3-aa30-a493af7ff84f-signing-cabundle\") pod \"service-ca-9c57cc56f-gjqfl\" (UID: \"ca284dd6-f710-44d3-aa30-a493af7ff84f\") " pod="openshift-service-ca/service-ca-9c57cc56f-gjqfl" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.900441 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14638a05-727a-441a-88f2-f9750aa17a39-registry-tls\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.900471 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rzl9\" (UniqueName: \"kubernetes.io/projected/513c0c01-5777-4fb2-b36d-d35251f0c33e-kube-api-access-2rzl9\") pod \"multus-admission-controller-857f4d67dd-hjlwb\" (UID: \"513c0c01-5777-4fb2-b36d-d35251f0c33e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hjlwb" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.900534 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdcj2\" (UniqueName: \"kubernetes.io/projected/2dc01606-1661-41db-9a31-949d68a1548e-kube-api-access-sdcj2\") pod \"package-server-manager-789f6589d5-4vw74\" (UID: \"2dc01606-1661-41db-9a31-949d68a1548e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4vw74" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.900563 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4238f8cf-8e49-464b-9b6b-3f93b015747b-config-volume\") pod \"dns-default-7bmdv\" (UID: \"4238f8cf-8e49-464b-9b6b-3f93b015747b\") " pod="openshift-dns/dns-default-7bmdv" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.900352 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/151b3d80-db1e-4af2-aa89-b38d9cfe8bea-config\") pod \"route-controller-manager-6576b87f9c-ksllc\" (UID: \"151b3d80-db1e-4af2-aa89-b38d9cfe8bea\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ksllc" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.900664 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/89932301-e045-432f-9b1e-d88d5c420fdf-etcd-service-ca\") pod \"etcd-operator-b45778765-j2nbv\" (UID: \"89932301-e045-432f-9b1e-d88d5c420fdf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-j2nbv" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.900693 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/870b8608-cc44-4883-bc2f-a889dddf467e-config\") pod \"service-ca-operator-777779d784-nxqd9\" (UID: \"870b8608-cc44-4883-bc2f-a889dddf467e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nxqd9" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.900755 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/14638a05-727a-441a-88f2-f9750aa17a39-bound-sa-token\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.900786 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/136fdcc5-9b23-442a-85e0-96129d4aed8a-trusted-ca-bundle\") pod \"console-f9d7485db-dj4jp\" (UID: \"136fdcc5-9b23-442a-85e0-96129d4aed8a\") " pod="openshift-console/console-f9d7485db-dj4jp" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.900812 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/62cd80a8-6faf-48b7-bf44-3b181afd66c6-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-xlb7z\" (UID: \"62cd80a8-6faf-48b7-bf44-3b181afd66c6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xlb7z" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.900876 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/136fdcc5-9b23-442a-85e0-96129d4aed8a-oauth-serving-cert\") pod \"console-f9d7485db-dj4jp\" (UID: \"136fdcc5-9b23-442a-85e0-96129d4aed8a\") " pod="openshift-console/console-f9d7485db-dj4jp" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.900920 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf220517-af5c-4186-9439-585c793c64e3-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-d25rh\" (UID: \"cf220517-af5c-4186-9439-585c793c64e3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d25rh" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.901172 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/136fdcc5-9b23-442a-85e0-96129d4aed8a-console-config\") pod \"console-f9d7485db-dj4jp\" (UID: \"136fdcc5-9b23-442a-85e0-96129d4aed8a\") " pod="openshift-console/console-f9d7485db-dj4jp" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.901238 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/151b3d80-db1e-4af2-aa89-b38d9cfe8bea-client-ca\") pod \"route-controller-manager-6576b87f9c-ksllc\" (UID: \"151b3d80-db1e-4af2-aa89-b38d9cfe8bea\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ksllc" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.901435 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e84c0c00-7fd0-44eb-b719-57fca6572497-audit-policies\") pod 
\"apiserver-7bbb656c7d-bfqj8\" (UID: \"e84c0c00-7fd0-44eb-b719-57fca6572497\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bfqj8" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.902017 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bd09df5-3f39-4aec-8ff9-35050ec873bd-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5gzxz\" (UID: \"0bd09df5-3f39-4aec-8ff9-35050ec873bd\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5gzxz" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.903716 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/564a21fe-1256-4854-b5c1-3d407a4e9aaf-serving-cert\") pod \"openshift-config-operator-7777fb866f-56bkw\" (UID: \"564a21fe-1256-4854-b5c1-3d407a4e9aaf\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-56bkw" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.905105 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/14638a05-727a-441a-88f2-f9750aa17a39-registry-certificates\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.905119 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e84c0c00-7fd0-44eb-b719-57fca6572497-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-bfqj8\" (UID: \"e84c0c00-7fd0-44eb-b719-57fca6572497\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bfqj8" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.905149 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2q4l8\" (UniqueName: \"kubernetes.io/projected/cf220517-af5c-4186-9439-585c793c64e3-kube-api-access-2q4l8\") pod \"openshift-controller-manager-operator-756b6f6bc6-d25rh\" (UID: \"cf220517-af5c-4186-9439-585c793c64e3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d25rh" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.905192 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c9739869-8d63-4aaf-8f03-1605eed08ceb-profile-collector-cert\") pod \"olm-operator-6b444d44fb-n89mb\" (UID: \"c9739869-8d63-4aaf-8f03-1605eed08ceb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n89mb" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.905218 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ca284dd6-f710-44d3-aa30-a493af7ff84f-signing-key\") pod \"service-ca-9c57cc56f-gjqfl\" (UID: \"ca284dd6-f710-44d3-aa30-a493af7ff84f\") " pod="openshift-service-ca/service-ca-9c57cc56f-gjqfl" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.905302 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6aceb43-ebd5-4a06-b702-e8236bc7de2d-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-vmg24\" (UID: \"c6aceb43-ebd5-4a06-b702-e8236bc7de2d\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vmg24" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.905706 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/89932301-e045-432f-9b1e-d88d5c420fdf-etcd-service-ca\") pod \"etcd-operator-b45778765-j2nbv\" (UID: \"89932301-e045-432f-9b1e-d88d5c420fdf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-j2nbv" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.905980 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e84c0c00-7fd0-44eb-b719-57fca6572497-encryption-config\") pod \"apiserver-7bbb656c7d-bfqj8\" (UID: \"e84c0c00-7fd0-44eb-b719-57fca6572497\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bfqj8" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.906248 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/136fdcc5-9b23-442a-85e0-96129d4aed8a-trusted-ca-bundle\") pod \"console-f9d7485db-dj4jp\" (UID: \"136fdcc5-9b23-442a-85e0-96129d4aed8a\") " pod="openshift-console/console-f9d7485db-dj4jp" Nov 22 02:55:09 crc kubenswrapper[4922]: E1122 02:55:09.906480 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:55:10.406466344 +0000 UTC m=+146.444988226 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5hhlh" (UID: "14638a05-727a-441a-88f2-f9750aa17a39") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.906611 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/89932301-e045-432f-9b1e-d88d5c420fdf-etcd-client\") pod \"etcd-operator-b45778765-j2nbv\" (UID: \"89932301-e045-432f-9b1e-d88d5c420fdf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-j2nbv" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.906770 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/dc1629dd-242e-43e0-9aaa-8ed4e24cb8c7-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-7lpgh\" (UID: \"dc1629dd-242e-43e0-9aaa-8ed4e24cb8c7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7lpgh" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.906875 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfrpx\" (UniqueName: \"kubernetes.io/projected/89932301-e045-432f-9b1e-d88d5c420fdf-kube-api-access-kfrpx\") pod \"etcd-operator-b45778765-j2nbv\" (UID: \"89932301-e045-432f-9b1e-d88d5c420fdf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-j2nbv" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.906981 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/e84c0c00-7fd0-44eb-b719-57fca6572497-etcd-client\") pod \"apiserver-7bbb656c7d-bfqj8\" (UID: \"e84c0c00-7fd0-44eb-b719-57fca6572497\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bfqj8" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.907071 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/14638a05-727a-441a-88f2-f9750aa17a39-trusted-ca\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.907125 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/70f825ae-ce20-4eb1-a7dc-1468a552f2d0-auth-proxy-config\") pod \"machine-config-operator-74547568cd-9jfnp\" (UID: \"70f825ae-ce20-4eb1-a7dc-1468a552f2d0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9jfnp" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.907225 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fd5c7d8c-ecfb-40af-990a-7bb86b045533-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-86q9x\" (UID: \"fd5c7d8c-ecfb-40af-990a-7bb86b045533\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-86q9x" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.907292 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89932301-e045-432f-9b1e-d88d5c420fdf-serving-cert\") pod \"etcd-operator-b45778765-j2nbv\" (UID: \"89932301-e045-432f-9b1e-d88d5c420fdf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-j2nbv" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.908137 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dc1629dd-242e-43e0-9aaa-8ed4e24cb8c7-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-7lpgh\" (UID: \"dc1629dd-242e-43e0-9aaa-8ed4e24cb8c7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7lpgh" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.908545 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/14638a05-727a-441a-88f2-f9750aa17a39-registry-certificates\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.908587 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/89932301-e045-432f-9b1e-d88d5c420fdf-etcd-ca\") pod \"etcd-operator-b45778765-j2nbv\" (UID: \"89932301-e045-432f-9b1e-d88d5c420fdf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-j2nbv" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.909043 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/14638a05-727a-441a-88f2-f9750aa17a39-trusted-ca\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.911224 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/14638a05-727a-441a-88f2-f9750aa17a39-installation-pull-secrets\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.911290 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bc859e82-4e51-4cef-844a-17ada349c77a-metrics-tls\") pod \"dns-operator-744455d44c-6rz2q\" (UID: \"bc859e82-4e51-4cef-844a-17ada349c77a\") " pod="openshift-dns-operator/dns-operator-744455d44c-6rz2q" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.911602 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf220517-af5c-4186-9439-585c793c64e3-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-d25rh\" (UID: \"cf220517-af5c-4186-9439-585c793c64e3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d25rh" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.911678 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/136fdcc5-9b23-442a-85e0-96129d4aed8a-console-oauth-config\") pod \"console-f9d7485db-dj4jp\" (UID: \"136fdcc5-9b23-442a-85e0-96129d4aed8a\") " pod="openshift-console/console-f9d7485db-dj4jp" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.911698 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89932301-e045-432f-9b1e-d88d5c420fdf-serving-cert\") pod \"etcd-operator-b45778765-j2nbv\" (UID: \"89932301-e045-432f-9b1e-d88d5c420fdf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-j2nbv" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.912006 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e84c0c00-7fd0-44eb-b719-57fca6572497-serving-cert\") pod \"apiserver-7bbb656c7d-bfqj8\" (UID: \"e84c0c00-7fd0-44eb-b719-57fca6572497\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bfqj8" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.913703 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/151b3d80-db1e-4af2-aa89-b38d9cfe8bea-serving-cert\") pod \"route-controller-manager-6576b87f9c-ksllc\" (UID: \"151b3d80-db1e-4af2-aa89-b38d9cfe8bea\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ksllc" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.916522 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/136fdcc5-9b23-442a-85e0-96129d4aed8a-console-serving-cert\") pod \"console-f9d7485db-dj4jp\" (UID: \"136fdcc5-9b23-442a-85e0-96129d4aed8a\") " pod="openshift-console/console-f9d7485db-dj4jp" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.918456 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14638a05-727a-441a-88f2-f9750aa17a39-registry-tls\") pod 
\"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.922881 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/771457a9-fee5-496f-88be-b6cea42cc92a-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-7fgw4\" (UID: \"771457a9-fee5-496f-88be-b6cea42cc92a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7fgw4" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.936838 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-k5v9d"] Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.955983 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dc1629dd-242e-43e0-9aaa-8ed4e24cb8c7-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-7lpgh\" (UID: \"dc1629dd-242e-43e0-9aaa-8ed4e24cb8c7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7lpgh" Nov 22 02:55:09 crc kubenswrapper[4922]: I1122 02:55:09.969561 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p6ktw"] Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.004171 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mvvp\" (UniqueName: \"kubernetes.io/projected/e5063ff0-adff-488b-9c11-595be817e952-kube-api-access-8mvvp\") pod \"migrator-59844c95c7-lvltf\" (UID: \"e5063ff0-adff-488b-9c11-595be817e952\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lvltf" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.006389 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkvk9\" (UniqueName: \"kubernetes.io/projected/136fdcc5-9b23-442a-85e0-96129d4aed8a-kube-api-access-jkvk9\") pod \"console-f9d7485db-dj4jp\" (UID: \"136fdcc5-9b23-442a-85e0-96129d4aed8a\") " pod="openshift-console/console-f9d7485db-dj4jp" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.008310 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:55:10 crc kubenswrapper[4922]: E1122 02:55:10.008431 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:55:10.508401502 +0000 UTC m=+146.546923394 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.008528 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4zkk\" (UniqueName: \"kubernetes.io/projected/870b8608-cc44-4883-bc2f-a889dddf467e-kube-api-access-s4zkk\") pod \"service-ca-operator-777779d784-nxqd9\" (UID: \"870b8608-cc44-4883-bc2f-a889dddf467e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nxqd9" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.008558 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/2c02715b-5f00-464a-85e3-4df3043304d6-stats-auth\") pod \"router-default-5444994796-jb5md\" (UID: \"2c02715b-5f00-464a-85e3-4df3043304d6\") " pod="openshift-ingress/router-default-5444994796-jb5md" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.008579 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd5c7d8c-ecfb-40af-990a-7bb86b045533-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-86q9x\" (UID: \"fd5c7d8c-ecfb-40af-990a-7bb86b045533\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-86q9x" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.008598 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmfbv\" (UniqueName: \"kubernetes.io/projected/70f825ae-ce20-4eb1-a7dc-1468a552f2d0-kube-api-access-wmfbv\") pod \"machine-config-operator-74547568cd-9jfnp\" (UID: \"70f825ae-ce20-4eb1-a7dc-1468a552f2d0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9jfnp" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.008625 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/1a9735e4-9fac-4075-afce-68bb38057c12-certs\") pod \"machine-config-server-s8lrv\" (UID: \"1a9735e4-9fac-4075-afce-68bb38057c12\") " pod="openshift-machine-config-operator/machine-config-server-s8lrv" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.008644 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a907e262-3b15-4019-a715-c45b0e12ac27-registration-dir\") pod \"csi-hostpathplugin-wxkjb\" (UID: \"a907e262-3b15-4019-a715-c45b0e12ac27\") " pod="hostpath-provisioner/csi-hostpathplugin-wxkjb" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.008663 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxt6z\" (UniqueName: \"kubernetes.io/projected/1a9735e4-9fac-4075-afce-68bb38057c12-kube-api-access-pxt6z\") pod \"machine-config-server-s8lrv\" (UID: \"1a9735e4-9fac-4075-afce-68bb38057c12\") " pod="openshift-machine-config-operator/machine-config-server-s8lrv" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.008678 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-2bbpq\" (UniqueName: \"kubernetes.io/projected/156920b9-f91f-4053-be05-3be9c55f09b1-kube-api-access-2bbpq\") pod \"marketplace-operator-79b997595-cm895\" (UID: \"156920b9-f91f-4053-be05-3be9c55f09b1\") " pod="openshift-marketplace/marketplace-operator-79b997595-cm895" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.008696 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/513c0c01-5777-4fb2-b36d-d35251f0c33e-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-hjlwb\" (UID: \"513c0c01-5777-4fb2-b36d-d35251f0c33e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hjlwb" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.008719 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/70f825ae-ce20-4eb1-a7dc-1468a552f2d0-proxy-tls\") pod \"machine-config-operator-74547568cd-9jfnp\" (UID: \"70f825ae-ce20-4eb1-a7dc-1468a552f2d0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9jfnp" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.008738 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c02715b-5f00-464a-85e3-4df3043304d6-metrics-certs\") pod \"router-default-5444994796-jb5md\" (UID: \"2c02715b-5f00-464a-85e3-4df3043304d6\") " pod="openshift-ingress/router-default-5444994796-jb5md" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.008757 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/936d6833-0d47-4fdd-957a-b85f6dcd186c-apiservice-cert\") pod \"packageserver-d55dfcdfc-4hrnn\" (UID: \"936d6833-0d47-4fdd-957a-b85f6dcd186c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4hrnn" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.008775 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwrlx\" (UniqueName: \"kubernetes.io/projected/936d6833-0d47-4fdd-957a-b85f6dcd186c-kube-api-access-kwrlx\") pod \"packageserver-d55dfcdfc-4hrnn\" (UID: \"936d6833-0d47-4fdd-957a-b85f6dcd186c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4hrnn" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.008792 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/25a50f57-bc62-4fc0-8410-fc80535adb55-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-n5wk8\" (UID: \"25a50f57-bc62-4fc0-8410-fc80535adb55\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n5wk8" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.008810 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g44sg\" (UniqueName: \"kubernetes.io/projected/a907e262-3b15-4019-a715-c45b0e12ac27-kube-api-access-g44sg\") pod \"csi-hostpathplugin-wxkjb\" (UID: \"a907e262-3b15-4019-a715-c45b0e12ac27\") " pod="hostpath-provisioner/csi-hostpathplugin-wxkjb" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.008837 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/1a9735e4-9fac-4075-afce-68bb38057c12-node-bootstrap-token\") pod \"machine-config-server-s8lrv\" (UID: 
\"1a9735e4-9fac-4075-afce-68bb38057c12\") " pod="openshift-machine-config-operator/machine-config-server-s8lrv" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.008869 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5gxb\" (UniqueName: \"kubernetes.io/projected/abb64422-1cc8-4858-a3fa-ca843be8687c-kube-api-access-d5gxb\") pod \"ingress-canary-4kqxp\" (UID: \"abb64422-1cc8-4858-a3fa-ca843be8687c\") " pod="openshift-ingress-canary/ingress-canary-4kqxp" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.008886 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/2c02715b-5f00-464a-85e3-4df3043304d6-default-certificate\") pod \"router-default-5444994796-jb5md\" (UID: \"2c02715b-5f00-464a-85e3-4df3043304d6\") " pod="openshift-ingress/router-default-5444994796-jb5md" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.008901 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/936d6833-0d47-4fdd-957a-b85f6dcd186c-webhook-cert\") pod \"packageserver-d55dfcdfc-4hrnn\" (UID: \"936d6833-0d47-4fdd-957a-b85f6dcd186c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4hrnn" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.008917 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/abb64422-1cc8-4858-a3fa-ca843be8687c-cert\") pod \"ingress-canary-4kqxp\" (UID: \"abb64422-1cc8-4858-a3fa-ca843be8687c\") " pod="openshift-ingress-canary/ingress-canary-4kqxp" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.008932 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhvx2\" (UniqueName: \"kubernetes.io/projected/4238f8cf-8e49-464b-9b6b-3f93b015747b-kube-api-access-xhvx2\") pod \"dns-default-7bmdv\" (UID: \"4238f8cf-8e49-464b-9b6b-3f93b015747b\") " pod="openshift-dns/dns-default-7bmdv" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.008949 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfrnc\" (UniqueName: \"kubernetes.io/projected/62cd80a8-6faf-48b7-bf44-3b181afd66c6-kube-api-access-gfrnc\") pod \"control-plane-machine-set-operator-78cbb6b69f-xlb7z\" (UID: \"62cd80a8-6faf-48b7-bf44-3b181afd66c6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xlb7z" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.008964 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c9739869-8d63-4aaf-8f03-1605eed08ceb-srv-cert\") pod \"olm-operator-6b444d44fb-n89mb\" (UID: \"c9739869-8d63-4aaf-8f03-1605eed08ceb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n89mb" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.008980 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrpk7\" (UniqueName: \"kubernetes.io/projected/25a50f57-bc62-4fc0-8410-fc80535adb55-kube-api-access-jrpk7\") pod \"machine-config-controller-84d6567774-n5wk8\" (UID: \"25a50f57-bc62-4fc0-8410-fc80535adb55\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n5wk8" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.008995 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/156920b9-f91f-4053-be05-3be9c55f09b1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-cm895\" (UID: \"156920b9-f91f-4053-be05-3be9c55f09b1\") " pod="openshift-marketplace/marketplace-operator-79b997595-cm895" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.009016 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mvx2\" (UniqueName: \"kubernetes.io/projected/a5e7f9ae-8050-4602-b4f3-53a73ec0b60f-kube-api-access-2mvx2\") pod \"collect-profiles-29396325-vl7tv\" (UID: \"a5e7f9ae-8050-4602-b4f3-53a73ec0b60f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396325-vl7tv" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.009046 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/859cd128-2a8a-48e3-b3af-eb0147864b94-srv-cert\") pod \"catalog-operator-68c6474976-9s7kz\" (UID: \"859cd128-2a8a-48e3-b3af-eb0147864b94\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s7kz" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.009062 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a5e7f9ae-8050-4602-b4f3-53a73ec0b60f-secret-volume\") pod \"collect-profiles-29396325-vl7tv\" (UID: \"a5e7f9ae-8050-4602-b4f3-53a73ec0b60f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396325-vl7tv" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.009079 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/156920b9-f91f-4053-be05-3be9c55f09b1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-cm895\" (UID: \"156920b9-f91f-4053-be05-3be9c55f09b1\") " pod="openshift-marketplace/marketplace-operator-79b997595-cm895" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.009096 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a5e7f9ae-8050-4602-b4f3-53a73ec0b60f-config-volume\") pod \"collect-profiles-29396325-vl7tv\" (UID: \"a5e7f9ae-8050-4602-b4f3-53a73ec0b60f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396325-vl7tv" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.009110 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd5c7d8c-ecfb-40af-990a-7bb86b045533-config\") pod \"kube-controller-manager-operator-78b949d7b-86q9x\" (UID: \"fd5c7d8c-ecfb-40af-990a-7bb86b045533\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-86q9x" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.009132 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/859cd128-2a8a-48e3-b3af-eb0147864b94-profile-collector-cert\") pod \"catalog-operator-68c6474976-9s7kz\" (UID: \"859cd128-2a8a-48e3-b3af-eb0147864b94\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s7kz" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.009150 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/2dc01606-1661-41db-9a31-949d68a1548e-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-4vw74\" (UID: \"2dc01606-1661-41db-9a31-949d68a1548e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4vw74" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.009165 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a907e262-3b15-4019-a715-c45b0e12ac27-socket-dir\") pod \"csi-hostpathplugin-wxkjb\" (UID: \"a907e262-3b15-4019-a715-c45b0e12ac27\") " pod="hostpath-provisioner/csi-hostpathplugin-wxkjb" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.009181 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlrzb\" (UniqueName: \"kubernetes.io/projected/ca284dd6-f710-44d3-aa30-a493af7ff84f-kube-api-access-wlrzb\") pod \"service-ca-9c57cc56f-gjqfl\" (UID: \"ca284dd6-f710-44d3-aa30-a493af7ff84f\") " pod="openshift-service-ca/service-ca-9c57cc56f-gjqfl" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.009199 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x76qk\" (UniqueName: \"kubernetes.io/projected/c9739869-8d63-4aaf-8f03-1605eed08ceb-kube-api-access-x76qk\") pod \"olm-operator-6b444d44fb-n89mb\" (UID: \"c9739869-8d63-4aaf-8f03-1605eed08ceb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n89mb" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.009217 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c02715b-5f00-464a-85e3-4df3043304d6-service-ca-bundle\") pod \"router-default-5444994796-jb5md\" (UID: \"2c02715b-5f00-464a-85e3-4df3043304d6\") " pod="openshift-ingress/router-default-5444994796-jb5md" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.009233 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/a907e262-3b15-4019-a715-c45b0e12ac27-mountpoint-dir\") pod \"csi-hostpathplugin-wxkjb\" (UID: \"a907e262-3b15-4019-a715-c45b0e12ac27\") " pod="hostpath-provisioner/csi-hostpathplugin-wxkjb" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.009251 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/870b8608-cc44-4883-bc2f-a889dddf467e-serving-cert\") pod \"service-ca-operator-777779d784-nxqd9\" (UID: \"870b8608-cc44-4883-bc2f-a889dddf467e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nxqd9" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.009266 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4238f8cf-8e49-464b-9b6b-3f93b015747b-metrics-tls\") pod \"dns-default-7bmdv\" (UID: \"4238f8cf-8e49-464b-9b6b-3f93b015747b\") " pod="openshift-dns/dns-default-7bmdv" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.009304 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/70f825ae-ce20-4eb1-a7dc-1468a552f2d0-images\") pod \"machine-config-operator-74547568cd-9jfnp\" (UID: \"70f825ae-ce20-4eb1-a7dc-1468a552f2d0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9jfnp" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 
02:55:10.009333 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/936d6833-0d47-4fdd-957a-b85f6dcd186c-tmpfs\") pod \"packageserver-d55dfcdfc-4hrnn\" (UID: \"936d6833-0d47-4fdd-957a-b85f6dcd186c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4hrnn" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.009360 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/25a50f57-bc62-4fc0-8410-fc80535adb55-proxy-tls\") pod \"machine-config-controller-84d6567774-n5wk8\" (UID: \"25a50f57-bc62-4fc0-8410-fc80535adb55\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n5wk8" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.009388 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.009407 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/a907e262-3b15-4019-a715-c45b0e12ac27-plugins-dir\") pod \"csi-hostpathplugin-wxkjb\" (UID: \"a907e262-3b15-4019-a715-c45b0e12ac27\") " pod="hostpath-provisioner/csi-hostpathplugin-wxkjb" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.009420 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/a907e262-3b15-4019-a715-c45b0e12ac27-csi-data-dir\") pod \"csi-hostpathplugin-wxkjb\" (UID: \"a907e262-3b15-4019-a715-c45b0e12ac27\") " pod="hostpath-provisioner/csi-hostpathplugin-wxkjb" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.009446 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vj5rb\" (UniqueName: \"kubernetes.io/projected/859cd128-2a8a-48e3-b3af-eb0147864b94-kube-api-access-vj5rb\") pod \"catalog-operator-68c6474976-9s7kz\" (UID: \"859cd128-2a8a-48e3-b3af-eb0147864b94\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s7kz" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.009468 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt25t\" (UniqueName: \"kubernetes.io/projected/2c02715b-5f00-464a-85e3-4df3043304d6-kube-api-access-pt25t\") pod \"router-default-5444994796-jb5md\" (UID: \"2c02715b-5f00-464a-85e3-4df3043304d6\") " pod="openshift-ingress/router-default-5444994796-jb5md" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.009491 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ca284dd6-f710-44d3-aa30-a493af7ff84f-signing-cabundle\") pod \"service-ca-9c57cc56f-gjqfl\" (UID: \"ca284dd6-f710-44d3-aa30-a493af7ff84f\") " pod="openshift-service-ca/service-ca-9c57cc56f-gjqfl" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.009529 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rzl9\" (UniqueName: \"kubernetes.io/projected/513c0c01-5777-4fb2-b36d-d35251f0c33e-kube-api-access-2rzl9\") pod 
\"multus-admission-controller-857f4d67dd-hjlwb\" (UID: \"513c0c01-5777-4fb2-b36d-d35251f0c33e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hjlwb" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.009552 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdcj2\" (UniqueName: \"kubernetes.io/projected/2dc01606-1661-41db-9a31-949d68a1548e-kube-api-access-sdcj2\") pod \"package-server-manager-789f6589d5-4vw74\" (UID: \"2dc01606-1661-41db-9a31-949d68a1548e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4vw74" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.009573 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4238f8cf-8e49-464b-9b6b-3f93b015747b-config-volume\") pod \"dns-default-7bmdv\" (UID: \"4238f8cf-8e49-464b-9b6b-3f93b015747b\") " pod="openshift-dns/dns-default-7bmdv" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.009591 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/870b8608-cc44-4883-bc2f-a889dddf467e-config\") pod \"service-ca-operator-777779d784-nxqd9\" (UID: \"870b8608-cc44-4883-bc2f-a889dddf467e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nxqd9" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.009616 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/62cd80a8-6faf-48b7-bf44-3b181afd66c6-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-xlb7z\" (UID: \"62cd80a8-6faf-48b7-bf44-3b181afd66c6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xlb7z" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.009632 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/771457a9-fee5-496f-88be-b6cea42cc92a-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-7fgw4\" (UID: \"771457a9-fee5-496f-88be-b6cea42cc92a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7fgw4" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.009647 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c9739869-8d63-4aaf-8f03-1605eed08ceb-profile-collector-cert\") pod \"olm-operator-6b444d44fb-n89mb\" (UID: \"c9739869-8d63-4aaf-8f03-1605eed08ceb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n89mb" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.009733 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ca284dd6-f710-44d3-aa30-a493af7ff84f-signing-key\") pod \"service-ca-9c57cc56f-gjqfl\" (UID: \"ca284dd6-f710-44d3-aa30-a493af7ff84f\") " pod="openshift-service-ca/service-ca-9c57cc56f-gjqfl" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.009760 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a907e262-3b15-4019-a715-c45b0e12ac27-registration-dir\") pod \"csi-hostpathplugin-wxkjb\" (UID: \"a907e262-3b15-4019-a715-c45b0e12ac27\") " pod="hostpath-provisioner/csi-hostpathplugin-wxkjb" Nov 22 02:55:10 crc 
kubenswrapper[4922]: I1122 02:55:10.009784 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/70f825ae-ce20-4eb1-a7dc-1468a552f2d0-auth-proxy-config\") pod \"machine-config-operator-74547568cd-9jfnp\" (UID: \"70f825ae-ce20-4eb1-a7dc-1468a552f2d0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9jfnp" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.009810 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fd5c7d8c-ecfb-40af-990a-7bb86b045533-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-86q9x\" (UID: \"fd5c7d8c-ecfb-40af-990a-7bb86b045533\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-86q9x" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.010153 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/a907e262-3b15-4019-a715-c45b0e12ac27-mountpoint-dir\") pod \"csi-hostpathplugin-wxkjb\" (UID: \"a907e262-3b15-4019-a715-c45b0e12ac27\") " pod="hostpath-provisioner/csi-hostpathplugin-wxkjb" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.010232 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a907e262-3b15-4019-a715-c45b0e12ac27-socket-dir\") pod \"csi-hostpathplugin-wxkjb\" (UID: \"a907e262-3b15-4019-a715-c45b0e12ac27\") " pod="hostpath-provisioner/csi-hostpathplugin-wxkjb" Nov 22 02:55:10 crc kubenswrapper[4922]: E1122 02:55:10.010451 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:55:10.51043965 +0000 UTC m=+146.548961542 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5hhlh" (UID: "14638a05-727a-441a-88f2-f9750aa17a39") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.012893 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/936d6833-0d47-4fdd-957a-b85f6dcd186c-tmpfs\") pod \"packageserver-d55dfcdfc-4hrnn\" (UID: \"936d6833-0d47-4fdd-957a-b85f6dcd186c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4hrnn" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.015430 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/a907e262-3b15-4019-a715-c45b0e12ac27-plugins-dir\") pod \"csi-hostpathplugin-wxkjb\" (UID: \"a907e262-3b15-4019-a715-c45b0e12ac27\") " pod="hostpath-provisioner/csi-hostpathplugin-wxkjb" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.015536 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/a907e262-3b15-4019-a715-c45b0e12ac27-csi-data-dir\") pod \"csi-hostpathplugin-wxkjb\" (UID: \"a907e262-3b15-4019-a715-c45b0e12ac27\") " pod="hostpath-provisioner/csi-hostpathplugin-wxkjb" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.015966 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/70f825ae-ce20-4eb1-a7dc-1468a552f2d0-auth-proxy-config\") pod \"machine-config-operator-74547568cd-9jfnp\" (UID: \"70f825ae-ce20-4eb1-a7dc-1468a552f2d0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9jfnp" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.016426 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ca284dd6-f710-44d3-aa30-a493af7ff84f-signing-cabundle\") pod \"service-ca-9c57cc56f-gjqfl\" (UID: \"ca284dd6-f710-44d3-aa30-a493af7ff84f\") " pod="openshift-service-ca/service-ca-9c57cc56f-gjqfl" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.017103 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/70f825ae-ce20-4eb1-a7dc-1468a552f2d0-images\") pod \"machine-config-operator-74547568cd-9jfnp\" (UID: \"70f825ae-ce20-4eb1-a7dc-1468a552f2d0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9jfnp" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.017234 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4238f8cf-8e49-464b-9b6b-3f93b015747b-config-volume\") pod \"dns-default-7bmdv\" (UID: \"4238f8cf-8e49-464b-9b6b-3f93b015747b\") " pod="openshift-dns/dns-default-7bmdv" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.017323 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4238f8cf-8e49-464b-9b6b-3f93b015747b-metrics-tls\") pod \"dns-default-7bmdv\" (UID: \"4238f8cf-8e49-464b-9b6b-3f93b015747b\") " 
pod="openshift-dns/dns-default-7bmdv" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.017535 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/70f825ae-ce20-4eb1-a7dc-1468a552f2d0-proxy-tls\") pod \"machine-config-operator-74547568cd-9jfnp\" (UID: \"70f825ae-ce20-4eb1-a7dc-1468a552f2d0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9jfnp" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.017805 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/870b8608-cc44-4883-bc2f-a889dddf467e-config\") pod \"service-ca-operator-777779d784-nxqd9\" (UID: \"870b8608-cc44-4883-bc2f-a889dddf467e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nxqd9" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.018212 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ca284dd6-f710-44d3-aa30-a493af7ff84f-signing-key\") pod \"service-ca-9c57cc56f-gjqfl\" (UID: \"ca284dd6-f710-44d3-aa30-a493af7ff84f\") " pod="openshift-service-ca/service-ca-9c57cc56f-gjqfl" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.018588 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/25a50f57-bc62-4fc0-8410-fc80535adb55-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-n5wk8\" (UID: \"25a50f57-bc62-4fc0-8410-fc80535adb55\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n5wk8" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.019569 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2dc01606-1661-41db-9a31-949d68a1548e-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-4vw74\" (UID: \"2dc01606-1661-41db-9a31-949d68a1548e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4vw74" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.025809 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/62cd80a8-6faf-48b7-bf44-3b181afd66c6-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-xlb7z\" (UID: \"62cd80a8-6faf-48b7-bf44-3b181afd66c6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xlb7z" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.026470 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c9739869-8d63-4aaf-8f03-1605eed08ceb-profile-collector-cert\") pod \"olm-operator-6b444d44fb-n89mb\" (UID: \"c9739869-8d63-4aaf-8f03-1605eed08ceb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n89mb" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.026800 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-gsfk7"] Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.026948 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c02715b-5f00-464a-85e3-4df3043304d6-metrics-certs\") pod \"router-default-5444994796-jb5md\" (UID: 
\"2c02715b-5f00-464a-85e3-4df3043304d6\") " pod="openshift-ingress/router-default-5444994796-jb5md" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.027239 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/513c0c01-5777-4fb2-b36d-d35251f0c33e-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-hjlwb\" (UID: \"513c0c01-5777-4fb2-b36d-d35251f0c33e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hjlwb" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.027703 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a5e7f9ae-8050-4602-b4f3-53a73ec0b60f-config-volume\") pod \"collect-profiles-29396325-vl7tv\" (UID: \"a5e7f9ae-8050-4602-b4f3-53a73ec0b60f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396325-vl7tv" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.027992 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tctps\" (UniqueName: \"kubernetes.io/projected/dc1629dd-242e-43e0-9aaa-8ed4e24cb8c7-kube-api-access-tctps\") pod \"cluster-image-registry-operator-dc59b4c8b-7lpgh\" (UID: \"dc1629dd-242e-43e0-9aaa-8ed4e24cb8c7\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7lpgh" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.028019 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd5c7d8c-ecfb-40af-990a-7bb86b045533-config\") pod \"kube-controller-manager-operator-78b949d7b-86q9x\" (UID: \"fd5c7d8c-ecfb-40af-990a-7bb86b045533\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-86q9x" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.028161 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c02715b-5f00-464a-85e3-4df3043304d6-service-ca-bundle\") pod \"router-default-5444994796-jb5md\" (UID: \"2c02715b-5f00-464a-85e3-4df3043304d6\") " pod="openshift-ingress/router-default-5444994796-jb5md" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.028937 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/156920b9-f91f-4053-be05-3be9c55f09b1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-cm895\" (UID: \"156920b9-f91f-4053-be05-3be9c55f09b1\") " pod="openshift-marketplace/marketplace-operator-79b997595-cm895" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.029594 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/1a9735e4-9fac-4075-afce-68bb38057c12-node-bootstrap-token\") pod \"machine-config-server-s8lrv\" (UID: \"1a9735e4-9fac-4075-afce-68bb38057c12\") " pod="openshift-machine-config-operator/machine-config-server-s8lrv" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.030031 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c9739869-8d63-4aaf-8f03-1605eed08ceb-srv-cert\") pod \"olm-operator-6b444d44fb-n89mb\" (UID: \"c9739869-8d63-4aaf-8f03-1605eed08ceb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n89mb" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.030033 4922 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/25a50f57-bc62-4fc0-8410-fc80535adb55-proxy-tls\") pod \"machine-config-controller-84d6567774-n5wk8\" (UID: \"25a50f57-bc62-4fc0-8410-fc80535adb55\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n5wk8" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.030136 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/2c02715b-5f00-464a-85e3-4df3043304d6-stats-auth\") pod \"router-default-5444994796-jb5md\" (UID: \"2c02715b-5f00-464a-85e3-4df3043304d6\") " pod="openshift-ingress/router-default-5444994796-jb5md" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.030138 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/859cd128-2a8a-48e3-b3af-eb0147864b94-srv-cert\") pod \"catalog-operator-68c6474976-9s7kz\" (UID: \"859cd128-2a8a-48e3-b3af-eb0147864b94\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s7kz" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.030482 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/870b8608-cc44-4883-bc2f-a889dddf467e-serving-cert\") pod \"service-ca-operator-777779d784-nxqd9\" (UID: \"870b8608-cc44-4883-bc2f-a889dddf467e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nxqd9" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.030572 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/2c02715b-5f00-464a-85e3-4df3043304d6-default-certificate\") pod \"router-default-5444994796-jb5md\" (UID: \"2c02715b-5f00-464a-85e3-4df3043304d6\") " pod="openshift-ingress/router-default-5444994796-jb5md" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.031030 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd5c7d8c-ecfb-40af-990a-7bb86b045533-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-86q9x\" (UID: \"fd5c7d8c-ecfb-40af-990a-7bb86b045533\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-86q9x" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.031105 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/859cd128-2a8a-48e3-b3af-eb0147864b94-profile-collector-cert\") pod \"catalog-operator-68c6474976-9s7kz\" (UID: \"859cd128-2a8a-48e3-b3af-eb0147864b94\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s7kz" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.031222 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/1a9735e4-9fac-4075-afce-68bb38057c12-certs\") pod \"machine-config-server-s8lrv\" (UID: \"1a9735e4-9fac-4075-afce-68bb38057c12\") " pod="openshift-machine-config-operator/machine-config-server-s8lrv" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.031426 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/936d6833-0d47-4fdd-957a-b85f6dcd186c-webhook-cert\") pod \"packageserver-d55dfcdfc-4hrnn\" (UID: \"936d6833-0d47-4fdd-957a-b85f6dcd186c\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4hrnn" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.031513 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/156920b9-f91f-4053-be05-3be9c55f09b1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-cm895\" (UID: \"156920b9-f91f-4053-be05-3be9c55f09b1\") " pod="openshift-marketplace/marketplace-operator-79b997595-cm895" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.033147 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a5e7f9ae-8050-4602-b4f3-53a73ec0b60f-secret-volume\") pod \"collect-profiles-29396325-vl7tv\" (UID: \"a5e7f9ae-8050-4602-b4f3-53a73ec0b60f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396325-vl7tv" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.034228 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/abb64422-1cc8-4858-a3fa-ca843be8687c-cert\") pod \"ingress-canary-4kqxp\" (UID: \"abb64422-1cc8-4858-a3fa-ca843be8687c\") " pod="openshift-ingress-canary/ingress-canary-4kqxp" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.035164 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/936d6833-0d47-4fdd-957a-b85f6dcd186c-apiservice-cert\") pod \"packageserver-d55dfcdfc-4hrnn\" (UID: \"936d6833-0d47-4fdd-957a-b85f6dcd186c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4hrnn" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.042078 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gztw5\" (UniqueName: \"kubernetes.io/projected/0f5e24eb-19ec-4a6e-9b72-ded8f180b673-kube-api-access-gztw5\") pod \"downloads-7954f5f757-n88n6\" (UID: \"0f5e24eb-19ec-4a6e-9b72-ded8f180b673\") " pod="openshift-console/downloads-7954f5f757-n88n6" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.062146 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-qqgzd"] Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.062755 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-n88n6" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.075515 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9twp5\" (UniqueName: \"kubernetes.io/projected/c6aceb43-ebd5-4a06-b702-e8236bc7de2d-kube-api-access-9twp5\") pod \"kube-storage-version-migrator-operator-b67b599dd-vmg24\" (UID: \"c6aceb43-ebd5-4a06-b702-e8236bc7de2d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vmg24" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.088353 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2r6j\" (UniqueName: \"kubernetes.io/projected/14638a05-727a-441a-88f2-f9750aa17a39-kube-api-access-x2r6j\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.090641 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-skmr7"] Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.092926 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7zzh7"] Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.094966 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-lns87"] Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.095422 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-mwb28"] Nov 22 02:55:10 crc kubenswrapper[4922]: W1122 02:55:10.099685 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6ef9de0_f249_438a_94c0_0a359bd88889.slice/crio-d19f124020860710156a6dd40b3a9400d0ccfccd3a81cda73dd65cc7e29c0be4 WatchSource:0}: Error finding container d19f124020860710156a6dd40b3a9400d0ccfccd3a81cda73dd65cc7e29c0be4: Status 404 returned error can't find the container with id d19f124020860710156a6dd40b3a9400d0ccfccd3a81cda73dd65cc7e29c0be4 Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.104901 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvw7h\" (UniqueName: \"kubernetes.io/projected/bc859e82-4e51-4cef-844a-17ada349c77a-kube-api-access-jvw7h\") pod \"dns-operator-744455d44c-6rz2q\" (UID: \"bc859e82-4e51-4cef-844a-17ada349c77a\") " pod="openshift-dns-operator/dns-operator-744455d44c-6rz2q" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.110438 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:55:10 crc kubenswrapper[4922]: E1122 02:55:10.110577 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:55:10.610531544 +0000 UTC m=+146.649053436 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.110800 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:10 crc kubenswrapper[4922]: E1122 02:55:10.112460 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:55:10.612434889 +0000 UTC m=+146.650956871 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5hhlh" (UID: "14638a05-727a-441a-88f2-f9750aa17a39") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:10 crc kubenswrapper[4922]: W1122 02:55:10.114356 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a67fd92_1760_46af_a0b4_0a52c965c63e.slice/crio-cf1904f502b333fd2eac98cb576368f53e46e7615c7edb16e3e7c5f9f671844c WatchSource:0}: Error finding container cf1904f502b333fd2eac98cb576368f53e46e7615c7edb16e3e7c5f9f671844c: Status 404 returned error can't find the container with id cf1904f502b333fd2eac98cb576368f53e46e7615c7edb16e3e7c5f9f671844c Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.115183 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-dj4jp" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.116190 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mqkh\" (UniqueName: \"kubernetes.io/projected/151b3d80-db1e-4af2-aa89-b38d9cfe8bea-kube-api-access-2mqkh\") pod \"route-controller-manager-6576b87f9c-ksllc\" (UID: \"151b3d80-db1e-4af2-aa89-b38d9cfe8bea\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ksllc" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.124289 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lvltf" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.136706 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7fgw4" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.139337 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlcnm\" (UniqueName: \"kubernetes.io/projected/e84c0c00-7fd0-44eb-b719-57fca6572497-kube-api-access-qlcnm\") pod \"apiserver-7bbb656c7d-bfqj8\" (UID: \"e84c0c00-7fd0-44eb-b719-57fca6572497\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bfqj8" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.144730 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-6rz2q" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.151019 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7lpgh" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.157646 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vmg24" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.158254 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlw89\" (UniqueName: \"kubernetes.io/projected/564a21fe-1256-4854-b5c1-3d407a4e9aaf-kube-api-access-qlw89\") pod \"openshift-config-operator-7777fb866f-56bkw\" (UID: \"564a21fe-1256-4854-b5c1-3d407a4e9aaf\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-56bkw" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.187270 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/14638a05-727a-441a-88f2-f9750aa17a39-bound-sa-token\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.202151 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0bd09df5-3f39-4aec-8ff9-35050ec873bd-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5gzxz\" (UID: \"0bd09df5-3f39-4aec-8ff9-35050ec873bd\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5gzxz" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.211315 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:55:10 crc kubenswrapper[4922]: E1122 02:55:10.211858 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:55:10.711830288 +0000 UTC m=+146.750352180 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.215728 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q4l8\" (UniqueName: \"kubernetes.io/projected/cf220517-af5c-4186-9439-585c793c64e3-kube-api-access-2q4l8\") pod \"openshift-controller-manager-operator-756b6f6bc6-d25rh\" (UID: \"cf220517-af5c-4186-9439-585c793c64e3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d25rh" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.224337 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lns87" event={"ID":"c93e87aa-7ece-4e0d-9c17-e4bc0b5bf0a3","Type":"ContainerStarted","Data":"d30f86eaa9a3a5f7623314f5159f4440bd8a9def2f1703f22bddabb2f983ae64"} Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.228332 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-gsfk7" event={"ID":"9507632c-3232-44dc-a75f-1275a2f57145","Type":"ContainerStarted","Data":"18783c8d841393fb9721bbcccf0299b510b29642c78facd37d6c516cab21b1f6"} Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.228403 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-gsfk7" event={"ID":"9507632c-3232-44dc-a75f-1275a2f57145","Type":"ContainerStarted","Data":"3a9d03f00ccfa0c8348c2c7ae6521c789c76fba3a3b7f1e5e9d14673f9cb31ff"} Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.228423 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-gsfk7" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.229529 4922 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-gsfk7 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.229575 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-gsfk7" podUID="9507632c-3232-44dc-a75f-1275a2f57145" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.247455 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-k5v9d" event={"ID":"1844c05f-b2a6-4abc-b4db-89223a5e6d60","Type":"ContainerStarted","Data":"e74ed099d20424082125e39c1d60a19252f30ee2280044f05ef3a28cab44e2ed"} Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.247499 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-k5v9d" 
event={"ID":"1844c05f-b2a6-4abc-b4db-89223a5e6d60","Type":"ContainerStarted","Data":"75216affe63d6c4b6fce2cb3cc856f667d2bff67721a0c5b05adb6f3cf93601a"} Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.249869 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfrpx\" (UniqueName: \"kubernetes.io/projected/89932301-e045-432f-9b1e-d88d5c420fdf-kube-api-access-kfrpx\") pod \"etcd-operator-b45778765-j2nbv\" (UID: \"89932301-e045-432f-9b1e-d88d5c420fdf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-j2nbv" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.257154 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p6ktw" event={"ID":"5289b0a6-6c1f-4fb4-972f-552899994896","Type":"ContainerStarted","Data":"309ba566b89ed05b288a91485cc387982748dbceadf2f114ad3839c9f3bd18c0"} Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.283179 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ndkq8" event={"ID":"e4cd7913-ec11-4aa6-9351-23821b3cfbcd","Type":"ContainerStarted","Data":"d38647a26d4223092aee753c72ad64aaf35527dc3f9234df6ab0059af6576435"} Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.283228 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ndkq8" event={"ID":"e4cd7913-ec11-4aa6-9351-23821b3cfbcd","Type":"ContainerStarted","Data":"d66c64bce3e78cde0c60875a7b78e66a546558687c660b62e8e90310dc6f354a"} Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.288025 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4zkk\" (UniqueName: \"kubernetes.io/projected/870b8608-cc44-4883-bc2f-a889dddf467e-kube-api-access-s4zkk\") pod \"service-ca-operator-777779d784-nxqd9\" (UID: \"870b8608-cc44-4883-bc2f-a889dddf467e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nxqd9" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.288324 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-56bkw" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.288349 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-skmr7" event={"ID":"b6ef9de0-f249-438a-94c0-0a359bd88889","Type":"ContainerStarted","Data":"d19f124020860710156a6dd40b3a9400d0ccfccd3a81cda73dd65cc7e29c0be4"} Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.290711 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-mwb28" event={"ID":"3a67fd92-1760-46af-a0b4-0a52c965c63e","Type":"ContainerStarted","Data":"cf1904f502b333fd2eac98cb576368f53e46e7615c7edb16e3e7c5f9f671844c"} Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.292422 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-7zzh7" event={"ID":"f267ec6b-64da-4065-b3aa-2e66ac957118","Type":"ContainerStarted","Data":"b3b933d0f6880606c77fc6b3d42b6472ba327884ff46f457cf096806d13440d3"} Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.294088 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-qqgzd" event={"ID":"9109383b-40f8-49d7-a601-1d048c4d8686","Type":"ContainerStarted","Data":"ed57a40eb3ca539cce69074efe2dda340d55009cb0e021edb748cafb48acd919"} Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.296152 4922 generic.go:334] "Generic (PLEG): container finished" podID="a96d295d-cf93-4a96-ac7d-cc85ed8221da" containerID="ddbcd01953c3ff76b0aa967db32dcb36f8ffb6cafacf1330cf870b89c0b6ab3f" exitCode=0 Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.296186 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-mmrpx" event={"ID":"a96d295d-cf93-4a96-ac7d-cc85ed8221da","Type":"ContainerDied","Data":"ddbcd01953c3ff76b0aa967db32dcb36f8ffb6cafacf1330cf870b89c0b6ab3f"} Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.296202 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-mmrpx" event={"ID":"a96d295d-cf93-4a96-ac7d-cc85ed8221da","Type":"ContainerStarted","Data":"d392b0ea25754ef9861b8fde34c91d953c80b7ad0ea4be3f037b012696cd24bb"} Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.304475 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmfbv\" (UniqueName: \"kubernetes.io/projected/70f825ae-ce20-4eb1-a7dc-1468a552f2d0-kube-api-access-wmfbv\") pod \"machine-config-operator-74547568cd-9jfnp\" (UID: \"70f825ae-ce20-4eb1-a7dc-1468a552f2d0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9jfnp" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.305977 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5gzxz" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.312929 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:10 crc kubenswrapper[4922]: E1122 02:55:10.313578 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:55:10.81356516 +0000 UTC m=+146.852087052 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5hhlh" (UID: "14638a05-727a-441a-88f2-f9750aa17a39") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.316911 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ksllc" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.320967 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxt6z\" (UniqueName: \"kubernetes.io/projected/1a9735e4-9fac-4075-afce-68bb38057c12-kube-api-access-pxt6z\") pod \"machine-config-server-s8lrv\" (UID: \"1a9735e4-9fac-4075-afce-68bb38057c12\") " pod="openshift-machine-config-operator/machine-config-server-s8lrv" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.326755 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d25rh" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.339920 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bbpq\" (UniqueName: \"kubernetes.io/projected/156920b9-f91f-4053-be05-3be9c55f09b1-kube-api-access-2bbpq\") pod \"marketplace-operator-79b997595-cm895\" (UID: \"156920b9-f91f-4053-be05-3be9c55f09b1\") " pod="openshift-marketplace/marketplace-operator-79b997595-cm895" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.346035 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-n88n6"] Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.346396 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-cm895" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.360908 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bfqj8" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.375790 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fd5c7d8c-ecfb-40af-990a-7bb86b045533-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-86q9x\" (UID: \"fd5c7d8c-ecfb-40af-990a-7bb86b045533\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-86q9x" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.412764 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlrzb\" (UniqueName: \"kubernetes.io/projected/ca284dd6-f710-44d3-aa30-a493af7ff84f-kube-api-access-wlrzb\") pod \"service-ca-9c57cc56f-gjqfl\" (UID: \"ca284dd6-f710-44d3-aa30-a493af7ff84f\") " pod="openshift-service-ca/service-ca-9c57cc56f-gjqfl" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.415008 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:55:10 crc kubenswrapper[4922]: E1122 02:55:10.415108 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:55:10.915092659 +0000 UTC m=+146.953614551 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.415575 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:10 crc kubenswrapper[4922]: E1122 02:55:10.415898 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:55:10.915887837 +0000 UTC m=+146.954409729 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5hhlh" (UID: "14638a05-727a-441a-88f2-f9750aa17a39") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.424172 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x76qk\" (UniqueName: \"kubernetes.io/projected/c9739869-8d63-4aaf-8f03-1605eed08ceb-kube-api-access-x76qk\") pod \"olm-operator-6b444d44fb-n89mb\" (UID: \"c9739869-8d63-4aaf-8f03-1605eed08ceb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n89mb" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.430436 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-j2nbv" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.431380 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vj5rb\" (UniqueName: \"kubernetes.io/projected/859cd128-2a8a-48e3-b3af-eb0147864b94-kube-api-access-vj5rb\") pod \"catalog-operator-68c6474976-9s7kz\" (UID: \"859cd128-2a8a-48e3-b3af-eb0147864b94\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s7kz" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.465269 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt25t\" (UniqueName: \"kubernetes.io/projected/2c02715b-5f00-464a-85e3-4df3043304d6-kube-api-access-pt25t\") pod \"router-default-5444994796-jb5md\" (UID: \"2c02715b-5f00-464a-85e3-4df3043304d6\") " pod="openshift-ingress/router-default-5444994796-jb5md" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.471744 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rzl9\" (UniqueName: \"kubernetes.io/projected/513c0c01-5777-4fb2-b36d-d35251f0c33e-kube-api-access-2rzl9\") pod \"multus-admission-controller-857f4d67dd-hjlwb\" (UID: \"513c0c01-5777-4fb2-b36d-d35251f0c33e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hjlwb" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.471780 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-dj4jp"] Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.482281 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-86q9x" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.489392 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdcj2\" (UniqueName: \"kubernetes.io/projected/2dc01606-1661-41db-9a31-949d68a1548e-kube-api-access-sdcj2\") pod \"package-server-manager-789f6589d5-4vw74\" (UID: \"2dc01606-1661-41db-9a31-949d68a1548e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4vw74" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.489498 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-jb5md" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.497223 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9jfnp" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.511738 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfrnc\" (UniqueName: \"kubernetes.io/projected/62cd80a8-6faf-48b7-bf44-3b181afd66c6-kube-api-access-gfrnc\") pod \"control-plane-machine-set-operator-78cbb6b69f-xlb7z\" (UID: \"62cd80a8-6faf-48b7-bf44-3b181afd66c6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xlb7z" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.512266 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n89mb" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.516377 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:55:10 crc kubenswrapper[4922]: E1122 02:55:10.516562 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:55:11.016533905 +0000 UTC m=+147.055055797 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.516656 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:10 crc kubenswrapper[4922]: E1122 02:55:10.517150 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:55:11.017138329 +0000 UTC m=+147.055660221 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5hhlh" (UID: "14638a05-727a-441a-88f2-f9750aa17a39") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.520299 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-hjlwb" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.527053 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nxqd9" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.529214 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwrlx\" (UniqueName: \"kubernetes.io/projected/936d6833-0d47-4fdd-957a-b85f6dcd186c-kube-api-access-kwrlx\") pod \"packageserver-d55dfcdfc-4hrnn\" (UID: \"936d6833-0d47-4fdd-957a-b85f6dcd186c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4hrnn" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.532789 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s7kz" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.538968 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-lvltf"] Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.548458 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrpk7\" (UniqueName: \"kubernetes.io/projected/25a50f57-bc62-4fc0-8410-fc80535adb55-kube-api-access-jrpk7\") pod \"machine-config-controller-84d6567774-n5wk8\" (UID: \"25a50f57-bc62-4fc0-8410-fc80535adb55\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n5wk8" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.549040 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-gjqfl" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.554491 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xlb7z" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.575890 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g44sg\" (UniqueName: \"kubernetes.io/projected/a907e262-3b15-4019-a715-c45b0e12ac27-kube-api-access-g44sg\") pod \"csi-hostpathplugin-wxkjb\" (UID: \"a907e262-3b15-4019-a715-c45b0e12ac27\") " pod="hostpath-provisioner/csi-hostpathplugin-wxkjb" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.588521 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhvx2\" (UniqueName: \"kubernetes.io/projected/4238f8cf-8e49-464b-9b6b-3f93b015747b-kube-api-access-xhvx2\") pod \"dns-default-7bmdv\" (UID: \"4238f8cf-8e49-464b-9b6b-3f93b015747b\") " pod="openshift-dns/dns-default-7bmdv" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.607083 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4hrnn" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.607154 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-wxkjb" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.612363 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-s8lrv" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.616244 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mvx2\" (UniqueName: \"kubernetes.io/projected/a5e7f9ae-8050-4602-b4f3-53a73ec0b60f-kube-api-access-2mvx2\") pod \"collect-profiles-29396325-vl7tv\" (UID: \"a5e7f9ae-8050-4602-b4f3-53a73ec0b60f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396325-vl7tv" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.618325 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:55:10 crc kubenswrapper[4922]: E1122 02:55:10.618910 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:55:11.118867201 +0000 UTC m=+147.157389083 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.626947 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-7bmdv" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.629869 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5gxb\" (UniqueName: \"kubernetes.io/projected/abb64422-1cc8-4858-a3fa-ca843be8687c-kube-api-access-d5gxb\") pod \"ingress-canary-4kqxp\" (UID: \"abb64422-1cc8-4858-a3fa-ca843be8687c\") " pod="openshift-ingress-canary/ingress-canary-4kqxp" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.646179 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vmg24"] Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.720647 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:10 crc kubenswrapper[4922]: E1122 02:55:10.721806 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:55:11.221777432 +0000 UTC m=+147.260299514 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5hhlh" (UID: "14638a05-727a-441a-88f2-f9750aa17a39") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:10 crc kubenswrapper[4922]: W1122 02:55:10.764990 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6aceb43_ebd5_4a06_b702_e8236bc7de2d.slice/crio-39e25fe84a8e33e18e29f8276d1bfe2337f5a0586232eeb30839db072d2ee397 WatchSource:0}: Error finding container 39e25fe84a8e33e18e29f8276d1bfe2337f5a0586232eeb30839db072d2ee397: Status 404 returned error can't find the container with id 39e25fe84a8e33e18e29f8276d1bfe2337f5a0586232eeb30839db072d2ee397 Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.774282 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4vw74" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.805972 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n5wk8" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.821293 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:55:10 crc kubenswrapper[4922]: E1122 02:55:10.821723 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:55:11.321707373 +0000 UTC m=+147.360229265 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.838962 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396325-vl7tv" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.891661 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7fgw4"] Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.893113 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7lpgh"] Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.918723 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4kqxp" Nov 22 02:55:10 crc kubenswrapper[4922]: I1122 02:55:10.924097 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:10 crc kubenswrapper[4922]: E1122 02:55:10.924416 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:55:11.424402798 +0000 UTC m=+147.462924680 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5hhlh" (UID: "14638a05-727a-441a-88f2-f9750aa17a39") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:10 crc kubenswrapper[4922]: W1122 02:55:10.965321 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc1629dd_242e_43e0_9aaa_8ed4e24cb8c7.slice/crio-109b5bc9cd6c73cbe3b62a418757c98372f90ccd31338f59f24d34d709da8302 WatchSource:0}: Error finding container 109b5bc9cd6c73cbe3b62a418757c98372f90ccd31338f59f24d34d709da8302: Status 404 returned error can't find the container with id 109b5bc9cd6c73cbe3b62a418757c98372f90ccd31338f59f24d34d709da8302 Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.014675 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5gzxz"] Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.024861 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:55:11 crc kubenswrapper[4922]: E1122 02:55:11.025247 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:55:11.52522886 +0000 UTC m=+147.563750752 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.036037 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-6rz2q"] Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.079442 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-56bkw"] Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.109011 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.109063 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.130765 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:11 crc kubenswrapper[4922]: E1122 02:55:11.131320 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:55:11.631300513 +0000 UTC m=+147.669822405 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5hhlh" (UID: "14638a05-727a-441a-88f2-f9750aa17a39") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:11 crc kubenswrapper[4922]: W1122 02:55:11.157225 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bd09df5_3f39_4aec_8ff9_35050ec873bd.slice/crio-ee199ac08488d498a16305c12742ed4b224a45585a713ca0b0b335467813d4cb WatchSource:0}: Error finding container ee199ac08488d498a16305c12742ed4b224a45585a713ca0b0b335467813d4cb: Status 404 returned error can't find the container with id ee199ac08488d498a16305c12742ed4b224a45585a713ca0b0b335467813d4cb Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.237729 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:55:11 crc kubenswrapper[4922]: E1122 02:55:11.237918 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:55:11.737900949 +0000 UTC m=+147.776422831 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.238339 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:11 crc kubenswrapper[4922]: E1122 02:55:11.238692 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:55:11.738684577 +0000 UTC m=+147.777206469 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5hhlh" (UID: "14638a05-727a-441a-88f2-f9750aa17a39") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.284340 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ksllc"] Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.343671 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:55:11 crc kubenswrapper[4922]: E1122 02:55:11.344453 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:55:11.844430063 +0000 UTC m=+147.882951945 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.378236 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p6ktw" event={"ID":"5289b0a6-6c1f-4fb4-972f-552899994896","Type":"ContainerStarted","Data":"11feb7cda18afa78d5075c3f8d01a5605f1a8cc6ff0b4cbee0f02d16053dd8b6"} Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.383390 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d25rh"] Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.395264 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-bfqj8"] Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.411320 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cm895"] Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.414782 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-skmr7" event={"ID":"b6ef9de0-f249-438a-94c0-0a359bd88889","Type":"ContainerStarted","Data":"ecc9da719984f5b32c97ae9644ef95b3f13a9fa899f77c47fab5fdf6f774332e"} Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.430921 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ndkq8" event={"ID":"e4cd7913-ec11-4aa6-9351-23821b3cfbcd","Type":"ContainerStarted","Data":"aa34e6922a31690d30a2fd1c9099e5cb01b01a3140ba58ef984ce149fb3628f8"} Nov 22 02:55:11 crc 
kubenswrapper[4922]: I1122 02:55:11.435523 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-dj4jp" event={"ID":"136fdcc5-9b23-442a-85e0-96129d4aed8a","Type":"ContainerStarted","Data":"42a8e111b5238b1a069678986c8df0f21c1376b9fbebf98ac927c5df71f28cb7"}
Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.435583 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-dj4jp" event={"ID":"136fdcc5-9b23-442a-85e0-96129d4aed8a","Type":"ContainerStarted","Data":"8e2d5811f57236b93e366312e677d44ccabdad91a99fa9faff20af0552392f53"}
Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.441915 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-7zzh7"
Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.442702 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vmg24" event={"ID":"c6aceb43-ebd5-4a06-b702-e8236bc7de2d","Type":"ContainerStarted","Data":"39e25fe84a8e33e18e29f8276d1bfe2337f5a0586232eeb30839db072d2ee397"}
Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.446826 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh"
Nov 22 02:55:11 crc kubenswrapper[4922]: E1122 02:55:11.447586 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:55:11.947571729 +0000 UTC m=+147.986093621 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5hhlh" (UID: "14638a05-727a-441a-88f2-f9750aa17a39") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.465931 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lns87" event={"ID":"c93e87aa-7ece-4e0d-9c17-e4bc0b5bf0a3","Type":"ContainerStarted","Data":"b94b34543a28f1f2f66accb69b05dac69693ae5ad9d82b756375da59ffa0748a"}
Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.470035 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5gzxz" event={"ID":"0bd09df5-3f39-4aec-8ff9-35050ec873bd","Type":"ContainerStarted","Data":"ee199ac08488d498a16305c12742ed4b224a45585a713ca0b0b335467813d4cb"}
Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.474011 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-jb5md" event={"ID":"2c02715b-5f00-464a-85e3-4df3043304d6","Type":"ContainerStarted","Data":"68626fb3b5a9fbfce3b879a4226cb59b57213b5f74aeac70232dc679a66589cd"}
Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.480188 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-n88n6" event={"ID":"0f5e24eb-19ec-4a6e-9b72-ded8f180b673","Type":"ContainerStarted","Data":"7554ed7fe771201c3fe16dd306d8055e984e5a861798c493b55c1d61486d85e9"}
Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.480276 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-n88n6" event={"ID":"0f5e24eb-19ec-4a6e-9b72-ded8f180b673","Type":"ContainerStarted","Data":"2317b1532cf8bff732bb2584be5317f8c6c96f64c995502925716f1f6847b436"}
Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.481131 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-n88n6"
Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.483029 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-56bkw" event={"ID":"564a21fe-1256-4854-b5c1-3d407a4e9aaf","Type":"ContainerStarted","Data":"f03718a4a768f47bd9c09c93aebd32145b98aabf4bd8b44d7038011cd4fe5d6f"}
Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.484899 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-6rz2q" event={"ID":"bc859e82-4e51-4cef-844a-17ada349c77a","Type":"ContainerStarted","Data":"afe9120883a4ca33ebc993e3cccdc5bc3e3a1dd6f385f1792115dcfc38cc78a8"}
Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.486713 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7fgw4" event={"ID":"771457a9-fee5-496f-88be-b6cea42cc92a","Type":"ContainerStarted","Data":"78ae00921c32f1a362f15cb020eb4ddcd77e26e976bed1379bb7859a4a795f88"}
Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.491998 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-jb5md"
Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.499397 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-mwb28" event={"ID":"3a67fd92-1760-46af-a0b4-0a52c965c63e","Type":"ContainerStarted","Data":"ca50ddac7dca9a11d0cb41088f8d044c4603c15150539911bdf994cd666c1cb8"}
Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.499589 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-mwb28"
Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.500643 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-s8lrv" event={"ID":"1a9735e4-9fac-4075-afce-68bb38057c12","Type":"ContainerStarted","Data":"875bd9cc716d72eb36d05d988d4c30cc34323da185232c345531578c9de272fd"}
Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.507927 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-qqgzd" event={"ID":"9109383b-40f8-49d7-a601-1d048c4d8686","Type":"ContainerStarted","Data":"885568730adfc3e22dec423953e0428b681346785099041bb57270c54728a215"}
Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.510722 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lvltf" event={"ID":"e5063ff0-adff-488b-9c11-595be817e952","Type":"ContainerStarted","Data":"e5e88ba2ba1e5e7fdb9a352d3885944ea2730e0bf8dcbbcc494ec84115657904"}
Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.510750 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lvltf" event={"ID":"e5063ff0-adff-488b-9c11-595be817e952","Type":"ContainerStarted","Data":"3ce643a419eb04d74a7c33ec5aefd991e0e574281785611995ae8ad21d1f1786"}
Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.512600 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7lpgh" event={"ID":"dc1629dd-242e-43e0-9aaa-8ed4e24cb8c7","Type":"ContainerStarted","Data":"109b5bc9cd6c73cbe3b62a418757c98372f90ccd31338f59f24d34d709da8302"}
Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.520284 4922 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-gsfk7 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body=
Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.520328 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-gsfk7" podUID="9507632c-3232-44dc-a75f-1275a2f57145" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused"
Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.524404 4922 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-7zzh7 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.8:6443/healthz\": dial tcp 10.217.0.8:6443: connect: connection refused" start-of-body=
Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.524437 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-7zzh7" podUID="f267ec6b-64da-4065-b3aa-2e66ac957118" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.8:6443/healthz\": dial tcp 10.217.0.8:6443: connect: connection refused"
Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.524459 4922 patch_prober.go:28] interesting pod/console-operator-58897d9998-mwb28 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body=
Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.524529 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-mwb28" podUID="3a67fd92-1760-46af-a0b4-0a52c965c63e" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused"
Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.534301 4922 patch_prober.go:28] interesting pod/downloads-7954f5f757-n88n6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body=
Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.534351 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-n88n6" podUID="0f5e24eb-19ec-4a6e-9b72-ded8f180b673" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused"
Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.547411 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.550385 4922 patch_prober.go:28] interesting pod/router-default-5444994796-jb5md container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body=
Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.550441 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jb5md" podUID="2c02715b-5f00-464a-85e3-4df3043304d6" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused"
Nov 22 02:55:11 crc kubenswrapper[4922]: E1122 02:55:11.551338 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:55:12.051320678 +0000 UTC m=+148.089842570 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.650929 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh"
Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.651897 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-86q9x"]
Nov 22 02:55:11 crc kubenswrapper[4922]: E1122 02:55:11.652385 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:55:12.152369145 +0000 UTC m=+148.190891147 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5hhlh" (UID: "14638a05-727a-441a-88f2-f9750aa17a39") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.661661 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-hjlwb"]
Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.668723 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xlb7z"]
Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.696732 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-gsfk7" podStartSLOduration=123.696710401 podStartE2EDuration="2m3.696710401s" podCreationTimestamp="2025-11-22 02:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:55:11.693696531 +0000 UTC m=+147.732218423" watchObservedRunningTime="2025-11-22 02:55:11.696710401 +0000 UTC m=+147.735232313"
Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.738459 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-k5v9d" podStartSLOduration=123.738439406 podStartE2EDuration="2m3.738439406s" podCreationTimestamp="2025-11-22 02:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:55:11.735677962 +0000 UTC m=+147.774199854" watchObservedRunningTime="2025-11-22 02:55:11.738439406 +0000 UTC m=+147.776961298"
Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.752612 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 02:55:11 crc kubenswrapper[4922]: E1122 02:55:11.752790 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:55:12.252757907 +0000 UTC m=+148.291279799 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.753181 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh"
Nov 22 02:55:11 crc kubenswrapper[4922]: E1122 02:55:11.753479 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:55:12.253465764 +0000 UTC m=+148.291987656 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5hhlh" (UID: "14638a05-727a-441a-88f2-f9750aa17a39") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.857811 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 02:55:11 crc kubenswrapper[4922]: E1122 02:55:11.858673 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:55:12.358645567 +0000 UTC m=+148.397167459 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 02:55:11 crc kubenswrapper[4922]: W1122 02:55:11.868921 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod513c0c01_5777_4fb2_b36d_d35251f0c33e.slice/crio-d7f20234f8359598cef040f3504e2093988b94d93759af5dc605771bd8644322 WatchSource:0}: Error finding container d7f20234f8359598cef040f3504e2093988b94d93759af5dc605771bd8644322: Status 404 returned error can't find the container with id d7f20234f8359598cef040f3504e2093988b94d93759af5dc605771bd8644322
Nov 22 02:55:11 crc kubenswrapper[4922]: W1122 02:55:11.873094 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62cd80a8_6faf_48b7_bf44_3b181afd66c6.slice/crio-474a0509c11f64db9ef3d6fdc8eb8630378b2a4202cff914c7e67ea3cf5b22c3 WatchSource:0}: Error finding container 474a0509c11f64db9ef3d6fdc8eb8630378b2a4202cff914c7e67ea3cf5b22c3: Status 404 returned error can't find the container with id 474a0509c11f64db9ef3d6fdc8eb8630378b2a4202cff914c7e67ea3cf5b22c3
Nov 22 02:55:11 crc kubenswrapper[4922]: I1122 02:55:11.959551 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh"
Nov 22 02:55:11 crc kubenswrapper[4922]: E1122 02:55:11.959900 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:55:12.459889008 +0000 UTC m=+148.498410900 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5hhlh" (UID: "14638a05-727a-441a-88f2-f9750aa17a39") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.063203 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 02:55:12 crc kubenswrapper[4922]: E1122 02:55:12.064063 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:55:12.564035606 +0000 UTC m=+148.602557498 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.064165 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh"
Nov 22 02:55:12 crc kubenswrapper[4922]: E1122 02:55:12.064788 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:55:12.564780394 +0000 UTC m=+148.603302286 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5hhlh" (UID: "14638a05-727a-441a-88f2-f9750aa17a39") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.120642 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s7kz"]
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.152215 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-9jfnp"]
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.163389 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-wxkjb"]
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.170522 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n89mb"]
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.171037 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 02:55:12 crc kubenswrapper[4922]: E1122 02:55:12.171444 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:55:12.671429691 +0000 UTC m=+148.709951583 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.187222 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4hrnn"]
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.197835 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-nxqd9"]
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.210038 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7bmdv"]
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.212039 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-j2nbv"]
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.272293 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh"
Nov 22 02:55:12 crc kubenswrapper[4922]: E1122 02:55:12.273278 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:55:12.773263736 +0000 UTC m=+148.811785628 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5hhlh" (UID: "14638a05-727a-441a-88f2-f9750aa17a39") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.282121 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ndkq8" podStartSLOduration=124.28210628 podStartE2EDuration="2m4.28210628s" podCreationTimestamp="2025-11-22 02:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:55:12.281235101 +0000 UTC m=+148.319756993" watchObservedRunningTime="2025-11-22 02:55:12.28210628 +0000 UTC m=+148.320628172"
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.282729 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-skmr7" podStartSLOduration=124.282724585 podStartE2EDuration="2m4.282724585s" podCreationTimestamp="2025-11-22 02:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:55:12.22197679 +0000 UTC m=+148.260498682" watchObservedRunningTime="2025-11-22 02:55:12.282724585 +0000 UTC m=+148.321246477"
Nov 22 02:55:12 crc kubenswrapper[4922]: W1122 02:55:12.309611 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70f825ae_ce20_4eb1_a7dc_1468a552f2d0.slice/crio-ddb594cd24cecf1b4b7f20b05943901a5b62a7e5b4fa51be81de17e446449f43 WatchSource:0}: Error finding container ddb594cd24cecf1b4b7f20b05943901a5b62a7e5b4fa51be81de17e446449f43: Status 404 returned error can't find the container with id ddb594cd24cecf1b4b7f20b05943901a5b62a7e5b4fa51be81de17e446449f43
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.310599 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-gjqfl"]
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.330074 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-jb5md" podStartSLOduration=124.330055239 podStartE2EDuration="2m4.330055239s" podCreationTimestamp="2025-11-22 02:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:55:12.329082387 +0000 UTC m=+148.367604279" watchObservedRunningTime="2025-11-22 02:55:12.330055239 +0000 UTC m=+148.368577131"
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.363853 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4vw74"]
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.373404 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 02:55:12 crc kubenswrapper[4922]: E1122 02:55:12.374050 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:55:12.874028537 +0000 UTC m=+148.912550429 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.385648 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-n5wk8"]
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.421482 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-7zzh7" podStartSLOduration=124.421461724 podStartE2EDuration="2m4.421461724s" podCreationTimestamp="2025-11-22 02:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:55:12.408823822 +0000 UTC m=+148.447345714" watchObservedRunningTime="2025-11-22 02:55:12.421461724 +0000 UTC m=+148.459983616"
Nov 22 02:55:12 crc kubenswrapper[4922]: W1122 02:55:12.458649 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4238f8cf_8e49_464b_9b6b_3f93b015747b.slice/crio-07a90288104a4ddfbb8b1b33bc2c2ece976460fa97947edae89c3715140d48c7 WatchSource:0}: Error finding container 07a90288104a4ddfbb8b1b33bc2c2ece976460fa97947edae89c3715140d48c7: Status 404 returned error can't find the container with id 07a90288104a4ddfbb8b1b33bc2c2ece976460fa97947edae89c3715140d48c7
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.475939 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh"
Nov 22 02:55:12 crc kubenswrapper[4922]: E1122 02:55:12.476345 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:55:12.976326003 +0000 UTC m=+149.014847895 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5hhlh" (UID: "14638a05-727a-441a-88f2-f9750aa17a39") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.499723 4922 patch_prober.go:28] interesting pod/router-default-5444994796-jb5md container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Nov 22 02:55:12 crc kubenswrapper[4922]: [-]has-synced failed: reason withheld
Nov 22 02:55:12 crc kubenswrapper[4922]: [+]process-running ok
Nov 22 02:55:12 crc kubenswrapper[4922]: healthz check failed
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.499801 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jb5md" podUID="2c02715b-5f00-464a-85e3-4df3043304d6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.506259 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4kqxp"]
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.513720 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396325-vl7tv"]
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.519993 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-n88n6" podStartSLOduration=124.519951582 podStartE2EDuration="2m4.519951582s" podCreationTimestamp="2025-11-22 02:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:55:12.4580515 +0000 UTC m=+148.496573392" watchObservedRunningTime="2025-11-22 02:55:12.519951582 +0000 UTC m=+148.558473474"
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.544337 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-mwb28" podStartSLOduration=124.544305265 podStartE2EDuration="2m4.544305265s" podCreationTimestamp="2025-11-22 02:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:55:12.522526091 +0000 UTC m=+148.561047983" watchObservedRunningTime="2025-11-22 02:55:12.544305265 +0000 UTC m=+148.582827157"
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.563666 4922 generic.go:334] "Generic (PLEG): container finished" podID="564a21fe-1256-4854-b5c1-3d407a4e9aaf" containerID="431bedf6d972197d3af5edb19bae38c352382a59b160971d5ea7a4ae5eae9f8f" exitCode=0
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.564374 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-56bkw" event={"ID":"564a21fe-1256-4854-b5c1-3d407a4e9aaf","Type":"ContainerDied","Data":"431bedf6d972197d3af5edb19bae38c352382a59b160971d5ea7a4ae5eae9f8f"}
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.566901 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-qqgzd" podStartSLOduration=124.566882918 podStartE2EDuration="2m4.566882918s" podCreationTimestamp="2025-11-22 02:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:55:12.56005318 +0000 UTC m=+148.598575072" watchObservedRunningTime="2025-11-22 02:55:12.566882918 +0000 UTC m=+148.605404800"
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.576950 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 02:55:12 crc kubenswrapper[4922]: E1122 02:55:12.577419 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:55:13.0773855 +0000 UTC m=+149.115907402 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.586314 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-6rz2q" event={"ID":"bc859e82-4e51-4cef-844a-17ada349c77a","Type":"ContainerStarted","Data":"de48746455b0ff113524fb575e1212fc046c55937f69da91f790f80b918858bc"}
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.596550 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wxkjb" event={"ID":"a907e262-3b15-4019-a715-c45b0e12ac27","Type":"ContainerStarted","Data":"770cea73fc1038afe4b1ac634df576ac8f81ab1c45561fe881d8d32c854fc42b"}
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.602723 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-dj4jp" podStartSLOduration=124.602698975 podStartE2EDuration="2m4.602698975s" podCreationTimestamp="2025-11-22 02:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:55:12.601909178 +0000 UTC m=+148.640431080" watchObservedRunningTime="2025-11-22 02:55:12.602698975 +0000 UTC m=+148.641220867"
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.629703 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nxqd9" event={"ID":"870b8608-cc44-4883-bc2f-a889dddf467e","Type":"ContainerStarted","Data":"de22cb628d35eedfa10e8447a89dca700ffa83a24ef667fd5eafc35c59eff421"}
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.635799 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-7zzh7" event={"ID":"f267ec6b-64da-4065-b3aa-2e66ac957118","Type":"ContainerStarted","Data":"2c9264bdaf83279110abdad498554b08d4b6c98fb291e930780fef392749b6e5"}
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.659787 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vmg24" event={"ID":"c6aceb43-ebd5-4a06-b702-e8236bc7de2d","Type":"ContainerStarted","Data":"d0c6e3791d4faf0dd83126b3c722ee92b885c5167e1ce437bc2d1e821ea5be52"}
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.687246 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xlb7z" event={"ID":"62cd80a8-6faf-48b7-bf44-3b181afd66c6","Type":"ContainerStarted","Data":"474a0509c11f64db9ef3d6fdc8eb8630378b2a4202cff914c7e67ea3cf5b22c3"}
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.687298 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh"
Nov 22 02:55:12 crc kubenswrapper[4922]: E1122 02:55:12.688394 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:55:13.188376817 +0000 UTC m=+149.226898909 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5hhlh" (UID: "14638a05-727a-441a-88f2-f9750aa17a39") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.716332 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-mmrpx" event={"ID":"a96d295d-cf93-4a96-ac7d-cc85ed8221da","Type":"ContainerStarted","Data":"97895f936a7334402e0c654a8634df4f04fc5f6baf0bd5b9d48fcf1263587167"}
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.716373 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-mmrpx" event={"ID":"a96d295d-cf93-4a96-ac7d-cc85ed8221da","Type":"ContainerStarted","Data":"22d7da896c0071d02a2071e6533c5fb2ca42cf0d1ca1a2f071818902164af22b"}
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.738211 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9jfnp" event={"ID":"70f825ae-ce20-4eb1-a7dc-1468a552f2d0","Type":"ContainerStarted","Data":"ddb594cd24cecf1b4b7f20b05943901a5b62a7e5b4fa51be81de17e446449f43"}
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.762953 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-jb5md" event={"ID":"2c02715b-5f00-464a-85e3-4df3043304d6","Type":"ContainerStarted","Data":"506492de99bfd0b3ee24395451aee790c88e90bcd00208775f09e4bd0ed722bf"}
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.793648 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 02:55:12 crc kubenswrapper[4922]: E1122 02:55:12.794096 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:55:13.294065072 +0000 UTC m=+149.332586964 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.794411 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh"
Nov 22 02:55:12 crc kubenswrapper[4922]: E1122 02:55:12.796439 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:55:13.296420146 +0000 UTC m=+149.334942038 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5hhlh" (UID: "14638a05-727a-441a-88f2-f9750aa17a39") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.798199 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n5wk8" event={"ID":"25a50f57-bc62-4fc0-8410-fc80535adb55","Type":"ContainerStarted","Data":"c4391a8d87701783b870a5a7ac197523c6f22961bcbc4f999cf90277f556d0fc"}
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.810022 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-j2nbv" event={"ID":"89932301-e045-432f-9b1e-d88d5c420fdf","Type":"ContainerStarted","Data":"dcaf12ec6991699755e490a7b87c03075e870bf7981faf27ffc34efb5de5776c"}
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.846777 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ksllc" event={"ID":"151b3d80-db1e-4af2-aa89-b38d9cfe8bea","Type":"ContainerStarted","Data":"84459f42aba76e51fd9522465254ff5e0b701eb8278c825585d2ee3189401e89"}
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.846874 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ksllc" event={"ID":"151b3d80-db1e-4af2-aa89-b38d9cfe8bea","Type":"ContainerStarted","Data":"d77a2a86629edb810209eef0d767251a7b24ea6e2e2d2ee86327bd96b13e921d"}
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.847181 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ksllc"
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.857504 4922 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-ksllc container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body=
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.857567 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ksllc" podUID="151b3d80-db1e-4af2-aa89-b38d9cfe8bea" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused"
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.873738 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5gzxz" podStartSLOduration=124.873716214 podStartE2EDuration="2m4.873716214s" podCreationTimestamp="2025-11-22 02:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:55:12.872565667 +0000 UTC m=+148.911087559" watchObservedRunningTime="2025-11-22 02:55:12.873716214 +0000 UTC m=+148.912238106"
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.874889 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vmg24" podStartSLOduration=124.87488155 podStartE2EDuration="2m4.87488155s" podCreationTimestamp="2025-11-22 02:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:55:12.837176059 +0000 UTC m=+148.875697971" watchObservedRunningTime="2025-11-22 02:55:12.87488155 +0000 UTC m=+148.913403442"
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.878147 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7fgw4" event={"ID":"771457a9-fee5-496f-88be-b6cea42cc92a","Type":"ContainerStarted","Data":"8a6b43c2064f561b3828a4cad217d97f6354d522fc229977010adbb3b03cd6e3"}
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.902795 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 02:55:12 crc kubenswrapper[4922]: E1122 02:55:12.903056 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:55:13.403032302 +0000 UTC m=+149.441554274 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.903227 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh"
Nov 22 02:55:12 crc kubenswrapper[4922]: E1122 02:55:12.904326 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:55:13.404308362 +0000 UTC m=+149.442830254 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5hhlh" (UID: "14638a05-727a-441a-88f2-f9750aa17a39") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.929251 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lns87" event={"ID":"c93e87aa-7ece-4e0d-9c17-e4bc0b5bf0a3","Type":"ContainerStarted","Data":"2c0c1bef6f7296c918e34601d6660e0ed5933fc1c94d7b60a86731a46d805163"}
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.930073 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-mmrpx" podStartSLOduration=124.930054827 podStartE2EDuration="2m4.930054827s" podCreationTimestamp="2025-11-22 02:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:55:12.929114255 +0000 UTC m=+148.967636147" watchObservedRunningTime="2025-11-22 02:55:12.930054827 +0000 UTC m=+148.968576719"
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.972545 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p6ktw" event={"ID":"5289b0a6-6c1f-4fb4-972f-552899994896","Type":"ContainerStarted","Data":"a4002565388d08c53141731e399d358b9c151c2aee55e813df99f344a5fc1a24"}
Nov 22 02:55:12 crc kubenswrapper[4922]: I1122 02:55:12.991116 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7bmdv" event={"ID":"4238f8cf-8e49-464b-9b6b-3f93b015747b","Type":"ContainerStarted","Data":"07a90288104a4ddfbb8b1b33bc2c2ece976460fa97947edae89c3715140d48c7"}
Nov 22 02:55:13 crc kubenswrapper[4922]: I1122 02:55:13.011433 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 02:55:13 crc kubenswrapper[4922]: E1122 02:55:13.013182 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:55:13.513148209 +0000 UTC m=+149.551670101 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 02:55:13 crc kubenswrapper[4922]: I1122 02:55:13.019805 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bfqj8" event={"ID":"e84c0c00-7fd0-44eb-b719-57fca6572497","Type":"ContainerStarted","Data":"6d865cfa26db8c1d057be118490e01b43b5c2fe79edf633bd1d22ccb56bac27a"}
Nov 22 02:55:13 crc kubenswrapper[4922]: I1122 02:55:13.036943 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7fgw4" podStartSLOduration=125.036920098 podStartE2EDuration="2m5.036920098s" podCreationTimestamp="2025-11-22 02:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:55:12.97082945 +0000 UTC m=+149.009351342" watchObservedRunningTime="2025-11-22 02:55:13.036920098 +0000 UTC m=+149.075441980"
Nov 22 02:55:13 crc kubenswrapper[4922]: I1122 02:55:13.038033 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ksllc" podStartSLOduration=125.038025044 podStartE2EDuration="2m5.038025044s" podCreationTimestamp="2025-11-22 02:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:55:13.035290671 +0000 UTC m=+149.073812563" watchObservedRunningTime="2025-11-22 02:55:13.038025044 +0000 UTC m=+149.076546936"
Nov 22 02:55:13 crc kubenswrapper[4922]: I1122 02:55:13.052418 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-86q9x" event={"ID":"fd5c7d8c-ecfb-40af-990a-7bb86b045533","Type":"ContainerStarted","Data":"68708d8454b968621ba9d12cd9d7a01f5e16fa6b2abecb699af87e30bd5d1bc1"}
Nov 22 02:55:13 crc kubenswrapper[4922]: I1122 02:55:13.059659 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lns87" podStartSLOduration=125.059638984 podStartE2EDuration="2m5.059638984s" podCreationTimestamp="2025-11-22 02:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:55:13.059121562 +0000 UTC m=+149.097643454" watchObservedRunningTime="2025-11-22 02:55:13.059638984 +0000 UTC m=+149.098160876"
Nov 22 02:55:13 crc kubenswrapper[4922]: I1122 02:55:13.070813 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4hrnn" event={"ID":"936d6833-0d47-4fdd-957a-b85f6dcd186c","Type":"ContainerStarted","Data":"d7d057dad4118fa2c8006acb86301db3b6b94f15024086a5986bdce257ef20df"}
Nov 22 02:55:13 crc kubenswrapper[4922]: I1122 02:55:13.098914 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-p6ktw" podStartSLOduration=125.098891532 podStartE2EDuration="2m5.098891532s" podCreationTimestamp="2025-11-22 02:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:55:13.096994738 +0000 UTC m=+149.135516630" watchObservedRunningTime="2025-11-22 02:55:13.098891532 +0000 UTC m=+149.137413424"
Nov 22 02:55:13 crc kubenswrapper[4922]: I1122 02:55:13.101950 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-qqgzd" event={"ID":"9109383b-40f8-49d7-a601-1d048c4d8686","Type":"ContainerStarted","Data":"9b23449cda1d75a348c9928e55cf5e508d486edc75a7b193f6bc61990f72ceb2"}
Nov 22 02:55:13 crc kubenswrapper[4922]: I1122 02:55:13.113071 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh"
Nov 22 02:55:13 crc kubenswrapper[4922]: E1122 02:55:13.115393 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:55:13.615381863 +0000 UTC m=+149.653903755 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5hhlh" (UID: "14638a05-727a-441a-88f2-f9750aa17a39") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 02:55:13 crc kubenswrapper[4922]: I1122 02:55:13.140100 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-gjqfl" event={"ID":"ca284dd6-f710-44d3-aa30-a493af7ff84f","Type":"ContainerStarted","Data":"167fe2af282b60d4e0f37b505bed84ce99ca4fdfb465d724372dab86dc18e1dd"}
Nov 22 02:55:13 crc kubenswrapper[4922]: I1122 02:55:13.190941 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-hjlwb" event={"ID":"513c0c01-5777-4fb2-b36d-d35251f0c33e","Type":"ContainerStarted","Data":"d7f20234f8359598cef040f3504e2093988b94d93759af5dc605771bd8644322"}
Nov 22 02:55:13 crc kubenswrapper[4922]: I1122 02:55:13.203698 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-86q9x" podStartSLOduration=125.203673845 podStartE2EDuration="2m5.203673845s" podCreationTimestamp="2025-11-22 02:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:55:13.192913007 +0000 UTC m=+149.231434899" watchObservedRunningTime="2025-11-22 02:55:13.203673845 +0000 UTC m=+149.242195737"
Nov 22 02:55:13 crc kubenswrapper[4922]: I1122 02:55:13.204295 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n89mb" event={"ID":"c9739869-8d63-4aaf-8f03-1605eed08ceb","Type":"ContainerStarted","Data":"f398ca63310528a49926fe99115a392d7e771f60f45227610963977a55b9af06"}
Nov 22 02:55:13 crc kubenswrapper[4922]: I1122 02:55:13.213977 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 02:55:13 crc kubenswrapper[4922]: E1122 02:55:13.215439 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:55:13.715420787 +0000 UTC m=+149.753942679 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 02:55:13 crc kubenswrapper[4922]: I1122 02:55:13.231302 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-cm895" event={"ID":"156920b9-f91f-4053-be05-3be9c55f09b1","Type":"ContainerStarted","Data":"2d4ff02e389d8c499c92a70af4d463ed2adc19aabea98fdde7ad99fa15f8b7f6"}
Nov 22 02:55:13 crc kubenswrapper[4922]: I1122 02:55:13.232421 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-cm895"
Nov 22 02:55:13 crc kubenswrapper[4922]: I1122 02:55:13.245264 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s7kz" event={"ID":"859cd128-2a8a-48e3-b3af-eb0147864b94","Type":"ContainerStarted","Data":"352be757b7e9bece879578832e2ccb790db68b934cb7b6eb730b10051e335e78"}
Nov 22 02:55:13 crc kubenswrapper[4922]: I1122 02:55:13.245890 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s7kz"
Nov 22 02:55:13 crc kubenswrapper[4922]: I1122 02:55:13.247507 4922 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-cm895 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body=
Nov 22 02:55:13 crc kubenswrapper[4922]: I1122 02:55:13.247538 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-cm895" podUID="156920b9-f91f-4053-be05-3be9c55f09b1" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused"
Nov 22 02:55:13 crc kubenswrapper[4922]: I1122 02:55:13.261061 4922 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-9s7kz container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443:
connect: connection refused" start-of-body= Nov 22 02:55:13 crc kubenswrapper[4922]: I1122 02:55:13.261149 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s7kz" podUID="859cd128-2a8a-48e3-b3af-eb0147864b94" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" Nov 22 02:55:13 crc kubenswrapper[4922]: I1122 02:55:13.261336 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-7zzh7" Nov 22 02:55:13 crc kubenswrapper[4922]: I1122 02:55:13.280031 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d25rh" event={"ID":"cf220517-af5c-4186-9439-585c793c64e3","Type":"ContainerStarted","Data":"be31c7e6900748fc50498195d26a0360aa771e46890cc0fedb4dba6b815a0acd"} Nov 22 02:55:13 crc kubenswrapper[4922]: I1122 02:55:13.280257 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d25rh" event={"ID":"cf220517-af5c-4186-9439-585c793c64e3","Type":"ContainerStarted","Data":"3974be0bbb15bcc4d9ce87226ea6925dd2831478f2143b9e675667e224446a83"} Nov 22 02:55:13 crc kubenswrapper[4922]: I1122 02:55:13.327712 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-cm895" podStartSLOduration=125.327686314 podStartE2EDuration="2m5.327686314s" podCreationTimestamp="2025-11-22 02:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:55:13.298293434 +0000 UTC m=+149.336815326" watchObservedRunningTime="2025-11-22 02:55:13.327686314 +0000 UTC m=+149.366208196" Nov 22 02:55:13 crc kubenswrapper[4922]: I1122 02:55:13.331830 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lvltf" event={"ID":"e5063ff0-adff-488b-9c11-595be817e952","Type":"ContainerStarted","Data":"6220205f9929037b3b2687a83155ca06e838734956fac1953ad963d4165fb653"} Nov 22 02:55:13 crc kubenswrapper[4922]: I1122 02:55:13.334186 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:13 crc kubenswrapper[4922]: E1122 02:55:13.365569 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:55:13.865537089 +0000 UTC m=+149.904058981 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5hhlh" (UID: "14638a05-727a-441a-88f2-f9750aa17a39") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:13 crc kubenswrapper[4922]: I1122 02:55:13.379088 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d25rh" podStartSLOduration=125.379067033 podStartE2EDuration="2m5.379067033s" podCreationTimestamp="2025-11-22 02:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:55:13.37854986 +0000 UTC m=+149.417071752" watchObservedRunningTime="2025-11-22 02:55:13.379067033 +0000 UTC m=+149.417588925" Nov 22 02:55:13 crc kubenswrapper[4922]: I1122 02:55:13.435161 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7lpgh" event={"ID":"dc1629dd-242e-43e0-9aaa-8ed4e24cb8c7","Type":"ContainerStarted","Data":"15be3f1026400e727c163afc3791488d8a77a327924d172a3329b4b302a6e1b2"} Nov 22 02:55:13 crc kubenswrapper[4922]: I1122 02:55:13.439680 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:55:13 crc kubenswrapper[4922]: E1122 02:55:13.439735 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:55:13.939721045 +0000 UTC m=+149.978242927 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:13 crc kubenswrapper[4922]: I1122 02:55:13.449666 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:13 crc kubenswrapper[4922]: E1122 02:55:13.451147 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:55:13.951124359 +0000 UTC m=+149.989646251 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5hhlh" (UID: "14638a05-727a-441a-88f2-f9750aa17a39") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:13 crc kubenswrapper[4922]: I1122 02:55:13.466234 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s7kz" podStartSLOduration=125.466213188 podStartE2EDuration="2m5.466213188s" podCreationTimestamp="2025-11-22 02:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:55:13.464486928 +0000 UTC m=+149.503008820" watchObservedRunningTime="2025-11-22 02:55:13.466213188 +0000 UTC m=+149.504735070" Nov 22 02:55:13 crc kubenswrapper[4922]: I1122 02:55:13.490276 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-s8lrv" event={"ID":"1a9735e4-9fac-4075-afce-68bb38057c12","Type":"ContainerStarted","Data":"e7a7fa02a9ec6ef9e2acf87acb2cba0c835890ef61f7f09dfecc1c428f811f25"} Nov 22 02:55:13 crc kubenswrapper[4922]: I1122 02:55:13.502327 4922 patch_prober.go:28] interesting pod/router-default-5444994796-jb5md container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 02:55:13 crc kubenswrapper[4922]: [-]has-synced failed: reason withheld Nov 22 02:55:13 crc kubenswrapper[4922]: [+]process-running ok Nov 22 02:55:13 crc kubenswrapper[4922]: healthz check failed Nov 22 02:55:13 crc kubenswrapper[4922]: I1122 02:55:13.502391 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jb5md" podUID="2c02715b-5f00-464a-85e3-4df3043304d6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 02:55:13 crc kubenswrapper[4922]: I1122 02:55:13.512260 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7lpgh" podStartSLOduration=125.512234012 podStartE2EDuration="2m5.512234012s" podCreationTimestamp="2025-11-22 02:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:55:13.510045741 +0000 UTC m=+149.548567643" watchObservedRunningTime="2025-11-22 02:55:13.512234012 +0000 UTC m=+149.550755904" Nov 22 02:55:13 crc kubenswrapper[4922]: I1122 02:55:13.520932 4922 patch_prober.go:28] interesting pod/downloads-7954f5f757-n88n6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Nov 22 02:55:13 crc kubenswrapper[4922]: I1122 02:55:13.521009 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-n88n6" podUID="0f5e24eb-19ec-4a6e-9b72-ded8f180b673" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" Nov 22 02:55:13 crc 
kubenswrapper[4922]: I1122 02:55:13.521110 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4vw74" event={"ID":"2dc01606-1661-41db-9a31-949d68a1548e","Type":"ContainerStarted","Data":"5b2a3f5250d0f468354c026931d3c46a8db7d89685d43883658e4c5f51892afc"} Nov 22 02:55:13 crc kubenswrapper[4922]: I1122 02:55:13.554382 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:55:13 crc kubenswrapper[4922]: E1122 02:55:13.556206 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:55:14.056188509 +0000 UTC m=+150.094710401 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:13 crc kubenswrapper[4922]: I1122 02:55:13.564692 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lvltf" podStartSLOduration=125.564668385 podStartE2EDuration="2m5.564668385s" podCreationTimestamp="2025-11-22 02:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:55:13.562669339 +0000 UTC m=+149.601191231" watchObservedRunningTime="2025-11-22 02:55:13.564668385 +0000 UTC m=+149.603190277" Nov 22 02:55:13 crc kubenswrapper[4922]: I1122 02:55:13.572590 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-mwb28" Nov 22 02:55:13 crc kubenswrapper[4922]: I1122 02:55:13.608629 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-s8lrv" podStartSLOduration=6.608599831 podStartE2EDuration="6.608599831s" podCreationTimestamp="2025-11-22 02:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:55:13.59431349 +0000 UTC m=+149.632835412" watchObservedRunningTime="2025-11-22 02:55:13.608599831 +0000 UTC m=+149.647121723" Nov 22 02:55:13 crc kubenswrapper[4922]: I1122 02:55:13.660573 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:13 crc kubenswrapper[4922]: E1122 02:55:13.663697 4922 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:55:14.163647184 +0000 UTC m=+150.202169076 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5hhlh" (UID: "14638a05-727a-441a-88f2-f9750aa17a39") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:13 crc kubenswrapper[4922]: I1122 02:55:13.764062 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:55:13 crc kubenswrapper[4922]: E1122 02:55:13.764472 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:55:14.264449536 +0000 UTC m=+150.302971438 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:13 crc kubenswrapper[4922]: I1122 02:55:13.872962 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:13 crc kubenswrapper[4922]: E1122 02:55:13.873421 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:55:14.373405206 +0000 UTC m=+150.411927098 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5hhlh" (UID: "14638a05-727a-441a-88f2-f9750aa17a39") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:13 crc kubenswrapper[4922]: I1122 02:55:13.974018 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:55:13 crc kubenswrapper[4922]: E1122 02:55:13.974244 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:55:14.474197547 +0000 UTC m=+150.512719439 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:13 crc kubenswrapper[4922]: I1122 02:55:13.974728 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:13 crc kubenswrapper[4922]: E1122 02:55:13.975147 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:55:14.475139038 +0000 UTC m=+150.513660930 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5hhlh" (UID: "14638a05-727a-441a-88f2-f9750aa17a39") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.088590 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:55:14 crc kubenswrapper[4922]: E1122 02:55:14.092078 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:55:14.592047673 +0000 UTC m=+150.630569565 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.194238 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:14 crc kubenswrapper[4922]: E1122 02:55:14.195086 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:55:14.695070335 +0000 UTC m=+150.733592227 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5hhlh" (UID: "14638a05-727a-441a-88f2-f9750aa17a39") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.295750 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:55:14 crc kubenswrapper[4922]: E1122 02:55:14.296269 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:55:14.796244856 +0000 UTC m=+150.834766748 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.397544 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:14 crc kubenswrapper[4922]: E1122 02:55:14.398130 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:55:14.898109511 +0000 UTC m=+150.936631393 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5hhlh" (UID: "14638a05-727a-441a-88f2-f9750aa17a39") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.498373 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:55:14 crc kubenswrapper[4922]: E1122 02:55:14.498838 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:55:14.998821531 +0000 UTC m=+151.037343423 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.499073 4922 patch_prober.go:28] interesting pod/router-default-5444994796-jb5md container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 02:55:14 crc kubenswrapper[4922]: [-]has-synced failed: reason withheld Nov 22 02:55:14 crc kubenswrapper[4922]: [+]process-running ok Nov 22 02:55:14 crc kubenswrapper[4922]: healthz check failed Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.499107 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jb5md" podUID="2c02715b-5f00-464a-85e3-4df3043304d6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.504745 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-mmrpx" Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.505190 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-mmrpx" Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.541463 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396325-vl7tv" event={"ID":"a5e7f9ae-8050-4602-b4f3-53a73ec0b60f","Type":"ContainerStarted","Data":"024beac7b23d601c940bf58c9f3abab9862b801d829aa9988305d392b2459708"} Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.541527 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396325-vl7tv" 
event={"ID":"a5e7f9ae-8050-4602-b4f3-53a73ec0b60f","Type":"ContainerStarted","Data":"33c221eea24abf9f085ef3e56172e00c53df38ba13a9c96d784d4f9f49cfaa7c"} Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.546040 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4kqxp" event={"ID":"abb64422-1cc8-4858-a3fa-ca843be8687c","Type":"ContainerStarted","Data":"e619d4b9ef951f8193f0ec40e5d53995b25f05f034ed277d6efc25210ad66f34"} Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.546069 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4kqxp" event={"ID":"abb64422-1cc8-4858-a3fa-ca843be8687c","Type":"ContainerStarted","Data":"00d981a0196810c5ec05adcb4e144462e2738ddcc35a9824c0078e245ccbccbf"} Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.553769 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xlb7z" event={"ID":"62cd80a8-6faf-48b7-bf44-3b181afd66c6","Type":"ContainerStarted","Data":"1a02aa4ed7703c286944df7289f5b9298b1fba546f12978ec2ae554a52385559"} Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.558454 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-hjlwb" event={"ID":"513c0c01-5777-4fb2-b36d-d35251f0c33e","Type":"ContainerStarted","Data":"8eff9ebd0a00432a4bc8130465c58f324b13801f59f4459c8def24dbc1adeca8"} Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.558511 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-hjlwb" event={"ID":"513c0c01-5777-4fb2-b36d-d35251f0c33e","Type":"ContainerStarted","Data":"14b19599c6040a58be7b752dad6183eb74e6e598726f1ceb704d0c82603250ae"} Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.562053 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-j2nbv" event={"ID":"89932301-e045-432f-9b1e-d88d5c420fdf","Type":"ContainerStarted","Data":"b1020d64df7cf14b0b77338382fac9666217f06bf2e27ae82ec99ed6f66dc2ed"} Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.569469 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4vw74" event={"ID":"2dc01606-1661-41db-9a31-949d68a1548e","Type":"ContainerStarted","Data":"bcb0dc87676b20b642ec55b0097adc28139558238bdaa97b4089ef5071d91a70"} Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.569529 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4vw74" event={"ID":"2dc01606-1661-41db-9a31-949d68a1548e","Type":"ContainerStarted","Data":"5e7a0dc9cf3f86e2d012286655417ca5affa072765ecff08e64ac0ede0592979"} Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.569814 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4vw74" Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.580257 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29396325-vl7tv" podStartSLOduration=126.580238443 podStartE2EDuration="2m6.580238443s" podCreationTimestamp="2025-11-22 02:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 
02:55:14.578945894 +0000 UTC m=+150.617467786" watchObservedRunningTime="2025-11-22 02:55:14.580238443 +0000 UTC m=+150.618760335" Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.595295 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s7kz" event={"ID":"859cd128-2a8a-48e3-b3af-eb0147864b94","Type":"ContainerStarted","Data":"bf3d32ef2d7c9183280476836f6ab8c6936e9116539d28817391bfbf7c3670dc"} Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.598612 4922 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-9s7kz container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" start-of-body= Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.598678 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s7kz" podUID="859cd128-2a8a-48e3-b3af-eb0147864b94" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.599654 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:14 crc kubenswrapper[4922]: E1122 02:55:14.600126 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:55:15.100113634 +0000 UTC m=+151.138635526 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5hhlh" (UID: "14638a05-727a-441a-88f2-f9750aa17a39") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.632309 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9jfnp" event={"ID":"70f825ae-ce20-4eb1-a7dc-1468a552f2d0","Type":"ContainerStarted","Data":"8ce5467bbb792ee5957058364906a26393268b522de2d44ac61c4d2c1b6da380"} Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.632358 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9jfnp" event={"ID":"70f825ae-ce20-4eb1-a7dc-1468a552f2d0","Type":"ContainerStarted","Data":"d3c4191a72f2cf39ecf83393032f44b7deee6abf72a633c28d2d822c17fffaea"} Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.654347 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7bmdv" event={"ID":"4238f8cf-8e49-464b-9b6b-3f93b015747b","Type":"ContainerStarted","Data":"fbcd24e251fc0a46ea8dbb887341594cbf9eead4a69a444a73f59f754b921c2a"} Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.656107 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n5wk8" event={"ID":"25a50f57-bc62-4fc0-8410-fc80535adb55","Type":"ContainerStarted","Data":"ba5a08255b807e040de07120dd6c24d75949cf447f55b73a37f360652e17a5f2"} Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.656142 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n5wk8" event={"ID":"25a50f57-bc62-4fc0-8410-fc80535adb55","Type":"ContainerStarted","Data":"2bb69b50b1cc2d56a22c9e644985e636e5df8088c27248f1e98b9d99ca21249c"} Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.662079 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-4kqxp" podStartSLOduration=7.662064207 podStartE2EDuration="7.662064207s" podCreationTimestamp="2025-11-22 02:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:55:14.616953143 +0000 UTC m=+150.655475035" watchObservedRunningTime="2025-11-22 02:55:14.662064207 +0000 UTC m=+150.700586099" Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.662874 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4vw74" podStartSLOduration=126.662869966 podStartE2EDuration="2m6.662869966s" podCreationTimestamp="2025-11-22 02:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:55:14.661189676 +0000 UTC m=+150.699711568" watchObservedRunningTime="2025-11-22 02:55:14.662869966 +0000 UTC m=+150.701391858" Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.683927 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-cm895" 
event={"ID":"156920b9-f91f-4053-be05-3be9c55f09b1","Type":"ContainerStarted","Data":"4343efaf8b9adfbf4ae9a84598f357867c86b637dbdb7eabb66afbcbbfc57ab7"} Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.695212 4922 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-cm895 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.695264 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-cm895" podUID="156920b9-f91f-4053-be05-3be9c55f09b1" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.697506 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-j2nbv" podStartSLOduration=126.697469145 podStartE2EDuration="2m6.697469145s" podCreationTimestamp="2025-11-22 02:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:55:14.69679048 +0000 UTC m=+150.735312372" watchObservedRunningTime="2025-11-22 02:55:14.697469145 +0000 UTC m=+150.735991037" Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.702879 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-56bkw" event={"ID":"564a21fe-1256-4854-b5c1-3d407a4e9aaf","Type":"ContainerStarted","Data":"fb7fa19ea3a4e3195a411817dbd831065cf8c977d1c2bc74611322f5e4a2fb27"} Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.703508 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-56bkw" Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.704897 4922 generic.go:334] "Generic (PLEG): container finished" podID="e84c0c00-7fd0-44eb-b719-57fca6572497" containerID="425a1ed4405daab30e5c49e0390de62a6167532565a6519ff3a741421432a5ad" exitCode=0 Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.704938 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bfqj8" event={"ID":"e84c0c00-7fd0-44eb-b719-57fca6572497","Type":"ContainerStarted","Data":"5f3a6c2c70e127697eda7e9bae899ea8f5f98da09e151f31e6baea67d4e443d6"} Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.704953 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bfqj8" event={"ID":"e84c0c00-7fd0-44eb-b719-57fca6572497","Type":"ContainerDied","Data":"425a1ed4405daab30e5c49e0390de62a6167532565a6519ff3a741421432a5ad"} Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.706658 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-gjqfl" event={"ID":"ca284dd6-f710-44d3-aa30-a493af7ff84f","Type":"ContainerStarted","Data":"990fd50b5912a48ea923a2de0de11349f16e2f8221b38174c4f0391a48bb6deb"} Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.706668 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:55:14 crc kubenswrapper[4922]: E1122 02:55:14.726319 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:55:15.226275822 +0000 UTC m=+151.264797764 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.731489 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5gzxz" event={"ID":"0bd09df5-3f39-4aec-8ff9-35050ec873bd","Type":"ContainerStarted","Data":"892aaecda7cc702f10814aa81463bd88b02893a7b7d895e39f4c22ef8c68e0b9"} Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.738128 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-86q9x" event={"ID":"fd5c7d8c-ecfb-40af-990a-7bb86b045533","Type":"ContainerStarted","Data":"bccb32aac5fd99f02aaa62f4dcb49c8a02c96b6e090d7c3648f3017ac47223d5"} Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.777161 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-hjlwb" podStartSLOduration=126.777135668 podStartE2EDuration="2m6.777135668s" podCreationTimestamp="2025-11-22 02:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:55:14.776497754 +0000 UTC m=+150.815019646" watchObservedRunningTime="2025-11-22 02:55:14.777135668 +0000 UTC m=+150.815657560" Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.797719 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nxqd9" event={"ID":"870b8608-cc44-4883-bc2f-a889dddf467e","Type":"ContainerStarted","Data":"f9422e9d80974559d1dcf06f94cf994a6b034aa5eed6bc176dd725548eea975e"} Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.834384 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.846874 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4hrnn" event={"ID":"936d6833-0d47-4fdd-957a-b85f6dcd186c","Type":"ContainerStarted","Data":"2323a30ba688c5321a7266f659aaf265d4c0879fdd625bc832e0aa2264533ec3"} Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 
02:55:14.848132 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4hrnn" Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.849067 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n89mb" event={"ID":"c9739869-8d63-4aaf-8f03-1605eed08ceb","Type":"ContainerStarted","Data":"2b5db348a1517fb36d425b026eaf9bff1d800195b3c14151fd46a7ab481214cc"} Nov 22 02:55:14 crc kubenswrapper[4922]: E1122 02:55:14.849288 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:55:15.349273637 +0000 UTC m=+151.387795529 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5hhlh" (UID: "14638a05-727a-441a-88f2-f9750aa17a39") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.849646 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n89mb" Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.857167 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xlb7z" podStartSLOduration=126.857128938 podStartE2EDuration="2m6.857128938s" podCreationTimestamp="2025-11-22 02:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:55:14.835445187 +0000 UTC m=+150.873967109" watchObservedRunningTime="2025-11-22 02:55:14.857128938 +0000 UTC m=+150.895650830" Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.864929 4922 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-n89mb container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.864995 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n89mb" podUID="c9739869-8d63-4aaf-8f03-1605eed08ceb" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.866557 4922 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-4hrnn container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:5443/healthz\": dial tcp 10.217.0.32:5443: connect: connection refused" start-of-body= Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.866583 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4hrnn" podUID="936d6833-0d47-4fdd-957a-b85f6dcd186c" containerName="packageserver" probeResult="failure" output="Get 
\"https://10.217.0.32:5443/healthz\": dial tcp 10.217.0.32:5443: connect: connection refused" Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.869696 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-6rz2q" event={"ID":"bc859e82-4e51-4cef-844a-17ada349c77a","Type":"ContainerStarted","Data":"e3fa4f448ad1565a00a54e9ac740322b5871ad3781d65256f960a87b05f4c7dc"} Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.889629 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-gjqfl" podStartSLOduration=126.889600789 podStartE2EDuration="2m6.889600789s" podCreationTimestamp="2025-11-22 02:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:55:14.877317465 +0000 UTC m=+150.915839357" watchObservedRunningTime="2025-11-22 02:55:14.889600789 +0000 UTC m=+150.928122681" Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.916161 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ksllc" Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.938594 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:55:14 crc kubenswrapper[4922]: E1122 02:55:14.939046 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:55:15.439010942 +0000 UTC m=+151.477532834 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:14 crc kubenswrapper[4922]: I1122 02:55:14.945582 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:14 crc kubenswrapper[4922]: E1122 02:55:14.945880 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:55:15.445865291 +0000 UTC m=+151.484387183 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5hhlh" (UID: "14638a05-727a-441a-88f2-f9750aa17a39") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:15 crc kubenswrapper[4922]: I1122 02:55:15.037386 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n5wk8" podStartSLOduration=127.037360047 podStartE2EDuration="2m7.037360047s" podCreationTimestamp="2025-11-22 02:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:55:14.947266733 +0000 UTC m=+150.985788625" watchObservedRunningTime="2025-11-22 02:55:15.037360047 +0000 UTC m=+151.075881939" Nov 22 02:55:15 crc kubenswrapper[4922]: I1122 02:55:15.047788 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:55:15 crc kubenswrapper[4922]: E1122 02:55:15.049601 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:55:15.54958375 +0000 UTC m=+151.588105642 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:15 crc kubenswrapper[4922]: I1122 02:55:15.089783 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nxqd9" podStartSLOduration=127.089762478 podStartE2EDuration="2m7.089762478s" podCreationTimestamp="2025-11-22 02:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:55:15.082440879 +0000 UTC m=+151.120962771" watchObservedRunningTime="2025-11-22 02:55:15.089762478 +0000 UTC m=+151.128284370" Nov 22 02:55:15 crc kubenswrapper[4922]: I1122 02:55:15.092512 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9jfnp" podStartSLOduration=127.092502012 podStartE2EDuration="2m7.092502012s" podCreationTimestamp="2025-11-22 02:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:55:15.037306175 +0000 UTC m=+151.075828067" watchObservedRunningTime="2025-11-22 02:55:15.092502012 +0000 UTC m=+151.131023904" Nov 22 02:55:15 crc kubenswrapper[4922]: I1122 02:55:15.153635 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:55:15 crc kubenswrapper[4922]: I1122 02:55:15.153691 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:15 crc kubenswrapper[4922]: E1122 02:55:15.154035 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:55:15.654023025 +0000 UTC m=+151.692544917 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5hhlh" (UID: "14638a05-727a-441a-88f2-f9750aa17a39") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:15 crc kubenswrapper[4922]: I1122 02:55:15.158988 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:55:15 crc kubenswrapper[4922]: I1122 02:55:15.181328 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-56bkw" podStartSLOduration=127.181304136 podStartE2EDuration="2m7.181304136s" podCreationTimestamp="2025-11-22 02:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:55:15.178348688 +0000 UTC m=+151.216870580" watchObservedRunningTime="2025-11-22 02:55:15.181304136 +0000 UTC m=+151.219826028" Nov 22 02:55:15 crc kubenswrapper[4922]: I1122 02:55:15.254633 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:55:15 crc kubenswrapper[4922]: I1122 02:55:15.254956 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:55:15 crc kubenswrapper[4922]: I1122 02:55:15.254998 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:55:15 crc kubenswrapper[4922]: I1122 02:55:15.255051 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:55:15 crc kubenswrapper[4922]: E1122 02:55:15.255699 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-22 02:55:15.755668936 +0000 UTC m=+151.794190828 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:15 crc kubenswrapper[4922]: I1122 02:55:15.265234 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:55:15 crc kubenswrapper[4922]: I1122 02:55:15.297107 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:55:15 crc kubenswrapper[4922]: I1122 02:55:15.298971 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:55:15 crc kubenswrapper[4922]: I1122 02:55:15.356215 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:15 crc kubenswrapper[4922]: E1122 02:55:15.356688 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:55:15.856672432 +0000 UTC m=+151.895194324 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5hhlh" (UID: "14638a05-727a-441a-88f2-f9750aa17a39") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:15 crc kubenswrapper[4922]: I1122 02:55:15.362355 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bfqj8" Nov 22 02:55:15 crc kubenswrapper[4922]: I1122 02:55:15.363383 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bfqj8" Nov 22 02:55:15 crc kubenswrapper[4922]: I1122 02:55:15.374915 4922 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-bfqj8 container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.20:8443/livez\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Nov 22 02:55:15 crc kubenswrapper[4922]: I1122 02:55:15.374979 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bfqj8" podUID="e84c0c00-7fd0-44eb-b719-57fca6572497" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.20:8443/livez\": dial tcp 10.217.0.20:8443: connect: connection refused" Nov 22 02:55:15 crc kubenswrapper[4922]: I1122 02:55:15.401514 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bfqj8" podStartSLOduration=127.401487799 podStartE2EDuration="2m7.401487799s" podCreationTimestamp="2025-11-22 02:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:55:15.316315088 +0000 UTC m=+151.354836970" watchObservedRunningTime="2025-11-22 02:55:15.401487799 +0000 UTC m=+151.440009691" Nov 22 02:55:15 crc kubenswrapper[4922]: I1122 02:55:15.419111 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 02:55:15 crc kubenswrapper[4922]: I1122 02:55:15.435186 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:55:15 crc kubenswrapper[4922]: I1122 02:55:15.441646 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 02:55:15 crc kubenswrapper[4922]: I1122 02:55:15.459417 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:55:15 crc kubenswrapper[4922]: E1122 02:55:15.459769 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-22 02:55:15.959751797 +0000 UTC m=+151.998273689 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:15 crc kubenswrapper[4922]: I1122 02:55:15.497722 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n89mb" podStartSLOduration=127.497703084 podStartE2EDuration="2m7.497703084s" podCreationTimestamp="2025-11-22 02:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:55:15.408154543 +0000 UTC m=+151.446676435" watchObservedRunningTime="2025-11-22 02:55:15.497703084 +0000 UTC m=+151.536224966" Nov 22 02:55:15 crc kubenswrapper[4922]: I1122 02:55:15.532554 4922 patch_prober.go:28] interesting pod/router-default-5444994796-jb5md container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 02:55:15 crc kubenswrapper[4922]: [-]has-synced failed: reason withheld Nov 22 02:55:15 crc kubenswrapper[4922]: [+]process-running ok Nov 22 02:55:15 crc kubenswrapper[4922]: healthz check failed Nov 22 02:55:15 crc kubenswrapper[4922]: I1122 02:55:15.532602 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jb5md" podUID="2c02715b-5f00-464a-85e3-4df3043304d6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 02:55:15 crc kubenswrapper[4922]: I1122 02:55:15.561855 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:15 crc kubenswrapper[4922]: E1122 02:55:15.562275 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:55:16.062257837 +0000 UTC m=+152.100779729 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5hhlh" (UID: "14638a05-727a-441a-88f2-f9750aa17a39") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:15 crc kubenswrapper[4922]: I1122 02:55:15.578271 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4hrnn" podStartSLOduration=127.578233416 podStartE2EDuration="2m7.578233416s" podCreationTimestamp="2025-11-22 02:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:55:15.501358248 +0000 UTC m=+151.539880140" watchObservedRunningTime="2025-11-22 02:55:15.578233416 +0000 UTC m=+151.616755308" Nov 22 02:55:15 crc kubenswrapper[4922]: I1122 02:55:15.640010 4922 patch_prober.go:28] interesting pod/apiserver-76f77b778f-mmrpx container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Nov 22 02:55:15 crc kubenswrapper[4922]: [+]log ok Nov 22 02:55:15 crc kubenswrapper[4922]: [+]etcd ok Nov 22 02:55:15 crc kubenswrapper[4922]: [+]poststarthook/start-apiserver-admission-initializer ok Nov 22 02:55:15 crc kubenswrapper[4922]: [+]poststarthook/generic-apiserver-start-informers ok Nov 22 02:55:15 crc kubenswrapper[4922]: [+]poststarthook/max-in-flight-filter ok Nov 22 02:55:15 crc kubenswrapper[4922]: [+]poststarthook/storage-object-count-tracker-hook ok Nov 22 02:55:15 crc kubenswrapper[4922]: [+]poststarthook/image.openshift.io-apiserver-caches ok Nov 22 02:55:15 crc kubenswrapper[4922]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Nov 22 02:55:15 crc kubenswrapper[4922]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Nov 22 02:55:15 crc kubenswrapper[4922]: [+]poststarthook/project.openshift.io-projectcache ok Nov 22 02:55:15 crc kubenswrapper[4922]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Nov 22 02:55:15 crc kubenswrapper[4922]: [+]poststarthook/openshift.io-startinformers ok Nov 22 02:55:15 crc kubenswrapper[4922]: [+]poststarthook/openshift.io-restmapperupdater ok Nov 22 02:55:15 crc kubenswrapper[4922]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Nov 22 02:55:15 crc kubenswrapper[4922]: livez check failed Nov 22 02:55:15 crc kubenswrapper[4922]: I1122 02:55:15.640574 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-mmrpx" podUID="a96d295d-cf93-4a96-ac7d-cc85ed8221da" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 02:55:15 crc kubenswrapper[4922]: I1122 02:55:15.664192 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:55:15 crc kubenswrapper[4922]: E1122 02:55:15.664579 4922 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:55:16.164557313 +0000 UTC m=+152.203079205 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:15 crc kubenswrapper[4922]: I1122 02:55:15.766710 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:15 crc kubenswrapper[4922]: E1122 02:55:15.767098 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:55:16.267082094 +0000 UTC m=+152.305603986 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5hhlh" (UID: "14638a05-727a-441a-88f2-f9750aa17a39") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:15 crc kubenswrapper[4922]: I1122 02:55:15.801903 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-6rz2q" podStartSLOduration=127.801881549 podStartE2EDuration="2m7.801881549s" podCreationTimestamp="2025-11-22 02:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:55:15.799269359 +0000 UTC m=+151.837791251" watchObservedRunningTime="2025-11-22 02:55:15.801881549 +0000 UTC m=+151.840403441" Nov 22 02:55:15 crc kubenswrapper[4922]: I1122 02:55:15.868461 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:55:15 crc kubenswrapper[4922]: E1122 02:55:15.868877 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:55:16.368860109 +0000 UTC m=+152.407382001 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:15 crc kubenswrapper[4922]: I1122 02:55:15.939134 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wxkjb" event={"ID":"a907e262-3b15-4019-a715-c45b0e12ac27","Type":"ContainerStarted","Data":"9cf69e00cb645b1bee7946db7eef06c6d1f6a0fc02fd082b5269763ce638caf6"} Nov 22 02:55:15 crc kubenswrapper[4922]: I1122 02:55:15.971981 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:15 crc kubenswrapper[4922]: E1122 02:55:15.972517 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:55:16.472498106 +0000 UTC m=+152.511019998 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5hhlh" (UID: "14638a05-727a-441a-88f2-f9750aa17a39") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:15 crc kubenswrapper[4922]: I1122 02:55:15.985023 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7bmdv" event={"ID":"4238f8cf-8e49-464b-9b6b-3f93b015747b","Type":"ContainerStarted","Data":"8fa567d039845e11c03797026baaf5e8d336ec99576b6f1a156e9dd78bd4d114"} Nov 22 02:55:15 crc kubenswrapper[4922]: I1122 02:55:15.990130 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-7bmdv" Nov 22 02:55:15 crc kubenswrapper[4922]: I1122 02:55:15.990422 4922 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-cm895 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Nov 22 02:55:15 crc kubenswrapper[4922]: I1122 02:55:15.990468 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-cm895" podUID="156920b9-f91f-4053-be05-3be9c55f09b1" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" Nov 22 02:55:16 crc kubenswrapper[4922]: I1122 02:55:16.021346 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s7kz" Nov 22 02:55:16 crc kubenswrapper[4922]: I1122 02:55:16.039975 4922 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-7bmdv" podStartSLOduration=9.039957886 podStartE2EDuration="9.039957886s" podCreationTimestamp="2025-11-22 02:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:55:16.038084113 +0000 UTC m=+152.076606005" watchObservedRunningTime="2025-11-22 02:55:16.039957886 +0000 UTC m=+152.078479778" Nov 22 02:55:16 crc kubenswrapper[4922]: I1122 02:55:16.055672 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-n89mb" Nov 22 02:55:16 crc kubenswrapper[4922]: I1122 02:55:16.073099 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:55:16 crc kubenswrapper[4922]: E1122 02:55:16.075516 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:55:16.575482647 +0000 UTC m=+152.614004679 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:16 crc kubenswrapper[4922]: I1122 02:55:16.178274 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:16 crc kubenswrapper[4922]: E1122 02:55:16.178715 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:55:16.678700575 +0000 UTC m=+152.717222467 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5hhlh" (UID: "14638a05-727a-441a-88f2-f9750aa17a39") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:16 crc kubenswrapper[4922]: I1122 02:55:16.283405 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:55:16 crc kubenswrapper[4922]: E1122 02:55:16.284019 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:55:16.78400188 +0000 UTC m=+152.822523772 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:16 crc kubenswrapper[4922]: I1122 02:55:16.385064 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:16 crc kubenswrapper[4922]: E1122 02:55:16.385478 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:55:16.885463267 +0000 UTC m=+152.923985149 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5hhlh" (UID: "14638a05-727a-441a-88f2-f9750aa17a39") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:16 crc kubenswrapper[4922]: I1122 02:55:16.492653 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:55:16 crc kubenswrapper[4922]: E1122 02:55:16.493635 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:55:16.993590008 +0000 UTC m=+153.032111900 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:16 crc kubenswrapper[4922]: I1122 02:55:16.500718 4922 patch_prober.go:28] interesting pod/router-default-5444994796-jb5md container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 02:55:16 crc kubenswrapper[4922]: [-]has-synced failed: reason withheld Nov 22 02:55:16 crc kubenswrapper[4922]: [+]process-running ok Nov 22 02:55:16 crc kubenswrapper[4922]: healthz check failed Nov 22 02:55:16 crc kubenswrapper[4922]: I1122 02:55:16.500792 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jb5md" podUID="2c02715b-5f00-464a-85e3-4df3043304d6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 02:55:16 crc kubenswrapper[4922]: I1122 02:55:16.594434 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:16 crc kubenswrapper[4922]: E1122 02:55:16.594775 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:55:17.094763928 +0000 UTC m=+153.133285820 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5hhlh" (UID: "14638a05-727a-441a-88f2-f9750aa17a39") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:16 crc kubenswrapper[4922]: W1122 02:55:16.657163 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-3dc01a0d9e64bbe8db074302bb0ed12efb2c100de8aba2bca5000fe063087048 WatchSource:0}: Error finding container 3dc01a0d9e64bbe8db074302bb0ed12efb2c100de8aba2bca5000fe063087048: Status 404 returned error can't find the container with id 3dc01a0d9e64bbe8db074302bb0ed12efb2c100de8aba2bca5000fe063087048 Nov 22 02:55:16 crc kubenswrapper[4922]: I1122 02:55:16.696682 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:55:16 crc kubenswrapper[4922]: E1122 02:55:16.697112 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:55:17.197072304 +0000 UTC m=+153.235594196 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:16 crc kubenswrapper[4922]: I1122 02:55:16.798735 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:16 crc kubenswrapper[4922]: E1122 02:55:16.799301 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:55:17.299283748 +0000 UTC m=+153.337805640 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5hhlh" (UID: "14638a05-727a-441a-88f2-f9750aa17a39") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:16 crc kubenswrapper[4922]: I1122 02:55:16.900919 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:55:16 crc kubenswrapper[4922]: E1122 02:55:16.901191 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:55:17.401145754 +0000 UTC m=+153.439667646 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:16 crc kubenswrapper[4922]: I1122 02:55:16.901613 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:16 crc kubenswrapper[4922]: E1122 02:55:16.902020 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:55:17.402008244 +0000 UTC m=+153.440530136 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5hhlh" (UID: "14638a05-727a-441a-88f2-f9750aa17a39") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:16 crc kubenswrapper[4922]: I1122 02:55:16.971377 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-h8sq5"] Nov 22 02:55:16 crc kubenswrapper[4922]: I1122 02:55:16.972608 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h8sq5" Nov 22 02:55:16 crc kubenswrapper[4922]: I1122 02:55:16.977417 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Nov 22 02:55:16 crc kubenswrapper[4922]: I1122 02:55:16.991023 4922 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-4hrnn container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Nov 22 02:55:16 crc kubenswrapper[4922]: I1122 02:55:16.991079 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4hrnn" podUID="936d6833-0d47-4fdd-957a-b85f6dcd186c" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.32:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.002793 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:55:17 crc kubenswrapper[4922]: E1122 02:55:17.003210 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:55:17.503196194 +0000 UTC m=+153.541718086 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.007549 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"2d481d9d607f8a3fc707e260e025596d4c44eb474b7c0ad826dbcf1c9332e6ae"} Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.010272 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h8sq5"] Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.019620 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"3dc01a0d9e64bbe8db074302bb0ed12efb2c100de8aba2bca5000fe063087048"} Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.034834 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-56bkw" Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.037043 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"e73453cd8bda89fcc014b98f1cc4daa9b4ec6ab073561e18a4f0f20880e141a1"} Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.048296 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-cm895" Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.057025 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4hrnn" Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.104501 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.104602 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d590121-2d31-483c-9c86-14b40c9d23ad-utilities\") pod \"certified-operators-h8sq5\" (UID: \"6d590121-2d31-483c-9c86-14b40c9d23ad\") " pod="openshift-marketplace/certified-operators-h8sq5" Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.104663 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmwqz\" (UniqueName: \"kubernetes.io/projected/6d590121-2d31-483c-9c86-14b40c9d23ad-kube-api-access-dmwqz\") pod \"certified-operators-h8sq5\" (UID: \"6d590121-2d31-483c-9c86-14b40c9d23ad\") " pod="openshift-marketplace/certified-operators-h8sq5" 
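[Editor's note] The entries above are dominated by one repeating failure: every MountVolume.MountDevice and UnmountVolume.TearDown attempt for pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 fails with "driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers", and nestedpendingoperations retries with a 500ms backoff. This is the kubelet reporting that the CSI driver has not yet registered over its plugin-registration socket; the retries are expected to keep failing until the hostpath-provisioner plugin pod (csi-hostpathplugin-wxkjb, whose ContainerStarted event appears earlier in this excerpt) is up and registers. The kubelet mirrors its registered CSI plugins into the node's CSINode object, so one way to watch for the registration is to poll that object. A minimal client-go sketch, assuming a reachable cluster; the kubeconfig path is a placeholder, and "crc" is the node name taken from these journal entries:

```go
package main

import (
	"context"
	"fmt"
	"log"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Placeholder kubeconfig path; point this at the cluster that produced the log.
	cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig")
	if err != nil {
		log.Fatal(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		log.Fatal(err)
	}
	// The kubelet mirrors its registered CSI plugins into the node's CSINode
	// object; "crc" is the node name from the journal entries above.
	csiNode, err := cs.StorageV1().CSINodes().Get(context.TODO(), "crc", metav1.GetOptions{})
	if err != nil {
		log.Fatal(err)
	}
	for _, d := range csiNode.Spec.Drivers {
		// Once kubevirt.io.hostpath-provisioner appears here, the
		// MountDevice/TearDown retries above can start succeeding.
		fmt.Println("registered CSI driver:", d.Name)
	}
}
```

The same check is available from the CLI with `kubectl get csinode crc -o yaml`: the mount and unmount retries stop failing once kubevirt.io.hostpath-provisioner shows up under spec.drivers.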
Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.104707 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d590121-2d31-483c-9c86-14b40c9d23ad-catalog-content\") pod \"certified-operators-h8sq5\" (UID: \"6d590121-2d31-483c-9c86-14b40c9d23ad\") " pod="openshift-marketplace/certified-operators-h8sq5" Nov 22 02:55:17 crc kubenswrapper[4922]: E1122 02:55:17.107270 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:55:17.607253061 +0000 UTC m=+153.645774953 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5hhlh" (UID: "14638a05-727a-441a-88f2-f9750aa17a39") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.204389 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-t48nh"] Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.206146 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.206399 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d590121-2d31-483c-9c86-14b40c9d23ad-utilities\") pod \"certified-operators-h8sq5\" (UID: \"6d590121-2d31-483c-9c86-14b40c9d23ad\") " pod="openshift-marketplace/certified-operators-h8sq5" Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.206448 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmwqz\" (UniqueName: \"kubernetes.io/projected/6d590121-2d31-483c-9c86-14b40c9d23ad-kube-api-access-dmwqz\") pod \"certified-operators-h8sq5\" (UID: \"6d590121-2d31-483c-9c86-14b40c9d23ad\") " pod="openshift-marketplace/certified-operators-h8sq5" Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.206467 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d590121-2d31-483c-9c86-14b40c9d23ad-catalog-content\") pod \"certified-operators-h8sq5\" (UID: \"6d590121-2d31-483c-9c86-14b40c9d23ad\") " pod="openshift-marketplace/certified-operators-h8sq5" Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.207101 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d590121-2d31-483c-9c86-14b40c9d23ad-catalog-content\") pod \"certified-operators-h8sq5\" (UID: \"6d590121-2d31-483c-9c86-14b40c9d23ad\") " pod="openshift-marketplace/certified-operators-h8sq5" Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.209148 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-t48nh" Nov 22 02:55:17 crc kubenswrapper[4922]: E1122 02:55:17.209547 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:55:17.709533227 +0000 UTC m=+153.748055119 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.209795 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d590121-2d31-483c-9c86-14b40c9d23ad-utilities\") pod \"certified-operators-h8sq5\" (UID: \"6d590121-2d31-483c-9c86-14b40c9d23ad\") " pod="openshift-marketplace/certified-operators-h8sq5" Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.225428 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.227374 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t48nh"] Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.268923 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmwqz\" (UniqueName: \"kubernetes.io/projected/6d590121-2d31-483c-9c86-14b40c9d23ad-kube-api-access-dmwqz\") pod \"certified-operators-h8sq5\" (UID: \"6d590121-2d31-483c-9c86-14b40c9d23ad\") " pod="openshift-marketplace/certified-operators-h8sq5" Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.310021 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.310107 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26v49\" (UniqueName: \"kubernetes.io/projected/81c367c0-8b10-4ce9-aa76-290449c7df39-kube-api-access-26v49\") pod \"community-operators-t48nh\" (UID: \"81c367c0-8b10-4ce9-aa76-290449c7df39\") " pod="openshift-marketplace/community-operators-t48nh" Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.310186 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81c367c0-8b10-4ce9-aa76-290449c7df39-utilities\") pod \"community-operators-t48nh\" (UID: \"81c367c0-8b10-4ce9-aa76-290449c7df39\") " pod="openshift-marketplace/community-operators-t48nh" Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.310211 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/81c367c0-8b10-4ce9-aa76-290449c7df39-catalog-content\") pod \"community-operators-t48nh\" (UID: \"81c367c0-8b10-4ce9-aa76-290449c7df39\") " pod="openshift-marketplace/community-operators-t48nh" Nov 22 02:55:17 crc kubenswrapper[4922]: E1122 02:55:17.310729 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:55:17.810709297 +0000 UTC m=+153.849231189 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5hhlh" (UID: "14638a05-727a-441a-88f2-f9750aa17a39") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.388485 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pwg2w"] Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.389778 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pwg2w" Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.412566 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.412896 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h8sq5" Nov 22 02:55:17 crc kubenswrapper[4922]: E1122 02:55:17.413212 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:55:17.913184077 +0000 UTC m=+153.951706169 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.412906 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26v49\" (UniqueName: \"kubernetes.io/projected/81c367c0-8b10-4ce9-aa76-290449c7df39-kube-api-access-26v49\") pod \"community-operators-t48nh\" (UID: \"81c367c0-8b10-4ce9-aa76-290449c7df39\") " pod="openshift-marketplace/community-operators-t48nh" Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.413358 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81c367c0-8b10-4ce9-aa76-290449c7df39-utilities\") pod \"community-operators-t48nh\" (UID: \"81c367c0-8b10-4ce9-aa76-290449c7df39\") " pod="openshift-marketplace/community-operators-t48nh" Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.413380 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81c367c0-8b10-4ce9-aa76-290449c7df39-catalog-content\") pod \"community-operators-t48nh\" (UID: \"81c367c0-8b10-4ce9-aa76-290449c7df39\") " pod="openshift-marketplace/community-operators-t48nh" Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.413922 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81c367c0-8b10-4ce9-aa76-290449c7df39-catalog-content\") pod \"community-operators-t48nh\" (UID: \"81c367c0-8b10-4ce9-aa76-290449c7df39\") " pod="openshift-marketplace/community-operators-t48nh" Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.413954 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81c367c0-8b10-4ce9-aa76-290449c7df39-utilities\") pod \"community-operators-t48nh\" (UID: \"81c367c0-8b10-4ce9-aa76-290449c7df39\") " pod="openshift-marketplace/community-operators-t48nh" Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.420889 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pwg2w"] Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.459969 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26v49\" (UniqueName: \"kubernetes.io/projected/81c367c0-8b10-4ce9-aa76-290449c7df39-kube-api-access-26v49\") pod \"community-operators-t48nh\" (UID: \"81c367c0-8b10-4ce9-aa76-290449c7df39\") " pod="openshift-marketplace/community-operators-t48nh" Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.500403 4922 patch_prober.go:28] interesting pod/router-default-5444994796-jb5md container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 02:55:17 crc kubenswrapper[4922]: [-]has-synced failed: reason withheld Nov 22 02:55:17 crc kubenswrapper[4922]: [+]process-running ok Nov 22 02:55:17 crc kubenswrapper[4922]: healthz check failed Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 
02:55:17.500461 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jb5md" podUID="2c02715b-5f00-464a-85e3-4df3043304d6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.514979 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.515086 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e0bcf47-633a-44d1-82b8-90cdf74fa610-catalog-content\") pod \"certified-operators-pwg2w\" (UID: \"8e0bcf47-633a-44d1-82b8-90cdf74fa610\") " pod="openshift-marketplace/certified-operators-pwg2w" Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.515127 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbrvk\" (UniqueName: \"kubernetes.io/projected/8e0bcf47-633a-44d1-82b8-90cdf74fa610-kube-api-access-kbrvk\") pod \"certified-operators-pwg2w\" (UID: \"8e0bcf47-633a-44d1-82b8-90cdf74fa610\") " pod="openshift-marketplace/certified-operators-pwg2w" Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.515174 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e0bcf47-633a-44d1-82b8-90cdf74fa610-utilities\") pod \"certified-operators-pwg2w\" (UID: \"8e0bcf47-633a-44d1-82b8-90cdf74fa610\") " pod="openshift-marketplace/certified-operators-pwg2w" Nov 22 02:55:17 crc kubenswrapper[4922]: E1122 02:55:17.515516 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:55:18.015502144 +0000 UTC m=+154.054024036 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5hhlh" (UID: "14638a05-727a-441a-88f2-f9750aa17a39") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.556090 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vgckp"] Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.563427 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vgckp" Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.589756 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-t48nh" Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.594466 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vgckp"] Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.626970 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:55:17 crc kubenswrapper[4922]: E1122 02:55:17.629376 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:55:18.129337686 +0000 UTC m=+154.167859578 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.629631 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e0bcf47-633a-44d1-82b8-90cdf74fa610-utilities\") pod \"certified-operators-pwg2w\" (UID: \"8e0bcf47-633a-44d1-82b8-90cdf74fa610\") " pod="openshift-marketplace/certified-operators-pwg2w" Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.629761 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e5138dd-6039-4020-969e-6a30b33be2b5-catalog-content\") pod \"community-operators-vgckp\" (UID: \"9e5138dd-6039-4020-969e-6a30b33be2b5\") " pod="openshift-marketplace/community-operators-vgckp" Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.629828 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7knj\" (UniqueName: \"kubernetes.io/projected/9e5138dd-6039-4020-969e-6a30b33be2b5-kube-api-access-n7knj\") pod \"community-operators-vgckp\" (UID: \"9e5138dd-6039-4020-969e-6a30b33be2b5\") " pod="openshift-marketplace/community-operators-vgckp" Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.629930 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.630541 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e5138dd-6039-4020-969e-6a30b33be2b5-utilities\") pod \"community-operators-vgckp\" (UID: \"9e5138dd-6039-4020-969e-6a30b33be2b5\") " 
pod="openshift-marketplace/community-operators-vgckp" Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.630675 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e0bcf47-633a-44d1-82b8-90cdf74fa610-catalog-content\") pod \"certified-operators-pwg2w\" (UID: \"8e0bcf47-633a-44d1-82b8-90cdf74fa610\") " pod="openshift-marketplace/certified-operators-pwg2w" Nov 22 02:55:17 crc kubenswrapper[4922]: E1122 02:55:17.630783 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:55:18.130768969 +0000 UTC m=+154.169290861 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5hhlh" (UID: "14638a05-727a-441a-88f2-f9750aa17a39") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.630900 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e0bcf47-633a-44d1-82b8-90cdf74fa610-utilities\") pod \"certified-operators-pwg2w\" (UID: \"8e0bcf47-633a-44d1-82b8-90cdf74fa610\") " pod="openshift-marketplace/certified-operators-pwg2w" Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.631174 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e0bcf47-633a-44d1-82b8-90cdf74fa610-catalog-content\") pod \"certified-operators-pwg2w\" (UID: \"8e0bcf47-633a-44d1-82b8-90cdf74fa610\") " pod="openshift-marketplace/certified-operators-pwg2w" Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.641234 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbrvk\" (UniqueName: \"kubernetes.io/projected/8e0bcf47-633a-44d1-82b8-90cdf74fa610-kube-api-access-kbrvk\") pod \"certified-operators-pwg2w\" (UID: \"8e0bcf47-633a-44d1-82b8-90cdf74fa610\") " pod="openshift-marketplace/certified-operators-pwg2w" Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.671010 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbrvk\" (UniqueName: \"kubernetes.io/projected/8e0bcf47-633a-44d1-82b8-90cdf74fa610-kube-api-access-kbrvk\") pod \"certified-operators-pwg2w\" (UID: \"8e0bcf47-633a-44d1-82b8-90cdf74fa610\") " pod="openshift-marketplace/certified-operators-pwg2w" Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.715303 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pwg2w" Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.743145 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:55:17 crc kubenswrapper[4922]: E1122 02:55:17.743403 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:55:18.243361254 +0000 UTC m=+154.281883156 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.743446 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e5138dd-6039-4020-969e-6a30b33be2b5-catalog-content\") pod \"community-operators-vgckp\" (UID: \"9e5138dd-6039-4020-969e-6a30b33be2b5\") " pod="openshift-marketplace/community-operators-vgckp" Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.743497 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7knj\" (UniqueName: \"kubernetes.io/projected/9e5138dd-6039-4020-969e-6a30b33be2b5-kube-api-access-n7knj\") pod \"community-operators-vgckp\" (UID: \"9e5138dd-6039-4020-969e-6a30b33be2b5\") " pod="openshift-marketplace/community-operators-vgckp" Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.743527 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.743586 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e5138dd-6039-4020-969e-6a30b33be2b5-utilities\") pod \"community-operators-vgckp\" (UID: \"9e5138dd-6039-4020-969e-6a30b33be2b5\") " pod="openshift-marketplace/community-operators-vgckp" Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.744047 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e5138dd-6039-4020-969e-6a30b33be2b5-catalog-content\") pod \"community-operators-vgckp\" (UID: \"9e5138dd-6039-4020-969e-6a30b33be2b5\") " pod="openshift-marketplace/community-operators-vgckp" Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.744404 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/9e5138dd-6039-4020-969e-6a30b33be2b5-utilities\") pod \"community-operators-vgckp\" (UID: \"9e5138dd-6039-4020-969e-6a30b33be2b5\") " pod="openshift-marketplace/community-operators-vgckp" Nov 22 02:55:17 crc kubenswrapper[4922]: E1122 02:55:17.744932 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:55:18.24491459 +0000 UTC m=+154.283436482 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5hhlh" (UID: "14638a05-727a-441a-88f2-f9750aa17a39") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.768789 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h8sq5"] Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.770094 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7knj\" (UniqueName: \"kubernetes.io/projected/9e5138dd-6039-4020-969e-6a30b33be2b5-kube-api-access-n7knj\") pod \"community-operators-vgckp\" (UID: \"9e5138dd-6039-4020-969e-6a30b33be2b5\") " pod="openshift-marketplace/community-operators-vgckp" Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.845317 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:55:17 crc kubenswrapper[4922]: E1122 02:55:17.846743 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:55:18.346718814 +0000 UTC m=+154.385240706 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.879751 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t48nh"] Nov 22 02:55:17 crc kubenswrapper[4922]: W1122 02:55:17.892533 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81c367c0_8b10_4ce9_aa76_290449c7df39.slice/crio-5c17d39ea7d0a421b8a86382a2730433358a35ae500d99138cf9baf7d6dc68e0 WatchSource:0}: Error finding container 5c17d39ea7d0a421b8a86382a2730433358a35ae500d99138cf9baf7d6dc68e0: Status 404 returned error can't find the container with id 5c17d39ea7d0a421b8a86382a2730433358a35ae500d99138cf9baf7d6dc68e0 Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.914485 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vgckp" Nov 22 02:55:17 crc kubenswrapper[4922]: I1122 02:55:17.947049 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:17 crc kubenswrapper[4922]: E1122 02:55:17.947468 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:55:18.447452614 +0000 UTC m=+154.485974506 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5hhlh" (UID: "14638a05-727a-441a-88f2-f9750aa17a39") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:18 crc kubenswrapper[4922]: I1122 02:55:18.012984 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pwg2w"] Nov 22 02:55:18 crc kubenswrapper[4922]: I1122 02:55:18.052507 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:55:18 crc kubenswrapper[4922]: E1122 02:55:18.052757 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-22 02:55:18.552708979 +0000 UTC m=+154.591230871 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:18 crc kubenswrapper[4922]: I1122 02:55:18.052931 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:18 crc kubenswrapper[4922]: E1122 02:55:18.053925 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:55:18.553913696 +0000 UTC m=+154.592435588 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5hhlh" (UID: "14638a05-727a-441a-88f2-f9750aa17a39") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:18 crc kubenswrapper[4922]: I1122 02:55:18.066219 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"e8eab041342210d636e28b3847a04f47436107b0ea8618a7ddb02bb9188ecd39"} Nov 22 02:55:18 crc kubenswrapper[4922]: I1122 02:55:18.068823 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wxkjb" event={"ID":"a907e262-3b15-4019-a715-c45b0e12ac27","Type":"ContainerStarted","Data":"3f97370586028b3bc5e6db2a1de9e7bcefd2b78c500702a117a1391b4216a545"} Nov 22 02:55:18 crc kubenswrapper[4922]: I1122 02:55:18.072192 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"ffc8a926cc0548bcd5e14febbd4af9dba0c112f0d89d9720693e205208ac9569"} Nov 22 02:55:18 crc kubenswrapper[4922]: I1122 02:55:18.074134 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t48nh" event={"ID":"81c367c0-8b10-4ce9-aa76-290449c7df39","Type":"ContainerStarted","Data":"66140b440ffe6cf4718bad228bf26dbc3dad92ec5b56c1a06b6349d953317f32"} Nov 22 02:55:18 crc kubenswrapper[4922]: I1122 02:55:18.074171 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t48nh" event={"ID":"81c367c0-8b10-4ce9-aa76-290449c7df39","Type":"ContainerStarted","Data":"5c17d39ea7d0a421b8a86382a2730433358a35ae500d99138cf9baf7d6dc68e0"} Nov 22 02:55:18 crc kubenswrapper[4922]: I1122 02:55:18.075634 
4922 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 02:55:18 crc kubenswrapper[4922]: I1122 02:55:18.078366 4922 generic.go:334] "Generic (PLEG): container finished" podID="6d590121-2d31-483c-9c86-14b40c9d23ad" containerID="0437c01f59fb1ca95f6f70e75a196ccc33af4fb651d9a7df01093d73e391a42c" exitCode=0 Nov 22 02:55:18 crc kubenswrapper[4922]: I1122 02:55:18.078434 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h8sq5" event={"ID":"6d590121-2d31-483c-9c86-14b40c9d23ad","Type":"ContainerDied","Data":"0437c01f59fb1ca95f6f70e75a196ccc33af4fb651d9a7df01093d73e391a42c"} Nov 22 02:55:18 crc kubenswrapper[4922]: I1122 02:55:18.078460 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h8sq5" event={"ID":"6d590121-2d31-483c-9c86-14b40c9d23ad","Type":"ContainerStarted","Data":"93a744045948568508b5ffe24d5609a53ea8c737ec83296a34ee7369725698c7"} Nov 22 02:55:18 crc kubenswrapper[4922]: I1122 02:55:18.089074 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"974d074977ba93317d7a25598806cde2d0767b066d667cac20ff57278d0560cd"} Nov 22 02:55:18 crc kubenswrapper[4922]: I1122 02:55:18.089117 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:55:18 crc kubenswrapper[4922]: W1122 02:55:18.120945 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e0bcf47_633a_44d1_82b8_90cdf74fa610.slice/crio-1c9b8765a3fd2f0dce4d476ef2378e62fb0f9bba891c42739eb0e8b677f74015 WatchSource:0}: Error finding container 1c9b8765a3fd2f0dce4d476ef2378e62fb0f9bba891c42739eb0e8b677f74015: Status 404 returned error can't find the container with id 1c9b8765a3fd2f0dce4d476ef2378e62fb0f9bba891c42739eb0e8b677f74015 Nov 22 02:55:18 crc kubenswrapper[4922]: I1122 02:55:18.164166 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:55:18 crc kubenswrapper[4922]: E1122 02:55:18.165612 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:55:18.665579329 +0000 UTC m=+154.704101221 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:18 crc kubenswrapper[4922]: I1122 02:55:18.207333 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vgckp"] Nov 22 02:55:18 crc kubenswrapper[4922]: I1122 02:55:18.266011 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:18 crc kubenswrapper[4922]: E1122 02:55:18.269771 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:55:18.769750389 +0000 UTC m=+154.808272281 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5hhlh" (UID: "14638a05-727a-441a-88f2-f9750aa17a39") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:18 crc kubenswrapper[4922]: I1122 02:55:18.273324 4922 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Nov 22 02:55:18 crc kubenswrapper[4922]: I1122 02:55:18.376799 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:55:18 crc kubenswrapper[4922]: E1122 02:55:18.377026 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:55:18.876986499 +0000 UTC m=+154.915508391 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:18 crc kubenswrapper[4922]: I1122 02:55:18.377091 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:18 crc kubenswrapper[4922]: E1122 02:55:18.377668 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:55:18.877656855 +0000 UTC m=+154.916178747 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5hhlh" (UID: "14638a05-727a-441a-88f2-f9750aa17a39") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:18 crc kubenswrapper[4922]: I1122 02:55:18.478426 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:55:18 crc kubenswrapper[4922]: E1122 02:55:18.478682 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:55:18.97864109 +0000 UTC m=+155.017162982 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:18 crc kubenswrapper[4922]: I1122 02:55:18.478801 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:18 crc kubenswrapper[4922]: E1122 02:55:18.479327 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 02:55:18.979305075 +0000 UTC m=+155.017826987 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5hhlh" (UID: "14638a05-727a-441a-88f2-f9750aa17a39") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:18 crc kubenswrapper[4922]: I1122 02:55:18.499674 4922 patch_prober.go:28] interesting pod/router-default-5444994796-jb5md container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 02:55:18 crc kubenswrapper[4922]: [-]has-synced failed: reason withheld Nov 22 02:55:18 crc kubenswrapper[4922]: [+]process-running ok Nov 22 02:55:18 crc kubenswrapper[4922]: healthz check failed Nov 22 02:55:18 crc kubenswrapper[4922]: I1122 02:55:18.499795 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jb5md" podUID="2c02715b-5f00-464a-85e3-4df3043304d6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 02:55:18 crc kubenswrapper[4922]: I1122 02:55:18.579902 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:55:18 crc kubenswrapper[4922]: E1122 02:55:18.580349 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 02:55:19.08026932 +0000 UTC m=+155.118791212 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 02:55:18 crc kubenswrapper[4922]: I1122 02:55:18.583201 4922 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-11-22T02:55:18.274615541Z","Handler":null,"Name":""} Nov 22 02:55:18 crc kubenswrapper[4922]: I1122 02:55:18.587425 4922 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Nov 22 02:55:18 crc kubenswrapper[4922]: I1122 02:55:18.587866 4922 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Nov 22 02:55:18 crc kubenswrapper[4922]: I1122 02:55:18.681886 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:18 crc kubenswrapper[4922]: I1122 02:55:18.686040 4922 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Nov 22 02:55:18 crc kubenswrapper[4922]: I1122 02:55:18.686083 4922 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:18 crc kubenswrapper[4922]: I1122 02:55:18.726537 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5hhlh\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:18 crc kubenswrapper[4922]: I1122 02:55:18.783149 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 02:55:18 crc kubenswrapper[4922]: I1122 02:55:18.792387 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 22 02:55:18 crc kubenswrapper[4922]: I1122 02:55:18.808759 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:18 crc kubenswrapper[4922]: I1122 02:55:18.957658 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-p5clf"] Nov 22 02:55:18 crc kubenswrapper[4922]: I1122 02:55:18.963046 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p5clf" Nov 22 02:55:18 crc kubenswrapper[4922]: I1122 02:55:18.979599 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Nov 22 02:55:18 crc kubenswrapper[4922]: I1122 02:55:18.987697 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p5clf"] Nov 22 02:55:19 crc kubenswrapper[4922]: I1122 02:55:19.074814 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-5hhlh"] Nov 22 02:55:19 crc kubenswrapper[4922]: W1122 02:55:19.081983 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14638a05_727a_441a_88f2_f9750aa17a39.slice/crio-cf78329aa32e055fc46de43fdfc0c9dc23b3d4b1ab33ecb87d23043a2f3a4c5c WatchSource:0}: Error finding container cf78329aa32e055fc46de43fdfc0c9dc23b3d4b1ab33ecb87d23043a2f3a4c5c: Status 404 returned error can't find the container with id cf78329aa32e055fc46de43fdfc0c9dc23b3d4b1ab33ecb87d23043a2f3a4c5c Nov 22 02:55:19 crc kubenswrapper[4922]: I1122 02:55:19.090224 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24fb7ac2-dd84-402b-9c03-b0c29174fa6c-catalog-content\") pod \"redhat-marketplace-p5clf\" (UID: \"24fb7ac2-dd84-402b-9c03-b0c29174fa6c\") " pod="openshift-marketplace/redhat-marketplace-p5clf" Nov 22 02:55:19 crc kubenswrapper[4922]: I1122 02:55:19.090261 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24fb7ac2-dd84-402b-9c03-b0c29174fa6c-utilities\") pod \"redhat-marketplace-p5clf\" (UID: \"24fb7ac2-dd84-402b-9c03-b0c29174fa6c\") " pod="openshift-marketplace/redhat-marketplace-p5clf" Nov 22 02:55:19 crc kubenswrapper[4922]: I1122 02:55:19.090322 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwhg7\" (UniqueName: \"kubernetes.io/projected/24fb7ac2-dd84-402b-9c03-b0c29174fa6c-kube-api-access-qwhg7\") pod \"redhat-marketplace-p5clf\" (UID: \"24fb7ac2-dd84-402b-9c03-b0c29174fa6c\") " pod="openshift-marketplace/redhat-marketplace-p5clf" Nov 22 02:55:19 crc kubenswrapper[4922]: I1122 02:55:19.105827 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wxkjb" event={"ID":"a907e262-3b15-4019-a715-c45b0e12ac27","Type":"ContainerStarted","Data":"dbd06181ae1d6ff807c75d9bcb326e7e51d569ae33cd5849eccd3d7e5103e5b2"} Nov 22 02:55:19 crc kubenswrapper[4922]: I1122 02:55:19.105913 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wxkjb" event={"ID":"a907e262-3b15-4019-a715-c45b0e12ac27","Type":"ContainerStarted","Data":"12bf3cd9003493e3ba44e874bc23803a0e97ed346944e7675192d6bd937ed576"} Nov 22 02:55:19 crc kubenswrapper[4922]: I1122 02:55:19.110013 4922 generic.go:334] "Generic (PLEG): container finished" podID="9e5138dd-6039-4020-969e-6a30b33be2b5" containerID="8f26670e650dedc5e3f802a76f413703e8d1b4ff46674d2c812f8ce1c0ef9f2d" exitCode=0 Nov 22 02:55:19 crc kubenswrapper[4922]: I1122 02:55:19.110301 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vgckp" 
event={"ID":"9e5138dd-6039-4020-969e-6a30b33be2b5","Type":"ContainerDied","Data":"8f26670e650dedc5e3f802a76f413703e8d1b4ff46674d2c812f8ce1c0ef9f2d"} Nov 22 02:55:19 crc kubenswrapper[4922]: I1122 02:55:19.110358 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vgckp" event={"ID":"9e5138dd-6039-4020-969e-6a30b33be2b5","Type":"ContainerStarted","Data":"db59237923319cdd77327cbf28d6854f79046e97e7acae81bd0791cea2db0598"} Nov 22 02:55:19 crc kubenswrapper[4922]: I1122 02:55:19.112567 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" event={"ID":"14638a05-727a-441a-88f2-f9750aa17a39","Type":"ContainerStarted","Data":"cf78329aa32e055fc46de43fdfc0c9dc23b3d4b1ab33ecb87d23043a2f3a4c5c"} Nov 22 02:55:19 crc kubenswrapper[4922]: I1122 02:55:19.115592 4922 generic.go:334] "Generic (PLEG): container finished" podID="a5e7f9ae-8050-4602-b4f3-53a73ec0b60f" containerID="024beac7b23d601c940bf58c9f3abab9862b801d829aa9988305d392b2459708" exitCode=0 Nov 22 02:55:19 crc kubenswrapper[4922]: I1122 02:55:19.115666 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396325-vl7tv" event={"ID":"a5e7f9ae-8050-4602-b4f3-53a73ec0b60f","Type":"ContainerDied","Data":"024beac7b23d601c940bf58c9f3abab9862b801d829aa9988305d392b2459708"} Nov 22 02:55:19 crc kubenswrapper[4922]: I1122 02:55:19.126113 4922 generic.go:334] "Generic (PLEG): container finished" podID="81c367c0-8b10-4ce9-aa76-290449c7df39" containerID="66140b440ffe6cf4718bad228bf26dbc3dad92ec5b56c1a06b6349d953317f32" exitCode=0 Nov 22 02:55:19 crc kubenswrapper[4922]: I1122 02:55:19.126214 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t48nh" event={"ID":"81c367c0-8b10-4ce9-aa76-290449c7df39","Type":"ContainerDied","Data":"66140b440ffe6cf4718bad228bf26dbc3dad92ec5b56c1a06b6349d953317f32"} Nov 22 02:55:19 crc kubenswrapper[4922]: I1122 02:55:19.129883 4922 generic.go:334] "Generic (PLEG): container finished" podID="8e0bcf47-633a-44d1-82b8-90cdf74fa610" containerID="f198c8fef76695ca81f120890c1708414f11bdd9bb8fd80fa08358116078d8a6" exitCode=0 Nov 22 02:55:19 crc kubenswrapper[4922]: I1122 02:55:19.130194 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pwg2w" event={"ID":"8e0bcf47-633a-44d1-82b8-90cdf74fa610","Type":"ContainerDied","Data":"f198c8fef76695ca81f120890c1708414f11bdd9bb8fd80fa08358116078d8a6"} Nov 22 02:55:19 crc kubenswrapper[4922]: I1122 02:55:19.130272 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pwg2w" event={"ID":"8e0bcf47-633a-44d1-82b8-90cdf74fa610","Type":"ContainerStarted","Data":"1c9b8765a3fd2f0dce4d476ef2378e62fb0f9bba891c42739eb0e8b677f74015"} Nov 22 02:55:19 crc kubenswrapper[4922]: I1122 02:55:19.132188 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-wxkjb" podStartSLOduration=12.132166136 podStartE2EDuration="12.132166136s" podCreationTimestamp="2025-11-22 02:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:55:19.130079927 +0000 UTC m=+155.168601819" watchObservedRunningTime="2025-11-22 02:55:19.132166136 +0000 UTC m=+155.170688028" Nov 22 02:55:19 crc kubenswrapper[4922]: I1122 02:55:19.191980 4922 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qwhg7\" (UniqueName: \"kubernetes.io/projected/24fb7ac2-dd84-402b-9c03-b0c29174fa6c-kube-api-access-qwhg7\") pod \"redhat-marketplace-p5clf\" (UID: \"24fb7ac2-dd84-402b-9c03-b0c29174fa6c\") " pod="openshift-marketplace/redhat-marketplace-p5clf" Nov 22 02:55:19 crc kubenswrapper[4922]: I1122 02:55:19.192053 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24fb7ac2-dd84-402b-9c03-b0c29174fa6c-catalog-content\") pod \"redhat-marketplace-p5clf\" (UID: \"24fb7ac2-dd84-402b-9c03-b0c29174fa6c\") " pod="openshift-marketplace/redhat-marketplace-p5clf" Nov 22 02:55:19 crc kubenswrapper[4922]: I1122 02:55:19.192078 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24fb7ac2-dd84-402b-9c03-b0c29174fa6c-utilities\") pod \"redhat-marketplace-p5clf\" (UID: \"24fb7ac2-dd84-402b-9c03-b0c29174fa6c\") " pod="openshift-marketplace/redhat-marketplace-p5clf" Nov 22 02:55:19 crc kubenswrapper[4922]: I1122 02:55:19.192694 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24fb7ac2-dd84-402b-9c03-b0c29174fa6c-utilities\") pod \"redhat-marketplace-p5clf\" (UID: \"24fb7ac2-dd84-402b-9c03-b0c29174fa6c\") " pod="openshift-marketplace/redhat-marketplace-p5clf" Nov 22 02:55:19 crc kubenswrapper[4922]: I1122 02:55:19.192978 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24fb7ac2-dd84-402b-9c03-b0c29174fa6c-catalog-content\") pod \"redhat-marketplace-p5clf\" (UID: \"24fb7ac2-dd84-402b-9c03-b0c29174fa6c\") " pod="openshift-marketplace/redhat-marketplace-p5clf" Nov 22 02:55:19 crc kubenswrapper[4922]: I1122 02:55:19.225310 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwhg7\" (UniqueName: \"kubernetes.io/projected/24fb7ac2-dd84-402b-9c03-b0c29174fa6c-kube-api-access-qwhg7\") pod \"redhat-marketplace-p5clf\" (UID: \"24fb7ac2-dd84-402b-9c03-b0c29174fa6c\") " pod="openshift-marketplace/redhat-marketplace-p5clf" Nov 22 02:55:19 crc kubenswrapper[4922]: I1122 02:55:19.307198 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p5clf" Nov 22 02:55:19 crc kubenswrapper[4922]: I1122 02:55:19.314398 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Nov 22 02:55:19 crc kubenswrapper[4922]: I1122 02:55:19.359190 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tb5d2"] Nov 22 02:55:19 crc kubenswrapper[4922]: I1122 02:55:19.360199 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tb5d2" Nov 22 02:55:19 crc kubenswrapper[4922]: I1122 02:55:19.380572 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tb5d2"] Nov 22 02:55:19 crc kubenswrapper[4922]: I1122 02:55:19.496122 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93a35959-93aa-4e0e-a171-802b0161fe5c-catalog-content\") pod \"redhat-marketplace-tb5d2\" (UID: \"93a35959-93aa-4e0e-a171-802b0161fe5c\") " pod="openshift-marketplace/redhat-marketplace-tb5d2" Nov 22 02:55:19 crc kubenswrapper[4922]: I1122 02:55:19.496599 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93a35959-93aa-4e0e-a171-802b0161fe5c-utilities\") pod \"redhat-marketplace-tb5d2\" (UID: \"93a35959-93aa-4e0e-a171-802b0161fe5c\") " pod="openshift-marketplace/redhat-marketplace-tb5d2" Nov 22 02:55:19 crc kubenswrapper[4922]: I1122 02:55:19.496628 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldl7t\" (UniqueName: \"kubernetes.io/projected/93a35959-93aa-4e0e-a171-802b0161fe5c-kube-api-access-ldl7t\") pod \"redhat-marketplace-tb5d2\" (UID: \"93a35959-93aa-4e0e-a171-802b0161fe5c\") " pod="openshift-marketplace/redhat-marketplace-tb5d2" Nov 22 02:55:19 crc kubenswrapper[4922]: I1122 02:55:19.496448 4922 patch_prober.go:28] interesting pod/router-default-5444994796-jb5md container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 02:55:19 crc kubenswrapper[4922]: [-]has-synced failed: reason withheld Nov 22 02:55:19 crc kubenswrapper[4922]: [+]process-running ok Nov 22 02:55:19 crc kubenswrapper[4922]: healthz check failed Nov 22 02:55:19 crc kubenswrapper[4922]: I1122 02:55:19.496738 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jb5md" podUID="2c02715b-5f00-464a-85e3-4df3043304d6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 02:55:19 crc kubenswrapper[4922]: I1122 02:55:19.509263 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-mmrpx" Nov 22 02:55:19 crc kubenswrapper[4922]: I1122 02:55:19.513410 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-mmrpx" Nov 22 02:55:19 crc kubenswrapper[4922]: I1122 02:55:19.526392 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-gsfk7" Nov 22 02:55:19 crc kubenswrapper[4922]: I1122 02:55:19.600586 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93a35959-93aa-4e0e-a171-802b0161fe5c-utilities\") pod \"redhat-marketplace-tb5d2\" (UID: \"93a35959-93aa-4e0e-a171-802b0161fe5c\") " pod="openshift-marketplace/redhat-marketplace-tb5d2" Nov 22 02:55:19 crc kubenswrapper[4922]: I1122 02:55:19.600655 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldl7t\" (UniqueName: 
\"kubernetes.io/projected/93a35959-93aa-4e0e-a171-802b0161fe5c-kube-api-access-ldl7t\") pod \"redhat-marketplace-tb5d2\" (UID: \"93a35959-93aa-4e0e-a171-802b0161fe5c\") " pod="openshift-marketplace/redhat-marketplace-tb5d2" Nov 22 02:55:19 crc kubenswrapper[4922]: I1122 02:55:19.600776 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93a35959-93aa-4e0e-a171-802b0161fe5c-catalog-content\") pod \"redhat-marketplace-tb5d2\" (UID: \"93a35959-93aa-4e0e-a171-802b0161fe5c\") " pod="openshift-marketplace/redhat-marketplace-tb5d2" Nov 22 02:55:19 crc kubenswrapper[4922]: I1122 02:55:19.601448 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93a35959-93aa-4e0e-a171-802b0161fe5c-catalog-content\") pod \"redhat-marketplace-tb5d2\" (UID: \"93a35959-93aa-4e0e-a171-802b0161fe5c\") " pod="openshift-marketplace/redhat-marketplace-tb5d2" Nov 22 02:55:19 crc kubenswrapper[4922]: I1122 02:55:19.601514 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93a35959-93aa-4e0e-a171-802b0161fe5c-utilities\") pod \"redhat-marketplace-tb5d2\" (UID: \"93a35959-93aa-4e0e-a171-802b0161fe5c\") " pod="openshift-marketplace/redhat-marketplace-tb5d2" Nov 22 02:55:19 crc kubenswrapper[4922]: I1122 02:55:19.633638 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldl7t\" (UniqueName: \"kubernetes.io/projected/93a35959-93aa-4e0e-a171-802b0161fe5c-kube-api-access-ldl7t\") pod \"redhat-marketplace-tb5d2\" (UID: \"93a35959-93aa-4e0e-a171-802b0161fe5c\") " pod="openshift-marketplace/redhat-marketplace-tb5d2" Nov 22 02:55:19 crc kubenswrapper[4922]: I1122 02:55:19.638627 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p5clf"] Nov 22 02:55:19 crc kubenswrapper[4922]: I1122 02:55:19.685281 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tb5d2" Nov 22 02:55:19 crc kubenswrapper[4922]: I1122 02:55:19.983561 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tb5d2"] Nov 22 02:55:20 crc kubenswrapper[4922]: W1122 02:55:20.011311 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93a35959_93aa_4e0e_a171_802b0161fe5c.slice/crio-e5d50f964d5676eaf1f3a8143f5fd2b0d18efb2a97659d815ffb745372bf8cae WatchSource:0}: Error finding container e5d50f964d5676eaf1f3a8143f5fd2b0d18efb2a97659d815ffb745372bf8cae: Status 404 returned error can't find the container with id e5d50f964d5676eaf1f3a8143f5fd2b0d18efb2a97659d815ffb745372bf8cae Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.023373 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.024700 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.031509 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.031525 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.048149 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.064631 4922 patch_prober.go:28] interesting pod/downloads-7954f5f757-n88n6 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.064690 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-n88n6" podUID="0f5e24eb-19ec-4a6e-9b72-ded8f180b673" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.065637 4922 patch_prober.go:28] interesting pod/downloads-7954f5f757-n88n6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.065670 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-n88n6" podUID="0f5e24eb-19ec-4a6e-9b72-ded8f180b673" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.23:8080/\": dial tcp 10.217.0.23:8080: connect: connection refused" Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.109792 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/022b1992-84d3-45ee-ab66-32d93b5aefce-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"022b1992-84d3-45ee-ab66-32d93b5aefce\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.109916 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/022b1992-84d3-45ee-ab66-32d93b5aefce-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"022b1992-84d3-45ee-ab66-32d93b5aefce\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.116327 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-dj4jp" Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.116406 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-dj4jp" Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.120106 4922 patch_prober.go:28] interesting pod/console-f9d7485db-dj4jp container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.16:8443/health\": dial tcp 10.217.0.16:8443: connect: connection 
refused" start-of-body= Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.120154 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-dj4jp" podUID="136fdcc5-9b23-442a-85e0-96129d4aed8a" containerName="console" probeResult="failure" output="Get \"https://10.217.0.16:8443/health\": dial tcp 10.217.0.16:8443: connect: connection refused" Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.140439 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tb5d2" event={"ID":"93a35959-93aa-4e0e-a171-802b0161fe5c","Type":"ContainerStarted","Data":"e5d50f964d5676eaf1f3a8143f5fd2b0d18efb2a97659d815ffb745372bf8cae"} Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.164128 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-t5m25"] Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.171288 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t5m25" Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.187604 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t5m25"] Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.192406 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.201468 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" event={"ID":"14638a05-727a-441a-88f2-f9750aa17a39","Type":"ContainerStarted","Data":"b50b0239e7aa515074294a49f98c23e1c0701b80c27f05f9002add1d1733683f"} Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.201661 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.206807 4922 generic.go:334] "Generic (PLEG): container finished" podID="24fb7ac2-dd84-402b-9c03-b0c29174fa6c" containerID="c90d10e537cc2cd12c5df2446276db764c0f47653d5755dbc8cd030e22d29177" exitCode=0 Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.206908 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p5clf" event={"ID":"24fb7ac2-dd84-402b-9c03-b0c29174fa6c","Type":"ContainerDied","Data":"c90d10e537cc2cd12c5df2446276db764c0f47653d5755dbc8cd030e22d29177"} Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.206961 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p5clf" event={"ID":"24fb7ac2-dd84-402b-9c03-b0c29174fa6c","Type":"ContainerStarted","Data":"cc382ecf9a92b22cca0a3f2688aea2bfc0395de880bdae43925ba3962a9c93dc"} Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.210858 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/022b1992-84d3-45ee-ab66-32d93b5aefce-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"022b1992-84d3-45ee-ab66-32d93b5aefce\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.210925 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/022b1992-84d3-45ee-ab66-32d93b5aefce-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: 
\"022b1992-84d3-45ee-ab66-32d93b5aefce\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.211258 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/022b1992-84d3-45ee-ab66-32d93b5aefce-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"022b1992-84d3-45ee-ab66-32d93b5aefce\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.272864 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/022b1992-84d3-45ee-ab66-32d93b5aefce-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"022b1992-84d3-45ee-ab66-32d93b5aefce\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.283198 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" podStartSLOduration=132.283179258 podStartE2EDuration="2m12.283179258s" podCreationTimestamp="2025-11-22 02:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:55:20.282980752 +0000 UTC m=+156.321502644" watchObservedRunningTime="2025-11-22 02:55:20.283179258 +0000 UTC m=+156.321701140" Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.312188 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffc1fad9-16d3-4a1d-83b7-0ffa9796d506-catalog-content\") pod \"redhat-operators-t5m25\" (UID: \"ffc1fad9-16d3-4a1d-83b7-0ffa9796d506\") " pod="openshift-marketplace/redhat-operators-t5m25" Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.312281 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whrr7\" (UniqueName: \"kubernetes.io/projected/ffc1fad9-16d3-4a1d-83b7-0ffa9796d506-kube-api-access-whrr7\") pod \"redhat-operators-t5m25\" (UID: \"ffc1fad9-16d3-4a1d-83b7-0ffa9796d506\") " pod="openshift-marketplace/redhat-operators-t5m25" Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.312367 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffc1fad9-16d3-4a1d-83b7-0ffa9796d506-utilities\") pod \"redhat-operators-t5m25\" (UID: \"ffc1fad9-16d3-4a1d-83b7-0ffa9796d506\") " pod="openshift-marketplace/redhat-operators-t5m25" Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.364827 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.369191 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bfqj8" Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.376622 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bfqj8" Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.414091 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffc1fad9-16d3-4a1d-83b7-0ffa9796d506-catalog-content\") pod \"redhat-operators-t5m25\" (UID: \"ffc1fad9-16d3-4a1d-83b7-0ffa9796d506\") " pod="openshift-marketplace/redhat-operators-t5m25" Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.414155 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whrr7\" (UniqueName: \"kubernetes.io/projected/ffc1fad9-16d3-4a1d-83b7-0ffa9796d506-kube-api-access-whrr7\") pod \"redhat-operators-t5m25\" (UID: \"ffc1fad9-16d3-4a1d-83b7-0ffa9796d506\") " pod="openshift-marketplace/redhat-operators-t5m25" Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.414195 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffc1fad9-16d3-4a1d-83b7-0ffa9796d506-utilities\") pod \"redhat-operators-t5m25\" (UID: \"ffc1fad9-16d3-4a1d-83b7-0ffa9796d506\") " pod="openshift-marketplace/redhat-operators-t5m25" Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.415503 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffc1fad9-16d3-4a1d-83b7-0ffa9796d506-catalog-content\") pod \"redhat-operators-t5m25\" (UID: \"ffc1fad9-16d3-4a1d-83b7-0ffa9796d506\") " pod="openshift-marketplace/redhat-operators-t5m25" Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.416267 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffc1fad9-16d3-4a1d-83b7-0ffa9796d506-utilities\") pod \"redhat-operators-t5m25\" (UID: \"ffc1fad9-16d3-4a1d-83b7-0ffa9796d506\") " pod="openshift-marketplace/redhat-operators-t5m25" Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.462336 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whrr7\" (UniqueName: \"kubernetes.io/projected/ffc1fad9-16d3-4a1d-83b7-0ffa9796d506-kube-api-access-whrr7\") pod \"redhat-operators-t5m25\" (UID: \"ffc1fad9-16d3-4a1d-83b7-0ffa9796d506\") " pod="openshift-marketplace/redhat-operators-t5m25" Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.490733 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-jb5md" Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.497643 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-t5m25" Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.506665 4922 patch_prober.go:28] interesting pod/router-default-5444994796-jb5md container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 02:55:20 crc kubenswrapper[4922]: [-]has-synced failed: reason withheld Nov 22 02:55:20 crc kubenswrapper[4922]: [+]process-running ok Nov 22 02:55:20 crc kubenswrapper[4922]: healthz check failed Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.506729 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jb5md" podUID="2c02715b-5f00-464a-85e3-4df3043304d6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.572384 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xkqw4"] Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.573780 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xkqw4" Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.591225 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xkqw4"] Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.639203 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396325-vl7tv" Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.721034 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mvx2\" (UniqueName: \"kubernetes.io/projected/a5e7f9ae-8050-4602-b4f3-53a73ec0b60f-kube-api-access-2mvx2\") pod \"a5e7f9ae-8050-4602-b4f3-53a73ec0b60f\" (UID: \"a5e7f9ae-8050-4602-b4f3-53a73ec0b60f\") " Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.721177 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a5e7f9ae-8050-4602-b4f3-53a73ec0b60f-config-volume\") pod \"a5e7f9ae-8050-4602-b4f3-53a73ec0b60f\" (UID: \"a5e7f9ae-8050-4602-b4f3-53a73ec0b60f\") " Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.721263 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a5e7f9ae-8050-4602-b4f3-53a73ec0b60f-secret-volume\") pod \"a5e7f9ae-8050-4602-b4f3-53a73ec0b60f\" (UID: \"a5e7f9ae-8050-4602-b4f3-53a73ec0b60f\") " Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.721448 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/737abb10-2a68-40ba-a6c0-201ae0619ede-catalog-content\") pod \"redhat-operators-xkqw4\" (UID: \"737abb10-2a68-40ba-a6c0-201ae0619ede\") " pod="openshift-marketplace/redhat-operators-xkqw4" Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.721493 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rhr4\" (UniqueName: \"kubernetes.io/projected/737abb10-2a68-40ba-a6c0-201ae0619ede-kube-api-access-5rhr4\") pod \"redhat-operators-xkqw4\" (UID: \"737abb10-2a68-40ba-a6c0-201ae0619ede\") " pod="openshift-marketplace/redhat-operators-xkqw4" 
Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.721613 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/737abb10-2a68-40ba-a6c0-201ae0619ede-utilities\") pod \"redhat-operators-xkqw4\" (UID: \"737abb10-2a68-40ba-a6c0-201ae0619ede\") " pod="openshift-marketplace/redhat-operators-xkqw4"
Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.723044 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5e7f9ae-8050-4602-b4f3-53a73ec0b60f-config-volume" (OuterVolumeSpecName: "config-volume") pod "a5e7f9ae-8050-4602-b4f3-53a73ec0b60f" (UID: "a5e7f9ae-8050-4602-b4f3-53a73ec0b60f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.728512 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5e7f9ae-8050-4602-b4f3-53a73ec0b60f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a5e7f9ae-8050-4602-b4f3-53a73ec0b60f" (UID: "a5e7f9ae-8050-4602-b4f3-53a73ec0b60f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.729238 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5e7f9ae-8050-4602-b4f3-53a73ec0b60f-kube-api-access-2mvx2" (OuterVolumeSpecName: "kube-api-access-2mvx2") pod "a5e7f9ae-8050-4602-b4f3-53a73ec0b60f" (UID: "a5e7f9ae-8050-4602-b4f3-53a73ec0b60f"). InnerVolumeSpecName "kube-api-access-2mvx2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.827641 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/737abb10-2a68-40ba-a6c0-201ae0619ede-catalog-content\") pod \"redhat-operators-xkqw4\" (UID: \"737abb10-2a68-40ba-a6c0-201ae0619ede\") " pod="openshift-marketplace/redhat-operators-xkqw4"
Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.827708 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rhr4\" (UniqueName: \"kubernetes.io/projected/737abb10-2a68-40ba-a6c0-201ae0619ede-kube-api-access-5rhr4\") pod \"redhat-operators-xkqw4\" (UID: \"737abb10-2a68-40ba-a6c0-201ae0619ede\") " pod="openshift-marketplace/redhat-operators-xkqw4"
Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.827802 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/737abb10-2a68-40ba-a6c0-201ae0619ede-utilities\") pod \"redhat-operators-xkqw4\" (UID: \"737abb10-2a68-40ba-a6c0-201ae0619ede\") " pod="openshift-marketplace/redhat-operators-xkqw4"
Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.827858 4922 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a5e7f9ae-8050-4602-b4f3-53a73ec0b60f-secret-volume\") on node \"crc\" DevicePath \"\""
Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.827871 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mvx2\" (UniqueName: \"kubernetes.io/projected/a5e7f9ae-8050-4602-b4f3-53a73ec0b60f-kube-api-access-2mvx2\") on node \"crc\" DevicePath \"\""
Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.827881 4922 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a5e7f9ae-8050-4602-b4f3-53a73ec0b60f-config-volume\") on node \"crc\" DevicePath \"\""
Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.828621 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/737abb10-2a68-40ba-a6c0-201ae0619ede-catalog-content\") pod \"redhat-operators-xkqw4\" (UID: \"737abb10-2a68-40ba-a6c0-201ae0619ede\") " pod="openshift-marketplace/redhat-operators-xkqw4"
Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.832137 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/737abb10-2a68-40ba-a6c0-201ae0619ede-utilities\") pod \"redhat-operators-xkqw4\" (UID: \"737abb10-2a68-40ba-a6c0-201ae0619ede\") " pod="openshift-marketplace/redhat-operators-xkqw4"
Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.852552 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rhr4\" (UniqueName: \"kubernetes.io/projected/737abb10-2a68-40ba-a6c0-201ae0619ede-kube-api-access-5rhr4\") pod \"redhat-operators-xkqw4\" (UID: \"737abb10-2a68-40ba-a6c0-201ae0619ede\") " pod="openshift-marketplace/redhat-operators-xkqw4"
Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.871723 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Nov 22 02:55:20 crc kubenswrapper[4922]: I1122 02:55:20.932215 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xkqw4"
Nov 22 02:55:21 crc kubenswrapper[4922]: I1122 02:55:21.141595 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t5m25"]
Nov 22 02:55:21 crc kubenswrapper[4922]: I1122 02:55:21.261372 4922 generic.go:334] "Generic (PLEG): container finished" podID="93a35959-93aa-4e0e-a171-802b0161fe5c" containerID="c17217b2122f9381e8f6e20025c9187d99c0a4a10829e3b4ad2e6f9b26782082" exitCode=0
Nov 22 02:55:21 crc kubenswrapper[4922]: I1122 02:55:21.261472 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tb5d2" event={"ID":"93a35959-93aa-4e0e-a171-802b0161fe5c","Type":"ContainerDied","Data":"c17217b2122f9381e8f6e20025c9187d99c0a4a10829e3b4ad2e6f9b26782082"}
Nov 22 02:55:21 crc kubenswrapper[4922]: I1122 02:55:21.274132 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"022b1992-84d3-45ee-ab66-32d93b5aefce","Type":"ContainerStarted","Data":"272d96644e6d822aebc51e727383d2820034904809af87413499b03ba58d2aa1"}
Nov 22 02:55:21 crc kubenswrapper[4922]: I1122 02:55:21.284046 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396325-vl7tv" event={"ID":"a5e7f9ae-8050-4602-b4f3-53a73ec0b60f","Type":"ContainerDied","Data":"33c221eea24abf9f085ef3e56172e00c53df38ba13a9c96d784d4f9f49cfaa7c"}
Nov 22 02:55:21 crc kubenswrapper[4922]: I1122 02:55:21.284132 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33c221eea24abf9f085ef3e56172e00c53df38ba13a9c96d784d4f9f49cfaa7c"
Nov 22 02:55:21 crc kubenswrapper[4922]: I1122 02:55:21.284249 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396325-vl7tv"
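The reconciler_common.go lines trace the volume manager's fixed lifecycle: VerifyControllerAttachedVolume, then MountVolume started, then MountVolume.SetUp succeeded on the way up; UnmountVolume started, UnmountVolume.TearDown succeeded, and finally "Volume detached" on the way down. The toy Go sketch below illustrates the desired-versus-actual reconciliation that drives this ordering; the state struct and reconcile method are invented for illustration, and the real volume manager tracks far more state than this.

package main

import "fmt"

// Hypothetical sketch: the kubelet's volume manager compares a desired
// state of world (volumes the admitted pod specs ask for) with an actual
// state of world (volumes currently mounted) and converges step by step.
type state struct {
	desired map[string]bool // volume name -> wanted by some pod
	actual  map[string]bool // volume name -> currently mounted
}

func (s *state) reconcile() {
	// Tear down anything mounted but no longer desired (pod deleted).
	for v := range s.actual {
		if !s.desired[v] {
			fmt.Printf("UnmountVolume started for volume %q\n", v)
			fmt.Printf("UnmountVolume.TearDown succeeded for volume %q\n", v)
			delete(s.actual, v)
			fmt.Printf("Volume detached for volume %q\n", v)
		}
	}
	// Set up anything desired but not yet mounted (new pod admitted).
	for v := range s.desired {
		if !s.actual[v] {
			fmt.Printf("MountVolume started for volume %q\n", v)
			s.actual[v] = true
			fmt.Printf("MountVolume.SetUp succeeded for volume %q\n", v)
		}
	}
}

func main() {
	// Mirrors the log: collect-profiles volumes being torn down while the
	// redhat-operators catalog volumes are being set up.
	s := &state{
		desired: map[string]bool{"catalog-content": true, "utilities": true},
		actual:  map[string]bool{"config-volume": true, "secret-volume": true},
	}
	s.reconcile()
}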
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396325-vl7tv" Nov 22 02:55:21 crc kubenswrapper[4922]: I1122 02:55:21.353885 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t5m25" event={"ID":"ffc1fad9-16d3-4a1d-83b7-0ffa9796d506","Type":"ContainerStarted","Data":"627d4df432e58721e6155267a27eb70229dcbceddb7ff494826c8c25e6e37e5e"} Nov 22 02:55:21 crc kubenswrapper[4922]: I1122 02:55:21.380803 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 22 02:55:21 crc kubenswrapper[4922]: E1122 02:55:21.381777 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5e7f9ae-8050-4602-b4f3-53a73ec0b60f" containerName="collect-profiles" Nov 22 02:55:21 crc kubenswrapper[4922]: I1122 02:55:21.381796 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5e7f9ae-8050-4602-b4f3-53a73ec0b60f" containerName="collect-profiles" Nov 22 02:55:21 crc kubenswrapper[4922]: I1122 02:55:21.381973 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5e7f9ae-8050-4602-b4f3-53a73ec0b60f" containerName="collect-profiles" Nov 22 02:55:21 crc kubenswrapper[4922]: I1122 02:55:21.382433 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 22 02:55:21 crc kubenswrapper[4922]: I1122 02:55:21.389203 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Nov 22 02:55:21 crc kubenswrapper[4922]: I1122 02:55:21.389431 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Nov 22 02:55:21 crc kubenswrapper[4922]: I1122 02:55:21.394300 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 22 02:55:21 crc kubenswrapper[4922]: I1122 02:55:21.502175 4922 patch_prober.go:28] interesting pod/router-default-5444994796-jb5md container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 02:55:21 crc kubenswrapper[4922]: [-]has-synced failed: reason withheld Nov 22 02:55:21 crc kubenswrapper[4922]: [+]process-running ok Nov 22 02:55:21 crc kubenswrapper[4922]: healthz check failed Nov 22 02:55:21 crc kubenswrapper[4922]: I1122 02:55:21.502262 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jb5md" podUID="2c02715b-5f00-464a-85e3-4df3043304d6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 02:55:21 crc kubenswrapper[4922]: I1122 02:55:21.548805 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/200c2995-b401-489c-a63c-016f6e186dd6-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"200c2995-b401-489c-a63c-016f6e186dd6\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 22 02:55:21 crc kubenswrapper[4922]: I1122 02:55:21.548907 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/200c2995-b401-489c-a63c-016f6e186dd6-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"200c2995-b401-489c-a63c-016f6e186dd6\") " 
pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 22 02:55:21 crc kubenswrapper[4922]: I1122 02:55:21.604153 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xkqw4"] Nov 22 02:55:21 crc kubenswrapper[4922]: I1122 02:55:21.650184 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/200c2995-b401-489c-a63c-016f6e186dd6-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"200c2995-b401-489c-a63c-016f6e186dd6\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 22 02:55:21 crc kubenswrapper[4922]: I1122 02:55:21.650294 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/200c2995-b401-489c-a63c-016f6e186dd6-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"200c2995-b401-489c-a63c-016f6e186dd6\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 22 02:55:21 crc kubenswrapper[4922]: I1122 02:55:21.650383 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/200c2995-b401-489c-a63c-016f6e186dd6-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"200c2995-b401-489c-a63c-016f6e186dd6\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 22 02:55:21 crc kubenswrapper[4922]: I1122 02:55:21.702891 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/200c2995-b401-489c-a63c-016f6e186dd6-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"200c2995-b401-489c-a63c-016f6e186dd6\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 22 02:55:21 crc kubenswrapper[4922]: I1122 02:55:21.807032 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 22 02:55:22 crc kubenswrapper[4922]: I1122 02:55:22.346795 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"022b1992-84d3-45ee-ab66-32d93b5aefce","Type":"ContainerStarted","Data":"f76cee3b2154cb6d3528aa01f8015a9f062335578f6f7d484f0a528a9677a2e0"} Nov 22 02:55:22 crc kubenswrapper[4922]: I1122 02:55:22.365630 4922 generic.go:334] "Generic (PLEG): container finished" podID="737abb10-2a68-40ba-a6c0-201ae0619ede" containerID="f82e4117235df77da823aecaba5d19a40cedbfdec0178a24ea3e5d2492dbe2d4" exitCode=0 Nov 22 02:55:22 crc kubenswrapper[4922]: I1122 02:55:22.366353 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xkqw4" event={"ID":"737abb10-2a68-40ba-a6c0-201ae0619ede","Type":"ContainerDied","Data":"f82e4117235df77da823aecaba5d19a40cedbfdec0178a24ea3e5d2492dbe2d4"} Nov 22 02:55:22 crc kubenswrapper[4922]: I1122 02:55:22.366381 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xkqw4" event={"ID":"737abb10-2a68-40ba-a6c0-201ae0619ede","Type":"ContainerStarted","Data":"e2f8726b88f7cf411f417c2c65e8f3e46038fc089586e63ace5feaa3b0872259"} Nov 22 02:55:22 crc kubenswrapper[4922]: I1122 02:55:22.368227 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.368213222 podStartE2EDuration="2.368213222s" podCreationTimestamp="2025-11-22 02:55:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:55:22.36683395 +0000 UTC m=+158.405355842" watchObservedRunningTime="2025-11-22 02:55:22.368213222 +0000 UTC m=+158.406735114" Nov 22 02:55:22 crc kubenswrapper[4922]: I1122 02:55:22.371891 4922 generic.go:334] "Generic (PLEG): container finished" podID="ffc1fad9-16d3-4a1d-83b7-0ffa9796d506" containerID="740d53a0c2f2741103c5571d667f9a427af264181bcfe988cc46ad4e1531cb54" exitCode=0 Nov 22 02:55:22 crc kubenswrapper[4922]: I1122 02:55:22.371934 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t5m25" event={"ID":"ffc1fad9-16d3-4a1d-83b7-0ffa9796d506","Type":"ContainerDied","Data":"740d53a0c2f2741103c5571d667f9a427af264181bcfe988cc46ad4e1531cb54"} Nov 22 02:55:22 crc kubenswrapper[4922]: I1122 02:55:22.499111 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-jb5md" Nov 22 02:55:22 crc kubenswrapper[4922]: I1122 02:55:22.505294 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-jb5md" Nov 22 02:55:22 crc kubenswrapper[4922]: I1122 02:55:22.542660 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 22 02:55:23 crc kubenswrapper[4922]: I1122 02:55:23.398490 4922 generic.go:334] "Generic (PLEG): container finished" podID="022b1992-84d3-45ee-ab66-32d93b5aefce" containerID="f76cee3b2154cb6d3528aa01f8015a9f062335578f6f7d484f0a528a9677a2e0" exitCode=0 Nov 22 02:55:23 crc kubenswrapper[4922]: I1122 02:55:23.398721 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"022b1992-84d3-45ee-ab66-32d93b5aefce","Type":"ContainerDied","Data":"f76cee3b2154cb6d3528aa01f8015a9f062335578f6f7d484f0a528a9677a2e0"} Nov 22 02:55:23 crc kubenswrapper[4922]: I1122 02:55:23.403342 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"200c2995-b401-489c-a63c-016f6e186dd6","Type":"ContainerStarted","Data":"3deffdcc015a6d45bccf81c62eadc73501592f3b3a40eff7ee998bfc2765e4bc"} Nov 22 02:55:24 crc kubenswrapper[4922]: I1122 02:55:24.418189 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"200c2995-b401-489c-a63c-016f6e186dd6","Type":"ContainerStarted","Data":"fa48e46ce0de2df6101c21ed6b37588051da63a3cf7b352c2d06ba0c4097b2d5"} Nov 22 02:55:24 crc kubenswrapper[4922]: I1122 02:55:24.433240 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.433211894 podStartE2EDuration="3.433211894s" podCreationTimestamp="2025-11-22 02:55:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:55:24.43307125 +0000 UTC m=+160.471593152" watchObservedRunningTime="2025-11-22 02:55:24.433211894 +0000 UTC m=+160.471733786" Nov 22 02:55:24 crc kubenswrapper[4922]: I1122 02:55:24.838025 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 22 02:55:24 crc kubenswrapper[4922]: I1122 02:55:24.915917 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/022b1992-84d3-45ee-ab66-32d93b5aefce-kubelet-dir\") pod \"022b1992-84d3-45ee-ab66-32d93b5aefce\" (UID: \"022b1992-84d3-45ee-ab66-32d93b5aefce\") " Nov 22 02:55:24 crc kubenswrapper[4922]: I1122 02:55:24.916087 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/022b1992-84d3-45ee-ab66-32d93b5aefce-kube-api-access\") pod \"022b1992-84d3-45ee-ab66-32d93b5aefce\" (UID: \"022b1992-84d3-45ee-ab66-32d93b5aefce\") " Nov 22 02:55:24 crc kubenswrapper[4922]: I1122 02:55:24.919014 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/022b1992-84d3-45ee-ab66-32d93b5aefce-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "022b1992-84d3-45ee-ab66-32d93b5aefce" (UID: "022b1992-84d3-45ee-ab66-32d93b5aefce"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 02:55:24 crc kubenswrapper[4922]: I1122 02:55:24.946326 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/022b1992-84d3-45ee-ab66-32d93b5aefce-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "022b1992-84d3-45ee-ab66-32d93b5aefce" (UID: "022b1992-84d3-45ee-ab66-32d93b5aefce"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:55:25 crc kubenswrapper[4922]: I1122 02:55:25.018889 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/022b1992-84d3-45ee-ab66-32d93b5aefce-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 22 02:55:25 crc kubenswrapper[4922]: I1122 02:55:25.019047 4922 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/022b1992-84d3-45ee-ab66-32d93b5aefce-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 22 02:55:25 crc kubenswrapper[4922]: I1122 02:55:25.473131 4922 generic.go:334] "Generic (PLEG): container finished" podID="200c2995-b401-489c-a63c-016f6e186dd6" containerID="fa48e46ce0de2df6101c21ed6b37588051da63a3cf7b352c2d06ba0c4097b2d5" exitCode=0 Nov 22 02:55:25 crc kubenswrapper[4922]: I1122 02:55:25.473265 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"200c2995-b401-489c-a63c-016f6e186dd6","Type":"ContainerDied","Data":"fa48e46ce0de2df6101c21ed6b37588051da63a3cf7b352c2d06ba0c4097b2d5"} Nov 22 02:55:25 crc kubenswrapper[4922]: I1122 02:55:25.483650 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"022b1992-84d3-45ee-ab66-32d93b5aefce","Type":"ContainerDied","Data":"272d96644e6d822aebc51e727383d2820034904809af87413499b03ba58d2aa1"} Nov 22 02:55:25 crc kubenswrapper[4922]: I1122 02:55:25.483709 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="272d96644e6d822aebc51e727383d2820034904809af87413499b03ba58d2aa1" Nov 22 02:55:25 crc kubenswrapper[4922]: I1122 02:55:25.483786 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 22 02:55:25 crc kubenswrapper[4922]: I1122 02:55:25.631809 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-7bmdv" Nov 22 02:55:30 crc kubenswrapper[4922]: I1122 02:55:30.071660 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-n88n6" Nov 22 02:55:30 crc kubenswrapper[4922]: I1122 02:55:30.120109 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-dj4jp" Nov 22 02:55:30 crc kubenswrapper[4922]: I1122 02:55:30.123990 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-dj4jp" Nov 22 02:55:31 crc kubenswrapper[4922]: I1122 02:55:31.039142 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d5c8000a-a783-474f-a73a-55814c257a02-metrics-certs\") pod \"network-metrics-daemon-2gmkj\" (UID: \"d5c8000a-a783-474f-a73a-55814c257a02\") " pod="openshift-multus/network-metrics-daemon-2gmkj" Nov 22 02:55:31 crc kubenswrapper[4922]: I1122 02:55:31.059913 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d5c8000a-a783-474f-a73a-55814c257a02-metrics-certs\") pod \"network-metrics-daemon-2gmkj\" (UID: \"d5c8000a-a783-474f-a73a-55814c257a02\") " pod="openshift-multus/network-metrics-daemon-2gmkj" Nov 22 02:55:31 crc kubenswrapper[4922]: I1122 02:55:31.327133 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2gmkj" Nov 22 02:55:33 crc kubenswrapper[4922]: I1122 02:55:33.500483 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 22 02:55:33 crc kubenswrapper[4922]: I1122 02:55:33.566181 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"200c2995-b401-489c-a63c-016f6e186dd6","Type":"ContainerDied","Data":"3deffdcc015a6d45bccf81c62eadc73501592f3b3a40eff7ee998bfc2765e4bc"} Nov 22 02:55:33 crc kubenswrapper[4922]: I1122 02:55:33.566740 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3deffdcc015a6d45bccf81c62eadc73501592f3b3a40eff7ee998bfc2765e4bc" Nov 22 02:55:33 crc kubenswrapper[4922]: I1122 02:55:33.567008 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 22 02:55:33 crc kubenswrapper[4922]: I1122 02:55:33.679162 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/200c2995-b401-489c-a63c-016f6e186dd6-kube-api-access\") pod \"200c2995-b401-489c-a63c-016f6e186dd6\" (UID: \"200c2995-b401-489c-a63c-016f6e186dd6\") " Nov 22 02:55:33 crc kubenswrapper[4922]: I1122 02:55:33.679268 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/200c2995-b401-489c-a63c-016f6e186dd6-kubelet-dir\") pod \"200c2995-b401-489c-a63c-016f6e186dd6\" (UID: \"200c2995-b401-489c-a63c-016f6e186dd6\") " Nov 22 02:55:33 crc kubenswrapper[4922]: I1122 02:55:33.679459 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/200c2995-b401-489c-a63c-016f6e186dd6-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "200c2995-b401-489c-a63c-016f6e186dd6" (UID: "200c2995-b401-489c-a63c-016f6e186dd6"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 02:55:33 crc kubenswrapper[4922]: I1122 02:55:33.679767 4922 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/200c2995-b401-489c-a63c-016f6e186dd6-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 22 02:55:33 crc kubenswrapper[4922]: I1122 02:55:33.686680 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/200c2995-b401-489c-a63c-016f6e186dd6-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "200c2995-b401-489c-a63c-016f6e186dd6" (UID: "200c2995-b401-489c-a63c-016f6e186dd6"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:55:33 crc kubenswrapper[4922]: I1122 02:55:33.782017 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/200c2995-b401-489c-a63c-016f6e186dd6-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 22 02:55:35 crc kubenswrapper[4922]: I1122 02:55:35.897531 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2gmkj"] Nov 22 02:55:38 crc kubenswrapper[4922]: I1122 02:55:38.816333 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 02:55:41 crc kubenswrapper[4922]: I1122 02:55:41.109395 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 02:55:41 crc kubenswrapper[4922]: I1122 02:55:41.109976 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 02:55:41 crc kubenswrapper[4922]: W1122 02:55:41.958476 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5c8000a_a783_474f_a73a_55814c257a02.slice/crio-1ef2a89cee5588871280850e000fcdf45ac22b22de6df95e7946d89d88ac61c7 WatchSource:0}: Error finding container 1ef2a89cee5588871280850e000fcdf45ac22b22de6df95e7946d89d88ac61c7: Status 404 returned error can't find the container with id 1ef2a89cee5588871280850e000fcdf45ac22b22de6df95e7946d89d88ac61c7 Nov 22 02:55:42 crc kubenswrapper[4922]: I1122 02:55:42.618960 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2gmkj" event={"ID":"d5c8000a-a783-474f-a73a-55814c257a02","Type":"ContainerStarted","Data":"1ef2a89cee5588871280850e000fcdf45ac22b22de6df95e7946d89d88ac61c7"} Nov 22 02:55:50 crc kubenswrapper[4922]: I1122 02:55:50.778684 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4vw74" Nov 22 02:55:53 crc kubenswrapper[4922]: E1122 02:55:53.074688 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Nov 22 02:55:53 crc kubenswrapper[4922]: E1122 02:55:53.074954 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dmwqz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-h8sq5_openshift-marketplace(6d590121-2d31-483c-9c86-14b40c9d23ad): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 22 02:55:53 crc kubenswrapper[4922]: E1122 02:55:53.076390 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-h8sq5" podUID="6d590121-2d31-483c-9c86-14b40c9d23ad" Nov 22 02:55:53 crc kubenswrapper[4922]: E1122 02:55:53.088725 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Nov 22 02:55:53 crc kubenswrapper[4922]: E1122 02:55:53.088990 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kbrvk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-pwg2w_openshift-marketplace(8e0bcf47-633a-44d1-82b8-90cdf74fa610): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 22 02:55:53 crc kubenswrapper[4922]: E1122 02:55:53.090214 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-pwg2w" podUID="8e0bcf47-633a-44d1-82b8-90cdf74fa610" Nov 22 02:55:53 crc kubenswrapper[4922]: E1122 02:55:53.814704 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Nov 22 02:55:53 crc kubenswrapper[4922]: E1122 02:55:53.814931 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ldl7t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-tb5d2_openshift-marketplace(93a35959-93aa-4e0e-a171-802b0161fe5c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 22 02:55:53 crc kubenswrapper[4922]: E1122 02:55:53.816444 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-tb5d2" podUID="93a35959-93aa-4e0e-a171-802b0161fe5c" Nov 22 02:55:53 crc kubenswrapper[4922]: E1122 02:55:53.835261 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-pwg2w" podUID="8e0bcf47-633a-44d1-82b8-90cdf74fa610" Nov 22 02:55:53 crc kubenswrapper[4922]: E1122 02:55:53.835327 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-h8sq5" podUID="6d590121-2d31-483c-9c86-14b40c9d23ad" Nov 22 02:55:54 crc kubenswrapper[4922]: E1122 02:55:54.462812 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Nov 22 02:55:54 crc kubenswrapper[4922]: E1122 02:55:54.463087 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-26v49,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-t48nh_openshift-marketplace(81c367c0-8b10-4ce9-aa76-290449c7df39): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 22 02:55:54 crc kubenswrapper[4922]: E1122 02:55:54.464349 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-t48nh" podUID="81c367c0-8b10-4ce9-aa76-290449c7df39" Nov 22 02:55:55 crc kubenswrapper[4922]: I1122 02:55:55.444251 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 02:55:57 crc kubenswrapper[4922]: E1122 02:55:57.579132 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-tb5d2" podUID="93a35959-93aa-4e0e-a171-802b0161fe5c" Nov 22 02:55:57 crc kubenswrapper[4922]: E1122 02:55:57.579130 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-t48nh" podUID="81c367c0-8b10-4ce9-aa76-290449c7df39" Nov 22 02:55:59 crc kubenswrapper[4922]: E1122 02:55:59.123259 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Nov 22 02:55:59 crc kubenswrapper[4922]: E1122 02:55:59.123452 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs 
--catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5rhr4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-xkqw4_openshift-marketplace(737abb10-2a68-40ba-a6c0-201ae0619ede): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 22 02:55:59 crc kubenswrapper[4922]: E1122 02:55:59.125017 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-xkqw4" podUID="737abb10-2a68-40ba-a6c0-201ae0619ede" Nov 22 02:56:03 crc kubenswrapper[4922]: E1122 02:56:03.665901 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-xkqw4" podUID="737abb10-2a68-40ba-a6c0-201ae0619ede" Nov 22 02:56:03 crc kubenswrapper[4922]: E1122 02:56:03.782164 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Nov 22 02:56:03 crc kubenswrapper[4922]: E1122 02:56:03.782823 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n7knj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-vgckp_openshift-marketplace(9e5138dd-6039-4020-969e-6a30b33be2b5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 22 02:56:03 crc kubenswrapper[4922]: E1122 02:56:03.784066 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-vgckp" podUID="9e5138dd-6039-4020-969e-6a30b33be2b5" Nov 22 02:56:03 crc kubenswrapper[4922]: E1122 02:56:03.797325 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Nov 22 02:56:03 crc kubenswrapper[4922]: E1122 02:56:03.797567 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-whrr7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-t5m25_openshift-marketplace(ffc1fad9-16d3-4a1d-83b7-0ffa9796d506): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 22 02:56:03 crc kubenswrapper[4922]: E1122 02:56:03.799124 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-t5m25" podUID="ffc1fad9-16d3-4a1d-83b7-0ffa9796d506" Nov 22 02:56:04 crc kubenswrapper[4922]: I1122 02:56:04.760474 4922 generic.go:334] "Generic (PLEG): container finished" podID="24fb7ac2-dd84-402b-9c03-b0c29174fa6c" containerID="94350a78648e6411b09238a3c99d2420901cde6b6f7fa6c980bc6bb18f75b91e" exitCode=0 Nov 22 02:56:04 crc kubenswrapper[4922]: I1122 02:56:04.760578 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p5clf" event={"ID":"24fb7ac2-dd84-402b-9c03-b0c29174fa6c","Type":"ContainerDied","Data":"94350a78648e6411b09238a3c99d2420901cde6b6f7fa6c980bc6bb18f75b91e"} Nov 22 02:56:04 crc kubenswrapper[4922]: I1122 02:56:04.771007 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2gmkj" event={"ID":"d5c8000a-a783-474f-a73a-55814c257a02","Type":"ContainerStarted","Data":"cee994d88bdf809eedeed00606f45548af74051ba57f829c1dc99fa1fb65f32e"} Nov 22 02:56:04 crc kubenswrapper[4922]: I1122 02:56:04.771072 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2gmkj" event={"ID":"d5c8000a-a783-474f-a73a-55814c257a02","Type":"ContainerStarted","Data":"5375a8bcf4186165e787733267abbb42ac9e7712c36c1331f06fdd3332fa9e08"} Nov 22 02:56:04 crc kubenswrapper[4922]: E1122 02:56:04.772592 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-t5m25" 
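Each failed pull in this run surfaces twice: CRI-O returns a gRPC Canceled status ("copying config: context canceled"), which the kubelet records as ErrImagePull, and subsequent pod syncs short-circuit with ImagePullBackOff until the back-off window expires. A Go sketch of that classification follows; classifyPullError and the 10s-to-5m doubling schedule are assumptions for illustration, not parameters taken from this log.

package main

import (
	"fmt"
	"time"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// classifyPullError is a hypothetical helper: a failed CRI ImagePull comes
// back as a gRPC status error. The first failing sync reports ErrImagePull;
// re-syncs that land before the back-off expires report ImagePullBackOff.
func classifyPullError(err error, inBackoff bool) string {
	if inBackoff {
		return "ImagePullBackOff"
	}
	if s, ok := status.FromError(err); ok && s.Code() == codes.Canceled {
		// The registry copy was cancelled mid-pull, as in the log above.
		return "ErrImagePull"
	}
	return "ErrImagePull" // any other pull failure also starts as ErrImagePull
}

func main() {
	err := status.Error(codes.Canceled,
		"copying system image from manifest list: copying config: context canceled")
	fmt.Println(classifyPullError(err, false)) // first sync: ErrImagePull
	fmt.Println(classifyPullError(err, true))  // retries: ImagePullBackOff

	// Assumed doubling back-off schedule for failed pulls (parameters invented
	// for illustration; the actual schedule is not visible in this log).
	for d := 10 * time.Second; d <= 5*time.Minute; d *= 2 {
		fmt.Println("back-off", d, "before retrying pull")
	}
}

The ContainerStarted events in the entries that follow are consistent with these pulls eventually succeeding on a later retry.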
podUID="ffc1fad9-16d3-4a1d-83b7-0ffa9796d506" Nov 22 02:56:04 crc kubenswrapper[4922]: E1122 02:56:04.773062 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-vgckp" podUID="9e5138dd-6039-4020-969e-6a30b33be2b5" Nov 22 02:56:04 crc kubenswrapper[4922]: I1122 02:56:04.847292 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-2gmkj" podStartSLOduration=176.847264827 podStartE2EDuration="2m56.847264827s" podCreationTimestamp="2025-11-22 02:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:56:04.844208278 +0000 UTC m=+200.882730180" watchObservedRunningTime="2025-11-22 02:56:04.847264827 +0000 UTC m=+200.885786719" Nov 22 02:56:05 crc kubenswrapper[4922]: I1122 02:56:05.781676 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p5clf" event={"ID":"24fb7ac2-dd84-402b-9c03-b0c29174fa6c","Type":"ContainerStarted","Data":"763af7f08fda0b49cdab64db1c079b79698546641c3a94167177b765565f5d8d"} Nov 22 02:56:05 crc kubenswrapper[4922]: I1122 02:56:05.804492 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-p5clf" podStartSLOduration=2.7589818 podStartE2EDuration="47.804468928s" podCreationTimestamp="2025-11-22 02:55:18 +0000 UTC" firstStartedPulling="2025-11-22 02:55:20.211347065 +0000 UTC m=+156.249868957" lastFinishedPulling="2025-11-22 02:56:05.256834163 +0000 UTC m=+201.295356085" observedRunningTime="2025-11-22 02:56:05.802271588 +0000 UTC m=+201.840793490" watchObservedRunningTime="2025-11-22 02:56:05.804468928 +0000 UTC m=+201.842990840" Nov 22 02:56:07 crc kubenswrapper[4922]: I1122 02:56:07.809604 4922 generic.go:334] "Generic (PLEG): container finished" podID="6d590121-2d31-483c-9c86-14b40c9d23ad" containerID="ed691218f4a3e4c3427b3c4d6cc696a3a1b5f09311b9da52ee0e25009e0af5e6" exitCode=0 Nov 22 02:56:07 crc kubenswrapper[4922]: I1122 02:56:07.809739 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h8sq5" event={"ID":"6d590121-2d31-483c-9c86-14b40c9d23ad","Type":"ContainerDied","Data":"ed691218f4a3e4c3427b3c4d6cc696a3a1b5f09311b9da52ee0e25009e0af5e6"} Nov 22 02:56:08 crc kubenswrapper[4922]: I1122 02:56:08.825771 4922 generic.go:334] "Generic (PLEG): container finished" podID="8e0bcf47-633a-44d1-82b8-90cdf74fa610" containerID="5e1d92bcf17a145dac05d300aa5b54c54ef3c7116e52e2c4c5d40a882d4a6fc0" exitCode=0 Nov 22 02:56:08 crc kubenswrapper[4922]: I1122 02:56:08.825886 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pwg2w" event={"ID":"8e0bcf47-633a-44d1-82b8-90cdf74fa610","Type":"ContainerDied","Data":"5e1d92bcf17a145dac05d300aa5b54c54ef3c7116e52e2c4c5d40a882d4a6fc0"} Nov 22 02:56:08 crc kubenswrapper[4922]: I1122 02:56:08.837152 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h8sq5" event={"ID":"6d590121-2d31-483c-9c86-14b40c9d23ad","Type":"ContainerStarted","Data":"347cf45992cba1a0276f51e25d8edae115792e875f79f58a2a49dc82cd91fae7"} Nov 22 02:56:08 crc kubenswrapper[4922]: I1122 02:56:08.865918 4922 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-marketplace/certified-operators-h8sq5" podStartSLOduration=2.655260359 podStartE2EDuration="52.865892671s" podCreationTimestamp="2025-11-22 02:55:16 +0000 UTC" firstStartedPulling="2025-11-22 02:55:18.08517059 +0000 UTC m=+154.123692482" lastFinishedPulling="2025-11-22 02:56:08.295802862 +0000 UTC m=+204.334324794" observedRunningTime="2025-11-22 02:56:08.865063793 +0000 UTC m=+204.903585705" watchObservedRunningTime="2025-11-22 02:56:08.865892671 +0000 UTC m=+204.904414573" Nov 22 02:56:09 crc kubenswrapper[4922]: I1122 02:56:09.309783 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-p5clf" Nov 22 02:56:09 crc kubenswrapper[4922]: I1122 02:56:09.310214 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-p5clf" Nov 22 02:56:09 crc kubenswrapper[4922]: I1122 02:56:09.606094 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-p5clf" Nov 22 02:56:09 crc kubenswrapper[4922]: I1122 02:56:09.846191 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pwg2w" event={"ID":"8e0bcf47-633a-44d1-82b8-90cdf74fa610","Type":"ContainerStarted","Data":"71d47f3f315725f900eb27aa0cb85095e1e92b23e0a1bb056c388cd0f8d6cce5"} Nov 22 02:56:10 crc kubenswrapper[4922]: I1122 02:56:10.855280 4922 generic.go:334] "Generic (PLEG): container finished" podID="93a35959-93aa-4e0e-a171-802b0161fe5c" containerID="4152aa7869961531164116b3c3012c8116cac7e1fe8cc184d6afeb48fe829efe" exitCode=0 Nov 22 02:56:10 crc kubenswrapper[4922]: I1122 02:56:10.855308 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tb5d2" event={"ID":"93a35959-93aa-4e0e-a171-802b0161fe5c","Type":"ContainerDied","Data":"4152aa7869961531164116b3c3012c8116cac7e1fe8cc184d6afeb48fe829efe"} Nov 22 02:56:10 crc kubenswrapper[4922]: I1122 02:56:10.883929 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pwg2w" podStartSLOduration=3.7844911420000003 podStartE2EDuration="53.883902703s" podCreationTimestamp="2025-11-22 02:55:17 +0000 UTC" firstStartedPulling="2025-11-22 02:55:19.136971227 +0000 UTC m=+155.175493119" lastFinishedPulling="2025-11-22 02:56:09.236382788 +0000 UTC m=+205.274904680" observedRunningTime="2025-11-22 02:56:10.881163632 +0000 UTC m=+206.919685524" watchObservedRunningTime="2025-11-22 02:56:10.883902703 +0000 UTC m=+206.922424625" Nov 22 02:56:11 crc kubenswrapper[4922]: I1122 02:56:11.110212 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 02:56:11 crc kubenswrapper[4922]: I1122 02:56:11.110300 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 02:56:11 crc kubenswrapper[4922]: I1122 02:56:11.110365 4922 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" Nov 22 02:56:11 crc kubenswrapper[4922]: I1122 02:56:11.111145 4922 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d254b4b131f36e4aae4019c29e8c53f3f8df991b85eb6f943fb6d2e9d79552f6"} pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 02:56:11 crc kubenswrapper[4922]: I1122 02:56:11.111335 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" containerID="cri-o://d254b4b131f36e4aae4019c29e8c53f3f8df991b85eb6f943fb6d2e9d79552f6" gracePeriod=600 Nov 22 02:56:11 crc kubenswrapper[4922]: I1122 02:56:11.865461 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" event={"ID":"402683b1-a29f-4a79-a36c-daf6e8068d0d","Type":"ContainerDied","Data":"d254b4b131f36e4aae4019c29e8c53f3f8df991b85eb6f943fb6d2e9d79552f6"} Nov 22 02:56:11 crc kubenswrapper[4922]: I1122 02:56:11.865398 4922 generic.go:334] "Generic (PLEG): container finished" podID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerID="d254b4b131f36e4aae4019c29e8c53f3f8df991b85eb6f943fb6d2e9d79552f6" exitCode=0 Nov 22 02:56:11 crc kubenswrapper[4922]: I1122 02:56:11.868022 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" event={"ID":"402683b1-a29f-4a79-a36c-daf6e8068d0d","Type":"ContainerStarted","Data":"5b8b749cd737c0f52e1c945ff82e807bb12860b0ce616473b37087ca3334ae08"} Nov 22 02:56:11 crc kubenswrapper[4922]: I1122 02:56:11.870703 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tb5d2" event={"ID":"93a35959-93aa-4e0e-a171-802b0161fe5c","Type":"ContainerStarted","Data":"d273df1a337b3b3445e968274eff089186e7249a9d60ddd7716081355d9b72ed"} Nov 22 02:56:11 crc kubenswrapper[4922]: I1122 02:56:11.907193 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tb5d2" podStartSLOduration=2.92634814 podStartE2EDuration="52.90716183s" podCreationTimestamp="2025-11-22 02:55:19 +0000 UTC" firstStartedPulling="2025-11-22 02:55:21.270025561 +0000 UTC m=+157.308547453" lastFinishedPulling="2025-11-22 02:56:11.250839241 +0000 UTC m=+207.289361143" observedRunningTime="2025-11-22 02:56:11.899971009 +0000 UTC m=+207.938492901" watchObservedRunningTime="2025-11-22 02:56:11.90716183 +0000 UTC m=+207.945683732" Nov 22 02:56:13 crc kubenswrapper[4922]: I1122 02:56:13.892179 4922 generic.go:334] "Generic (PLEG): container finished" podID="81c367c0-8b10-4ce9-aa76-290449c7df39" containerID="dc55d9561147430ee52666bab42be8e3638492f4904b58197387a51f52d1d698" exitCode=0 Nov 22 02:56:13 crc kubenswrapper[4922]: I1122 02:56:13.892475 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t48nh" event={"ID":"81c367c0-8b10-4ce9-aa76-290449c7df39","Type":"ContainerDied","Data":"dc55d9561147430ee52666bab42be8e3638492f4904b58197387a51f52d1d698"} Nov 22 02:56:15 crc kubenswrapper[4922]: I1122 02:56:15.908244 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t48nh" 
event={"ID":"81c367c0-8b10-4ce9-aa76-290449c7df39","Type":"ContainerStarted","Data":"66f64925599505b38d59a69a9c727a3433429335a4596cae4d15bb1ad0af0994"} Nov 22 02:56:17 crc kubenswrapper[4922]: I1122 02:56:17.413435 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-h8sq5" Nov 22 02:56:17 crc kubenswrapper[4922]: I1122 02:56:17.413915 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-h8sq5" Nov 22 02:56:17 crc kubenswrapper[4922]: I1122 02:56:17.455127 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-h8sq5" Nov 22 02:56:17 crc kubenswrapper[4922]: I1122 02:56:17.717215 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pwg2w" Nov 22 02:56:17 crc kubenswrapper[4922]: I1122 02:56:17.717276 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pwg2w" Nov 22 02:56:17 crc kubenswrapper[4922]: I1122 02:56:17.756755 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pwg2w" Nov 22 02:56:17 crc kubenswrapper[4922]: I1122 02:56:17.941827 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-t48nh" podStartSLOduration=4.260743456 podStartE2EDuration="1m0.941807373s" podCreationTimestamp="2025-11-22 02:55:17 +0000 UTC" firstStartedPulling="2025-11-22 02:55:18.075334742 +0000 UTC m=+154.113856634" lastFinishedPulling="2025-11-22 02:56:14.756398659 +0000 UTC m=+210.794920551" observedRunningTime="2025-11-22 02:56:17.941218 +0000 UTC m=+213.979739902" watchObservedRunningTime="2025-11-22 02:56:17.941807373 +0000 UTC m=+213.980329265" Nov 22 02:56:17 crc kubenswrapper[4922]: I1122 02:56:17.962948 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-h8sq5" Nov 22 02:56:17 crc kubenswrapper[4922]: I1122 02:56:17.966654 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pwg2w" Nov 22 02:56:19 crc kubenswrapper[4922]: I1122 02:56:19.350667 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-p5clf" Nov 22 02:56:19 crc kubenswrapper[4922]: I1122 02:56:19.686323 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tb5d2" Nov 22 02:56:19 crc kubenswrapper[4922]: I1122 02:56:19.686376 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tb5d2" Nov 22 02:56:19 crc kubenswrapper[4922]: I1122 02:56:19.727626 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tb5d2" Nov 22 02:56:19 crc kubenswrapper[4922]: I1122 02:56:19.929446 4922 generic.go:334] "Generic (PLEG): container finished" podID="737abb10-2a68-40ba-a6c0-201ae0619ede" containerID="0566626e6cbed8532dae65cfecf73a8904b1dea46287d6dc8b629fd0e662d51b" exitCode=0 Nov 22 02:56:19 crc kubenswrapper[4922]: I1122 02:56:19.929681 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xkqw4" 
event={"ID":"737abb10-2a68-40ba-a6c0-201ae0619ede","Type":"ContainerDied","Data":"0566626e6cbed8532dae65cfecf73a8904b1dea46287d6dc8b629fd0e662d51b"} Nov 22 02:56:19 crc kubenswrapper[4922]: I1122 02:56:19.933175 4922 generic.go:334] "Generic (PLEG): container finished" podID="ffc1fad9-16d3-4a1d-83b7-0ffa9796d506" containerID="0227aa3d9941ff7fed12eff7ecb69b695fbbf959faeb89867825b96241718203" exitCode=0 Nov 22 02:56:19 crc kubenswrapper[4922]: I1122 02:56:19.933247 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t5m25" event={"ID":"ffc1fad9-16d3-4a1d-83b7-0ffa9796d506","Type":"ContainerDied","Data":"0227aa3d9941ff7fed12eff7ecb69b695fbbf959faeb89867825b96241718203"} Nov 22 02:56:19 crc kubenswrapper[4922]: I1122 02:56:19.935635 4922 generic.go:334] "Generic (PLEG): container finished" podID="9e5138dd-6039-4020-969e-6a30b33be2b5" containerID="2aa499e67046bc7dde0a73f96ee24aa87bd78bf846a364dd4e476660c990182f" exitCode=0 Nov 22 02:56:19 crc kubenswrapper[4922]: I1122 02:56:19.935825 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vgckp" event={"ID":"9e5138dd-6039-4020-969e-6a30b33be2b5","Type":"ContainerDied","Data":"2aa499e67046bc7dde0a73f96ee24aa87bd78bf846a364dd4e476660c990182f"} Nov 22 02:56:19 crc kubenswrapper[4922]: I1122 02:56:19.986909 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tb5d2" Nov 22 02:56:20 crc kubenswrapper[4922]: I1122 02:56:20.601277 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pwg2w"] Nov 22 02:56:20 crc kubenswrapper[4922]: I1122 02:56:20.601921 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-pwg2w" podUID="8e0bcf47-633a-44d1-82b8-90cdf74fa610" containerName="registry-server" containerID="cri-o://71d47f3f315725f900eb27aa0cb85095e1e92b23e0a1bb056c388cd0f8d6cce5" gracePeriod=2 Nov 22 02:56:20 crc kubenswrapper[4922]: I1122 02:56:20.945221 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vgckp" event={"ID":"9e5138dd-6039-4020-969e-6a30b33be2b5","Type":"ContainerStarted","Data":"951299cffa792918e4dae11c1187b5a0d9fbe21ca15aae80e27470eede900dc2"} Nov 22 02:56:20 crc kubenswrapper[4922]: I1122 02:56:20.947318 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xkqw4" event={"ID":"737abb10-2a68-40ba-a6c0-201ae0619ede","Type":"ContainerStarted","Data":"b362e5575be2a96d1315ef8a50f3dcc02aaec6355286129b82a6b715d13bdf9f"} Nov 22 02:56:20 crc kubenswrapper[4922]: I1122 02:56:20.949540 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t5m25" event={"ID":"ffc1fad9-16d3-4a1d-83b7-0ffa9796d506","Type":"ContainerStarted","Data":"68b21b04d55a18115bdd10e43258ea93509e8dd727fd564cbc02c2e44181b985"} Nov 22 02:56:20 crc kubenswrapper[4922]: I1122 02:56:20.951817 4922 generic.go:334] "Generic (PLEG): container finished" podID="8e0bcf47-633a-44d1-82b8-90cdf74fa610" containerID="71d47f3f315725f900eb27aa0cb85095e1e92b23e0a1bb056c388cd0f8d6cce5" exitCode=0 Nov 22 02:56:20 crc kubenswrapper[4922]: I1122 02:56:20.952044 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pwg2w" 
event={"ID":"8e0bcf47-633a-44d1-82b8-90cdf74fa610","Type":"ContainerDied","Data":"71d47f3f315725f900eb27aa0cb85095e1e92b23e0a1bb056c388cd0f8d6cce5"} Nov 22 02:56:20 crc kubenswrapper[4922]: I1122 02:56:20.970569 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vgckp" podStartSLOduration=2.68265133 podStartE2EDuration="1m3.970542891s" podCreationTimestamp="2025-11-22 02:55:17 +0000 UTC" firstStartedPulling="2025-11-22 02:55:19.111794314 +0000 UTC m=+155.150316206" lastFinishedPulling="2025-11-22 02:56:20.399685875 +0000 UTC m=+216.438207767" observedRunningTime="2025-11-22 02:56:20.970243544 +0000 UTC m=+217.008765436" watchObservedRunningTime="2025-11-22 02:56:20.970542891 +0000 UTC m=+217.009064783" Nov 22 02:56:20 crc kubenswrapper[4922]: I1122 02:56:20.983161 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pwg2w" Nov 22 02:56:20 crc kubenswrapper[4922]: I1122 02:56:20.999378 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xkqw4" podStartSLOduration=3.098943409 podStartE2EDuration="1m0.999357479s" podCreationTimestamp="2025-11-22 02:55:20 +0000 UTC" firstStartedPulling="2025-11-22 02:55:22.422378794 +0000 UTC m=+158.460900686" lastFinishedPulling="2025-11-22 02:56:20.322792864 +0000 UTC m=+216.361314756" observedRunningTime="2025-11-22 02:56:20.998229764 +0000 UTC m=+217.036751656" watchObservedRunningTime="2025-11-22 02:56:20.999357479 +0000 UTC m=+217.037879371" Nov 22 02:56:21 crc kubenswrapper[4922]: I1122 02:56:21.020703 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-t5m25" podStartSLOduration=3.024442763 podStartE2EDuration="1m1.020678209s" podCreationTimestamp="2025-11-22 02:55:20 +0000 UTC" firstStartedPulling="2025-11-22 02:55:22.422763124 +0000 UTC m=+158.461285016" lastFinishedPulling="2025-11-22 02:56:20.41899857 +0000 UTC m=+216.457520462" observedRunningTime="2025-11-22 02:56:21.018638743 +0000 UTC m=+217.057160635" watchObservedRunningTime="2025-11-22 02:56:21.020678209 +0000 UTC m=+217.059200111" Nov 22 02:56:21 crc kubenswrapper[4922]: I1122 02:56:21.118600 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbrvk\" (UniqueName: \"kubernetes.io/projected/8e0bcf47-633a-44d1-82b8-90cdf74fa610-kube-api-access-kbrvk\") pod \"8e0bcf47-633a-44d1-82b8-90cdf74fa610\" (UID: \"8e0bcf47-633a-44d1-82b8-90cdf74fa610\") " Nov 22 02:56:21 crc kubenswrapper[4922]: I1122 02:56:21.118715 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e0bcf47-633a-44d1-82b8-90cdf74fa610-catalog-content\") pod \"8e0bcf47-633a-44d1-82b8-90cdf74fa610\" (UID: \"8e0bcf47-633a-44d1-82b8-90cdf74fa610\") " Nov 22 02:56:21 crc kubenswrapper[4922]: I1122 02:56:21.119065 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e0bcf47-633a-44d1-82b8-90cdf74fa610-utilities\") pod \"8e0bcf47-633a-44d1-82b8-90cdf74fa610\" (UID: \"8e0bcf47-633a-44d1-82b8-90cdf74fa610\") " Nov 22 02:56:21 crc kubenswrapper[4922]: I1122 02:56:21.121478 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e0bcf47-633a-44d1-82b8-90cdf74fa610-utilities" (OuterVolumeSpecName: "utilities") 
pod "8e0bcf47-633a-44d1-82b8-90cdf74fa610" (UID: "8e0bcf47-633a-44d1-82b8-90cdf74fa610"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 02:56:21 crc kubenswrapper[4922]: I1122 02:56:21.126096 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e0bcf47-633a-44d1-82b8-90cdf74fa610-kube-api-access-kbrvk" (OuterVolumeSpecName: "kube-api-access-kbrvk") pod "8e0bcf47-633a-44d1-82b8-90cdf74fa610" (UID: "8e0bcf47-633a-44d1-82b8-90cdf74fa610"). InnerVolumeSpecName "kube-api-access-kbrvk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:56:21 crc kubenswrapper[4922]: I1122 02:56:21.168097 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e0bcf47-633a-44d1-82b8-90cdf74fa610-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8e0bcf47-633a-44d1-82b8-90cdf74fa610" (UID: "8e0bcf47-633a-44d1-82b8-90cdf74fa610"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 02:56:21 crc kubenswrapper[4922]: I1122 02:56:21.220619 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e0bcf47-633a-44d1-82b8-90cdf74fa610-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 02:56:21 crc kubenswrapper[4922]: I1122 02:56:21.220671 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbrvk\" (UniqueName: \"kubernetes.io/projected/8e0bcf47-633a-44d1-82b8-90cdf74fa610-kube-api-access-kbrvk\") on node \"crc\" DevicePath \"\"" Nov 22 02:56:21 crc kubenswrapper[4922]: I1122 02:56:21.220685 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e0bcf47-633a-44d1-82b8-90cdf74fa610-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 02:56:21 crc kubenswrapper[4922]: I1122 02:56:21.960507 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pwg2w" event={"ID":"8e0bcf47-633a-44d1-82b8-90cdf74fa610","Type":"ContainerDied","Data":"1c9b8765a3fd2f0dce4d476ef2378e62fb0f9bba891c42739eb0e8b677f74015"} Nov 22 02:56:21 crc kubenswrapper[4922]: I1122 02:56:21.960614 4922 scope.go:117] "RemoveContainer" containerID="71d47f3f315725f900eb27aa0cb85095e1e92b23e0a1bb056c388cd0f8d6cce5" Nov 22 02:56:21 crc kubenswrapper[4922]: I1122 02:56:21.960550 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pwg2w" Nov 22 02:56:21 crc kubenswrapper[4922]: I1122 02:56:21.981948 4922 scope.go:117] "RemoveContainer" containerID="5e1d92bcf17a145dac05d300aa5b54c54ef3c7116e52e2c4c5d40a882d4a6fc0" Nov 22 02:56:21 crc kubenswrapper[4922]: I1122 02:56:21.985365 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pwg2w"] Nov 22 02:56:21 crc kubenswrapper[4922]: I1122 02:56:21.988608 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-pwg2w"] Nov 22 02:56:22 crc kubenswrapper[4922]: I1122 02:56:22.005089 4922 scope.go:117] "RemoveContainer" containerID="f198c8fef76695ca81f120890c1708414f11bdd9bb8fd80fa08358116078d8a6" Nov 22 02:56:23 crc kubenswrapper[4922]: I1122 02:56:23.308043 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e0bcf47-633a-44d1-82b8-90cdf74fa610" path="/var/lib/kubelet/pods/8e0bcf47-633a-44d1-82b8-90cdf74fa610/volumes" Nov 22 02:56:23 crc kubenswrapper[4922]: I1122 02:56:23.598795 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tb5d2"] Nov 22 02:56:23 crc kubenswrapper[4922]: I1122 02:56:23.599051 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tb5d2" podUID="93a35959-93aa-4e0e-a171-802b0161fe5c" containerName="registry-server" containerID="cri-o://d273df1a337b3b3445e968274eff089186e7249a9d60ddd7716081355d9b72ed" gracePeriod=2 Nov 22 02:56:23 crc kubenswrapper[4922]: I1122 02:56:23.939092 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tb5d2" Nov 22 02:56:23 crc kubenswrapper[4922]: I1122 02:56:23.973520 4922 generic.go:334] "Generic (PLEG): container finished" podID="93a35959-93aa-4e0e-a171-802b0161fe5c" containerID="d273df1a337b3b3445e968274eff089186e7249a9d60ddd7716081355d9b72ed" exitCode=0 Nov 22 02:56:23 crc kubenswrapper[4922]: I1122 02:56:23.973570 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tb5d2" event={"ID":"93a35959-93aa-4e0e-a171-802b0161fe5c","Type":"ContainerDied","Data":"d273df1a337b3b3445e968274eff089186e7249a9d60ddd7716081355d9b72ed"} Nov 22 02:56:23 crc kubenswrapper[4922]: I1122 02:56:23.973601 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tb5d2" event={"ID":"93a35959-93aa-4e0e-a171-802b0161fe5c","Type":"ContainerDied","Data":"e5d50f964d5676eaf1f3a8143f5fd2b0d18efb2a97659d815ffb745372bf8cae"} Nov 22 02:56:23 crc kubenswrapper[4922]: I1122 02:56:23.973621 4922 scope.go:117] "RemoveContainer" containerID="d273df1a337b3b3445e968274eff089186e7249a9d60ddd7716081355d9b72ed" Nov 22 02:56:23 crc kubenswrapper[4922]: I1122 02:56:23.973743 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tb5d2" Nov 22 02:56:23 crc kubenswrapper[4922]: I1122 02:56:23.999675 4922 scope.go:117] "RemoveContainer" containerID="4152aa7869961531164116b3c3012c8116cac7e1fe8cc184d6afeb48fe829efe" Nov 22 02:56:24 crc kubenswrapper[4922]: I1122 02:56:24.025343 4922 scope.go:117] "RemoveContainer" containerID="c17217b2122f9381e8f6e20025c9187d99c0a4a10829e3b4ad2e6f9b26782082" Nov 22 02:56:24 crc kubenswrapper[4922]: I1122 02:56:24.044526 4922 scope.go:117] "RemoveContainer" containerID="d273df1a337b3b3445e968274eff089186e7249a9d60ddd7716081355d9b72ed" Nov 22 02:56:24 crc kubenswrapper[4922]: E1122 02:56:24.045148 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d273df1a337b3b3445e968274eff089186e7249a9d60ddd7716081355d9b72ed\": container with ID starting with d273df1a337b3b3445e968274eff089186e7249a9d60ddd7716081355d9b72ed not found: ID does not exist" containerID="d273df1a337b3b3445e968274eff089186e7249a9d60ddd7716081355d9b72ed" Nov 22 02:56:24 crc kubenswrapper[4922]: I1122 02:56:24.045222 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d273df1a337b3b3445e968274eff089186e7249a9d60ddd7716081355d9b72ed"} err="failed to get container status \"d273df1a337b3b3445e968274eff089186e7249a9d60ddd7716081355d9b72ed\": rpc error: code = NotFound desc = could not find container \"d273df1a337b3b3445e968274eff089186e7249a9d60ddd7716081355d9b72ed\": container with ID starting with d273df1a337b3b3445e968274eff089186e7249a9d60ddd7716081355d9b72ed not found: ID does not exist" Nov 22 02:56:24 crc kubenswrapper[4922]: I1122 02:56:24.045279 4922 scope.go:117] "RemoveContainer" containerID="4152aa7869961531164116b3c3012c8116cac7e1fe8cc184d6afeb48fe829efe" Nov 22 02:56:24 crc kubenswrapper[4922]: E1122 02:56:24.045666 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4152aa7869961531164116b3c3012c8116cac7e1fe8cc184d6afeb48fe829efe\": container with ID starting with 4152aa7869961531164116b3c3012c8116cac7e1fe8cc184d6afeb48fe829efe not found: ID does not exist" containerID="4152aa7869961531164116b3c3012c8116cac7e1fe8cc184d6afeb48fe829efe" Nov 22 02:56:24 crc kubenswrapper[4922]: I1122 02:56:24.045697 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4152aa7869961531164116b3c3012c8116cac7e1fe8cc184d6afeb48fe829efe"} err="failed to get container status \"4152aa7869961531164116b3c3012c8116cac7e1fe8cc184d6afeb48fe829efe\": rpc error: code = NotFound desc = could not find container \"4152aa7869961531164116b3c3012c8116cac7e1fe8cc184d6afeb48fe829efe\": container with ID starting with 4152aa7869961531164116b3c3012c8116cac7e1fe8cc184d6afeb48fe829efe not found: ID does not exist" Nov 22 02:56:24 crc kubenswrapper[4922]: I1122 02:56:24.045722 4922 scope.go:117] "RemoveContainer" containerID="c17217b2122f9381e8f6e20025c9187d99c0a4a10829e3b4ad2e6f9b26782082" Nov 22 02:56:24 crc kubenswrapper[4922]: E1122 02:56:24.046244 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c17217b2122f9381e8f6e20025c9187d99c0a4a10829e3b4ad2e6f9b26782082\": container with ID starting with c17217b2122f9381e8f6e20025c9187d99c0a4a10829e3b4ad2e6f9b26782082 not found: ID does not exist" containerID="c17217b2122f9381e8f6e20025c9187d99c0a4a10829e3b4ad2e6f9b26782082" 
Nov 22 02:56:24 crc kubenswrapper[4922]: I1122 02:56:24.046299 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c17217b2122f9381e8f6e20025c9187d99c0a4a10829e3b4ad2e6f9b26782082"} err="failed to get container status \"c17217b2122f9381e8f6e20025c9187d99c0a4a10829e3b4ad2e6f9b26782082\": rpc error: code = NotFound desc = could not find container \"c17217b2122f9381e8f6e20025c9187d99c0a4a10829e3b4ad2e6f9b26782082\": container with ID starting with c17217b2122f9381e8f6e20025c9187d99c0a4a10829e3b4ad2e6f9b26782082 not found: ID does not exist"
Nov 22 02:56:24 crc kubenswrapper[4922]: I1122 02:56:24.056579 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93a35959-93aa-4e0e-a171-802b0161fe5c-utilities\") pod \"93a35959-93aa-4e0e-a171-802b0161fe5c\" (UID: \"93a35959-93aa-4e0e-a171-802b0161fe5c\") "
Nov 22 02:56:24 crc kubenswrapper[4922]: I1122 02:56:24.056628 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldl7t\" (UniqueName: \"kubernetes.io/projected/93a35959-93aa-4e0e-a171-802b0161fe5c-kube-api-access-ldl7t\") pod \"93a35959-93aa-4e0e-a171-802b0161fe5c\" (UID: \"93a35959-93aa-4e0e-a171-802b0161fe5c\") "
Nov 22 02:56:24 crc kubenswrapper[4922]: I1122 02:56:24.056664 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93a35959-93aa-4e0e-a171-802b0161fe5c-catalog-content\") pod \"93a35959-93aa-4e0e-a171-802b0161fe5c\" (UID: \"93a35959-93aa-4e0e-a171-802b0161fe5c\") "
Nov 22 02:56:24 crc kubenswrapper[4922]: I1122 02:56:24.059199 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93a35959-93aa-4e0e-a171-802b0161fe5c-utilities" (OuterVolumeSpecName: "utilities") pod "93a35959-93aa-4e0e-a171-802b0161fe5c" (UID: "93a35959-93aa-4e0e-a171-802b0161fe5c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 02:56:24 crc kubenswrapper[4922]: I1122 02:56:24.064004 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93a35959-93aa-4e0e-a171-802b0161fe5c-kube-api-access-ldl7t" (OuterVolumeSpecName: "kube-api-access-ldl7t") pod "93a35959-93aa-4e0e-a171-802b0161fe5c" (UID: "93a35959-93aa-4e0e-a171-802b0161fe5c"). InnerVolumeSpecName "kube-api-access-ldl7t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 02:56:24 crc kubenswrapper[4922]: I1122 02:56:24.074431 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93a35959-93aa-4e0e-a171-802b0161fe5c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "93a35959-93aa-4e0e-a171-802b0161fe5c" (UID: "93a35959-93aa-4e0e-a171-802b0161fe5c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 02:56:24 crc kubenswrapper[4922]: I1122 02:56:24.160176 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93a35959-93aa-4e0e-a171-802b0161fe5c-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 22 02:56:24 crc kubenswrapper[4922]: I1122 02:56:24.160226 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93a35959-93aa-4e0e-a171-802b0161fe5c-utilities\") on node \"crc\" DevicePath \"\""
Nov 22 02:56:24 crc kubenswrapper[4922]: I1122 02:56:24.160241 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldl7t\" (UniqueName: \"kubernetes.io/projected/93a35959-93aa-4e0e-a171-802b0161fe5c-kube-api-access-ldl7t\") on node \"crc\" DevicePath \"\""
Nov 22 02:56:24 crc kubenswrapper[4922]: I1122 02:56:24.307197 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tb5d2"]
Nov 22 02:56:24 crc kubenswrapper[4922]: I1122 02:56:24.316637 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tb5d2"]
Nov 22 02:56:25 crc kubenswrapper[4922]: I1122 02:56:25.307208 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93a35959-93aa-4e0e-a171-802b0161fe5c" path="/var/lib/kubelet/pods/93a35959-93aa-4e0e-a171-802b0161fe5c/volumes"
Nov 22 02:56:27 crc kubenswrapper[4922]: I1122 02:56:27.590684 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-t48nh"
Nov 22 02:56:27 crc kubenswrapper[4922]: I1122 02:56:27.591015 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-t48nh"
Nov 22 02:56:27 crc kubenswrapper[4922]: I1122 02:56:27.638752 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-t48nh"
Nov 22 02:56:27 crc kubenswrapper[4922]: I1122 02:56:27.915443 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vgckp"
Nov 22 02:56:27 crc kubenswrapper[4922]: I1122 02:56:27.915581 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vgckp"
Nov 22 02:56:27 crc kubenswrapper[4922]: I1122 02:56:27.966253 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vgckp"
Nov 22 02:56:28 crc kubenswrapper[4922]: I1122 02:56:28.050584 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vgckp"
Nov 22 02:56:28 crc kubenswrapper[4922]: I1122 02:56:28.052517 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-t48nh"
Nov 22 02:56:28 crc kubenswrapper[4922]: I1122 02:56:28.404675 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vgckp"]
Nov 22 02:56:30 crc kubenswrapper[4922]: I1122 02:56:30.012270 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vgckp" podUID="9e5138dd-6039-4020-969e-6a30b33be2b5" containerName="registry-server" containerID="cri-o://951299cffa792918e4dae11c1187b5a0d9fbe21ca15aae80e27470eede900dc2" gracePeriod=2
Nov 22 02:56:30 crc kubenswrapper[4922]: I1122 02:56:30.368614 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vgckp"
Nov 22 02:56:30 crc kubenswrapper[4922]: I1122 02:56:30.450907 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e5138dd-6039-4020-969e-6a30b33be2b5-utilities\") pod \"9e5138dd-6039-4020-969e-6a30b33be2b5\" (UID: \"9e5138dd-6039-4020-969e-6a30b33be2b5\") "
Nov 22 02:56:30 crc kubenswrapper[4922]: I1122 02:56:30.451046 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e5138dd-6039-4020-969e-6a30b33be2b5-catalog-content\") pod \"9e5138dd-6039-4020-969e-6a30b33be2b5\" (UID: \"9e5138dd-6039-4020-969e-6a30b33be2b5\") "
Nov 22 02:56:30 crc kubenswrapper[4922]: I1122 02:56:30.451086 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7knj\" (UniqueName: \"kubernetes.io/projected/9e5138dd-6039-4020-969e-6a30b33be2b5-kube-api-access-n7knj\") pod \"9e5138dd-6039-4020-969e-6a30b33be2b5\" (UID: \"9e5138dd-6039-4020-969e-6a30b33be2b5\") "
Nov 22 02:56:30 crc kubenswrapper[4922]: I1122 02:56:30.451971 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e5138dd-6039-4020-969e-6a30b33be2b5-utilities" (OuterVolumeSpecName: "utilities") pod "9e5138dd-6039-4020-969e-6a30b33be2b5" (UID: "9e5138dd-6039-4020-969e-6a30b33be2b5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 02:56:30 crc kubenswrapper[4922]: I1122 02:56:30.460558 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e5138dd-6039-4020-969e-6a30b33be2b5-kube-api-access-n7knj" (OuterVolumeSpecName: "kube-api-access-n7knj") pod "9e5138dd-6039-4020-969e-6a30b33be2b5" (UID: "9e5138dd-6039-4020-969e-6a30b33be2b5"). InnerVolumeSpecName "kube-api-access-n7knj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 02:56:30 crc kubenswrapper[4922]: I1122 02:56:30.499037 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-t5m25"
Nov 22 02:56:30 crc kubenswrapper[4922]: I1122 02:56:30.499123 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-t5m25"
Nov 22 02:56:30 crc kubenswrapper[4922]: I1122 02:56:30.536477 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e5138dd-6039-4020-969e-6a30b33be2b5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9e5138dd-6039-4020-969e-6a30b33be2b5" (UID: "9e5138dd-6039-4020-969e-6a30b33be2b5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 02:56:30 crc kubenswrapper[4922]: I1122 02:56:30.553027 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e5138dd-6039-4020-969e-6a30b33be2b5-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 22 02:56:30 crc kubenswrapper[4922]: I1122 02:56:30.553085 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7knj\" (UniqueName: \"kubernetes.io/projected/9e5138dd-6039-4020-969e-6a30b33be2b5-kube-api-access-n7knj\") on node \"crc\" DevicePath \"\""
Nov 22 02:56:30 crc kubenswrapper[4922]: I1122 02:56:30.553106 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e5138dd-6039-4020-969e-6a30b33be2b5-utilities\") on node \"crc\" DevicePath \"\""
Nov 22 02:56:30 crc kubenswrapper[4922]: I1122 02:56:30.555138 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-t5m25"
Nov 22 02:56:30 crc kubenswrapper[4922]: I1122 02:56:30.932449 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xkqw4"
Nov 22 02:56:30 crc kubenswrapper[4922]: I1122 02:56:30.932532 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xkqw4"
Nov 22 02:56:31 crc kubenswrapper[4922]: I1122 02:56:31.000624 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xkqw4"
Nov 22 02:56:31 crc kubenswrapper[4922]: I1122 02:56:31.024092 4922 generic.go:334] "Generic (PLEG): container finished" podID="9e5138dd-6039-4020-969e-6a30b33be2b5" containerID="951299cffa792918e4dae11c1187b5a0d9fbe21ca15aae80e27470eede900dc2" exitCode=0
Nov 22 02:56:31 crc kubenswrapper[4922]: I1122 02:56:31.024170 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vgckp" event={"ID":"9e5138dd-6039-4020-969e-6a30b33be2b5","Type":"ContainerDied","Data":"951299cffa792918e4dae11c1187b5a0d9fbe21ca15aae80e27470eede900dc2"}
Nov 22 02:56:31 crc kubenswrapper[4922]: I1122 02:56:31.024363 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vgckp" event={"ID":"9e5138dd-6039-4020-969e-6a30b33be2b5","Type":"ContainerDied","Data":"db59237923319cdd77327cbf28d6854f79046e97e7acae81bd0791cea2db0598"}
Nov 22 02:56:31 crc kubenswrapper[4922]: I1122 02:56:31.024417 4922 scope.go:117] "RemoveContainer" containerID="951299cffa792918e4dae11c1187b5a0d9fbe21ca15aae80e27470eede900dc2"
Nov 22 02:56:31 crc kubenswrapper[4922]: I1122 02:56:31.025215 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vgckp"
Nov 22 02:56:31 crc kubenswrapper[4922]: I1122 02:56:31.055173 4922 scope.go:117] "RemoveContainer" containerID="2aa499e67046bc7dde0a73f96ee24aa87bd78bf846a364dd4e476660c990182f"
Nov 22 02:56:31 crc kubenswrapper[4922]: I1122 02:56:31.076919 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vgckp"]
Nov 22 02:56:31 crc kubenswrapper[4922]: I1122 02:56:31.082520 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vgckp"]
Nov 22 02:56:31 crc kubenswrapper[4922]: I1122 02:56:31.102023 4922 scope.go:117] "RemoveContainer" containerID="8f26670e650dedc5e3f802a76f413703e8d1b4ff46674d2c812f8ce1c0ef9f2d"
Nov 22 02:56:31 crc kubenswrapper[4922]: I1122 02:56:31.102167 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-t5m25"
Nov 22 02:56:31 crc kubenswrapper[4922]: I1122 02:56:31.103127 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xkqw4"
Nov 22 02:56:31 crc kubenswrapper[4922]: I1122 02:56:31.122878 4922 scope.go:117] "RemoveContainer" containerID="951299cffa792918e4dae11c1187b5a0d9fbe21ca15aae80e27470eede900dc2"
Nov 22 02:56:31 crc kubenswrapper[4922]: E1122 02:56:31.124806 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"951299cffa792918e4dae11c1187b5a0d9fbe21ca15aae80e27470eede900dc2\": container with ID starting with 951299cffa792918e4dae11c1187b5a0d9fbe21ca15aae80e27470eede900dc2 not found: ID does not exist" containerID="951299cffa792918e4dae11c1187b5a0d9fbe21ca15aae80e27470eede900dc2"
Nov 22 02:56:31 crc kubenswrapper[4922]: I1122 02:56:31.124872 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"951299cffa792918e4dae11c1187b5a0d9fbe21ca15aae80e27470eede900dc2"} err="failed to get container status \"951299cffa792918e4dae11c1187b5a0d9fbe21ca15aae80e27470eede900dc2\": rpc error: code = NotFound desc = could not find container \"951299cffa792918e4dae11c1187b5a0d9fbe21ca15aae80e27470eede900dc2\": container with ID starting with 951299cffa792918e4dae11c1187b5a0d9fbe21ca15aae80e27470eede900dc2 not found: ID does not exist"
Nov 22 02:56:31 crc kubenswrapper[4922]: I1122 02:56:31.124904 4922 scope.go:117] "RemoveContainer" containerID="2aa499e67046bc7dde0a73f96ee24aa87bd78bf846a364dd4e476660c990182f"
Nov 22 02:56:31 crc kubenswrapper[4922]: E1122 02:56:31.125929 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2aa499e67046bc7dde0a73f96ee24aa87bd78bf846a364dd4e476660c990182f\": container with ID starting with 2aa499e67046bc7dde0a73f96ee24aa87bd78bf846a364dd4e476660c990182f not found: ID does not exist" containerID="2aa499e67046bc7dde0a73f96ee24aa87bd78bf846a364dd4e476660c990182f"
Nov 22 02:56:31 crc kubenswrapper[4922]: I1122 02:56:31.126158 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2aa499e67046bc7dde0a73f96ee24aa87bd78bf846a364dd4e476660c990182f"} err="failed to get container status \"2aa499e67046bc7dde0a73f96ee24aa87bd78bf846a364dd4e476660c990182f\": rpc error: code = NotFound desc = could not find container \"2aa499e67046bc7dde0a73f96ee24aa87bd78bf846a364dd4e476660c990182f\": container with ID starting with 2aa499e67046bc7dde0a73f96ee24aa87bd78bf846a364dd4e476660c990182f not found: ID does not exist"
Nov 22 02:56:31 crc kubenswrapper[4922]: I1122 02:56:31.126256 4922 scope.go:117] "RemoveContainer" containerID="8f26670e650dedc5e3f802a76f413703e8d1b4ff46674d2c812f8ce1c0ef9f2d"
Nov 22 02:56:31 crc kubenswrapper[4922]: E1122 02:56:31.127883 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f26670e650dedc5e3f802a76f413703e8d1b4ff46674d2c812f8ce1c0ef9f2d\": container with ID starting with 8f26670e650dedc5e3f802a76f413703e8d1b4ff46674d2c812f8ce1c0ef9f2d not found: ID does not exist" containerID="8f26670e650dedc5e3f802a76f413703e8d1b4ff46674d2c812f8ce1c0ef9f2d"
Nov 22 02:56:31 crc kubenswrapper[4922]: I1122 02:56:31.127942 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f26670e650dedc5e3f802a76f413703e8d1b4ff46674d2c812f8ce1c0ef9f2d"} err="failed to get container status \"8f26670e650dedc5e3f802a76f413703e8d1b4ff46674d2c812f8ce1c0ef9f2d\": rpc error: code = NotFound desc = could not find container \"8f26670e650dedc5e3f802a76f413703e8d1b4ff46674d2c812f8ce1c0ef9f2d\": container with ID starting with 8f26670e650dedc5e3f802a76f413703e8d1b4ff46674d2c812f8ce1c0ef9f2d not found: ID does not exist"
Nov 22 02:56:31 crc kubenswrapper[4922]: I1122 02:56:31.314343 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e5138dd-6039-4020-969e-6a30b33be2b5" path="/var/lib/kubelet/pods/9e5138dd-6039-4020-969e-6a30b33be2b5/volumes"
Nov 22 02:56:32 crc kubenswrapper[4922]: I1122 02:56:32.806262 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xkqw4"]
Nov 22 02:56:33 crc kubenswrapper[4922]: I1122 02:56:33.041492 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xkqw4" podUID="737abb10-2a68-40ba-a6c0-201ae0619ede" containerName="registry-server" containerID="cri-o://b362e5575be2a96d1315ef8a50f3dcc02aaec6355286129b82a6b715d13bdf9f" gracePeriod=2
Nov 22 02:56:35 crc kubenswrapper[4922]: I1122 02:56:35.060025 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xkqw4" event={"ID":"737abb10-2a68-40ba-a6c0-201ae0619ede","Type":"ContainerDied","Data":"b362e5575be2a96d1315ef8a50f3dcc02aaec6355286129b82a6b715d13bdf9f"}
Nov 22 02:56:35 crc kubenswrapper[4922]: I1122 02:56:35.060447 4922 generic.go:334] "Generic (PLEG): container finished" podID="737abb10-2a68-40ba-a6c0-201ae0619ede" containerID="b362e5575be2a96d1315ef8a50f3dcc02aaec6355286129b82a6b715d13bdf9f" exitCode=0
Nov 22 02:56:35 crc kubenswrapper[4922]: I1122 02:56:35.734875 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xkqw4"
Need to start a new one" pod="openshift-marketplace/redhat-operators-xkqw4" Nov 22 02:56:35 crc kubenswrapper[4922]: I1122 02:56:35.838609 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/737abb10-2a68-40ba-a6c0-201ae0619ede-utilities\") pod \"737abb10-2a68-40ba-a6c0-201ae0619ede\" (UID: \"737abb10-2a68-40ba-a6c0-201ae0619ede\") " Nov 22 02:56:35 crc kubenswrapper[4922]: I1122 02:56:35.838684 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/737abb10-2a68-40ba-a6c0-201ae0619ede-catalog-content\") pod \"737abb10-2a68-40ba-a6c0-201ae0619ede\" (UID: \"737abb10-2a68-40ba-a6c0-201ae0619ede\") " Nov 22 02:56:35 crc kubenswrapper[4922]: I1122 02:56:35.838726 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rhr4\" (UniqueName: \"kubernetes.io/projected/737abb10-2a68-40ba-a6c0-201ae0619ede-kube-api-access-5rhr4\") pod \"737abb10-2a68-40ba-a6c0-201ae0619ede\" (UID: \"737abb10-2a68-40ba-a6c0-201ae0619ede\") " Nov 22 02:56:35 crc kubenswrapper[4922]: I1122 02:56:35.840666 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/737abb10-2a68-40ba-a6c0-201ae0619ede-utilities" (OuterVolumeSpecName: "utilities") pod "737abb10-2a68-40ba-a6c0-201ae0619ede" (UID: "737abb10-2a68-40ba-a6c0-201ae0619ede"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 02:56:35 crc kubenswrapper[4922]: I1122 02:56:35.848368 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/737abb10-2a68-40ba-a6c0-201ae0619ede-kube-api-access-5rhr4" (OuterVolumeSpecName: "kube-api-access-5rhr4") pod "737abb10-2a68-40ba-a6c0-201ae0619ede" (UID: "737abb10-2a68-40ba-a6c0-201ae0619ede"). InnerVolumeSpecName "kube-api-access-5rhr4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:56:35 crc kubenswrapper[4922]: I1122 02:56:35.941353 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/737abb10-2a68-40ba-a6c0-201ae0619ede-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 02:56:35 crc kubenswrapper[4922]: I1122 02:56:35.941466 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rhr4\" (UniqueName: \"kubernetes.io/projected/737abb10-2a68-40ba-a6c0-201ae0619ede-kube-api-access-5rhr4\") on node \"crc\" DevicePath \"\"" Nov 22 02:56:35 crc kubenswrapper[4922]: I1122 02:56:35.969249 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/737abb10-2a68-40ba-a6c0-201ae0619ede-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "737abb10-2a68-40ba-a6c0-201ae0619ede" (UID: "737abb10-2a68-40ba-a6c0-201ae0619ede"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 02:56:36 crc kubenswrapper[4922]: I1122 02:56:36.043179 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/737abb10-2a68-40ba-a6c0-201ae0619ede-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 02:56:36 crc kubenswrapper[4922]: I1122 02:56:36.071682 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xkqw4" event={"ID":"737abb10-2a68-40ba-a6c0-201ae0619ede","Type":"ContainerDied","Data":"e2f8726b88f7cf411f417c2c65e8f3e46038fc089586e63ace5feaa3b0872259"} Nov 22 02:56:36 crc kubenswrapper[4922]: I1122 02:56:36.071743 4922 scope.go:117] "RemoveContainer" containerID="b362e5575be2a96d1315ef8a50f3dcc02aaec6355286129b82a6b715d13bdf9f" Nov 22 02:56:36 crc kubenswrapper[4922]: I1122 02:56:36.071741 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xkqw4" Nov 22 02:56:36 crc kubenswrapper[4922]: I1122 02:56:36.106546 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xkqw4"] Nov 22 02:56:36 crc kubenswrapper[4922]: I1122 02:56:36.111715 4922 scope.go:117] "RemoveContainer" containerID="0566626e6cbed8532dae65cfecf73a8904b1dea46287d6dc8b629fd0e662d51b" Nov 22 02:56:36 crc kubenswrapper[4922]: I1122 02:56:36.113268 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xkqw4"] Nov 22 02:56:36 crc kubenswrapper[4922]: I1122 02:56:36.130417 4922 scope.go:117] "RemoveContainer" containerID="f82e4117235df77da823aecaba5d19a40cedbfdec0178a24ea3e5d2492dbe2d4" Nov 22 02:56:37 crc kubenswrapper[4922]: I1122 02:56:37.318130 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="737abb10-2a68-40ba-a6c0-201ae0619ede" path="/var/lib/kubelet/pods/737abb10-2a68-40ba-a6c0-201ae0619ede/volumes" Nov 22 02:56:39 crc kubenswrapper[4922]: I1122 02:56:39.572030 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7zzh7"] Nov 22 02:57:04 crc kubenswrapper[4922]: I1122 02:57:04.605577 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-7zzh7" podUID="f267ec6b-64da-4065-b3aa-2e66ac957118" containerName="oauth-openshift" containerID="cri-o://2c9264bdaf83279110abdad498554b08d4b6c98fb291e930780fef392749b6e5" gracePeriod=15 Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.002991 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-7zzh7" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.038479 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6686467b65-t5l98"] Nov 22 02:57:05 crc kubenswrapper[4922]: E1122 02:57:05.038725 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="200c2995-b401-489c-a63c-016f6e186dd6" containerName="pruner" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.038744 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="200c2995-b401-489c-a63c-016f6e186dd6" containerName="pruner" Nov 22 02:57:05 crc kubenswrapper[4922]: E1122 02:57:05.038759 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e5138dd-6039-4020-969e-6a30b33be2b5" containerName="registry-server" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.038769 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e5138dd-6039-4020-969e-6a30b33be2b5" containerName="registry-server" Nov 22 02:57:05 crc kubenswrapper[4922]: E1122 02:57:05.038784 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="737abb10-2a68-40ba-a6c0-201ae0619ede" containerName="extract-content" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.038792 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="737abb10-2a68-40ba-a6c0-201ae0619ede" containerName="extract-content" Nov 22 02:57:05 crc kubenswrapper[4922]: E1122 02:57:05.038805 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93a35959-93aa-4e0e-a171-802b0161fe5c" containerName="extract-content" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.038813 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="93a35959-93aa-4e0e-a171-802b0161fe5c" containerName="extract-content" Nov 22 02:57:05 crc kubenswrapper[4922]: E1122 02:57:05.038827 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="737abb10-2a68-40ba-a6c0-201ae0619ede" containerName="registry-server" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.038835 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="737abb10-2a68-40ba-a6c0-201ae0619ede" containerName="registry-server" Nov 22 02:57:05 crc kubenswrapper[4922]: E1122 02:57:05.038866 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="022b1992-84d3-45ee-ab66-32d93b5aefce" containerName="pruner" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.038874 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="022b1992-84d3-45ee-ab66-32d93b5aefce" containerName="pruner" Nov 22 02:57:05 crc kubenswrapper[4922]: E1122 02:57:05.038888 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e0bcf47-633a-44d1-82b8-90cdf74fa610" containerName="extract-utilities" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.038898 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e0bcf47-633a-44d1-82b8-90cdf74fa610" containerName="extract-utilities" Nov 22 02:57:05 crc kubenswrapper[4922]: E1122 02:57:05.038908 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="737abb10-2a68-40ba-a6c0-201ae0619ede" containerName="extract-utilities" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.038916 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="737abb10-2a68-40ba-a6c0-201ae0619ede" containerName="extract-utilities" Nov 22 02:57:05 crc kubenswrapper[4922]: E1122 02:57:05.038928 4922 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="93a35959-93aa-4e0e-a171-802b0161fe5c" containerName="registry-server" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.038936 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="93a35959-93aa-4e0e-a171-802b0161fe5c" containerName="registry-server" Nov 22 02:57:05 crc kubenswrapper[4922]: E1122 02:57:05.038946 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93a35959-93aa-4e0e-a171-802b0161fe5c" containerName="extract-utilities" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.038955 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="93a35959-93aa-4e0e-a171-802b0161fe5c" containerName="extract-utilities" Nov 22 02:57:05 crc kubenswrapper[4922]: E1122 02:57:05.038967 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e5138dd-6039-4020-969e-6a30b33be2b5" containerName="extract-utilities" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.038975 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e5138dd-6039-4020-969e-6a30b33be2b5" containerName="extract-utilities" Nov 22 02:57:05 crc kubenswrapper[4922]: E1122 02:57:05.038986 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e5138dd-6039-4020-969e-6a30b33be2b5" containerName="extract-content" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.038995 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e5138dd-6039-4020-969e-6a30b33be2b5" containerName="extract-content" Nov 22 02:57:05 crc kubenswrapper[4922]: E1122 02:57:05.039009 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e0bcf47-633a-44d1-82b8-90cdf74fa610" containerName="extract-content" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.039017 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e0bcf47-633a-44d1-82b8-90cdf74fa610" containerName="extract-content" Nov 22 02:57:05 crc kubenswrapper[4922]: E1122 02:57:05.039027 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e0bcf47-633a-44d1-82b8-90cdf74fa610" containerName="registry-server" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.039035 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e0bcf47-633a-44d1-82b8-90cdf74fa610" containerName="registry-server" Nov 22 02:57:05 crc kubenswrapper[4922]: E1122 02:57:05.039046 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f267ec6b-64da-4065-b3aa-2e66ac957118" containerName="oauth-openshift" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.039054 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="f267ec6b-64da-4065-b3aa-2e66ac957118" containerName="oauth-openshift" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.039277 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="022b1992-84d3-45ee-ab66-32d93b5aefce" containerName="pruner" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.039293 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="737abb10-2a68-40ba-a6c0-201ae0619ede" containerName="registry-server" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.039308 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e0bcf47-633a-44d1-82b8-90cdf74fa610" containerName="registry-server" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.039319 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="93a35959-93aa-4e0e-a171-802b0161fe5c" containerName="registry-server" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.039333 4922 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="f267ec6b-64da-4065-b3aa-2e66ac957118" containerName="oauth-openshift" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.039342 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e5138dd-6039-4020-969e-6a30b33be2b5" containerName="registry-server" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.039355 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="200c2995-b401-489c-a63c-016f6e186dd6" containerName="pruner" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.039910 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6686467b65-t5l98" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.052352 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6686467b65-t5l98"] Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.187614 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-user-template-provider-selection\") pod \"f267ec6b-64da-4065-b3aa-2e66ac957118\" (UID: \"f267ec6b-64da-4065-b3aa-2e66ac957118\") " Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.187697 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-system-ocp-branding-template\") pod \"f267ec6b-64da-4065-b3aa-2e66ac957118\" (UID: \"f267ec6b-64da-4065-b3aa-2e66ac957118\") " Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.187758 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-system-serving-cert\") pod \"f267ec6b-64da-4065-b3aa-2e66ac957118\" (UID: \"f267ec6b-64da-4065-b3aa-2e66ac957118\") " Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.187797 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-system-trusted-ca-bundle\") pod \"f267ec6b-64da-4065-b3aa-2e66ac957118\" (UID: \"f267ec6b-64da-4065-b3aa-2e66ac957118\") " Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.187870 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxxl2\" (UniqueName: \"kubernetes.io/projected/f267ec6b-64da-4065-b3aa-2e66ac957118-kube-api-access-pxxl2\") pod \"f267ec6b-64da-4065-b3aa-2e66ac957118\" (UID: \"f267ec6b-64da-4065-b3aa-2e66ac957118\") " Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.187912 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-system-router-certs\") pod \"f267ec6b-64da-4065-b3aa-2e66ac957118\" (UID: \"f267ec6b-64da-4065-b3aa-2e66ac957118\") " Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.187949 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-user-template-error\") pod 
\"f267ec6b-64da-4065-b3aa-2e66ac957118\" (UID: \"f267ec6b-64da-4065-b3aa-2e66ac957118\") " Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.187989 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f267ec6b-64da-4065-b3aa-2e66ac957118-audit-dir\") pod \"f267ec6b-64da-4065-b3aa-2e66ac957118\" (UID: \"f267ec6b-64da-4065-b3aa-2e66ac957118\") " Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.188026 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-system-session\") pod \"f267ec6b-64da-4065-b3aa-2e66ac957118\" (UID: \"f267ec6b-64da-4065-b3aa-2e66ac957118\") " Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.188065 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-user-template-login\") pod \"f267ec6b-64da-4065-b3aa-2e66ac957118\" (UID: \"f267ec6b-64da-4065-b3aa-2e66ac957118\") " Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.188102 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-system-cliconfig\") pod \"f267ec6b-64da-4065-b3aa-2e66ac957118\" (UID: \"f267ec6b-64da-4065-b3aa-2e66ac957118\") " Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.188155 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f267ec6b-64da-4065-b3aa-2e66ac957118-audit-policies\") pod \"f267ec6b-64da-4065-b3aa-2e66ac957118\" (UID: \"f267ec6b-64da-4065-b3aa-2e66ac957118\") " Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.188186 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-user-idp-0-file-data\") pod \"f267ec6b-64da-4065-b3aa-2e66ac957118\" (UID: \"f267ec6b-64da-4065-b3aa-2e66ac957118\") " Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.188241 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-system-service-ca\") pod \"f267ec6b-64da-4065-b3aa-2e66ac957118\" (UID: \"f267ec6b-64da-4065-b3aa-2e66ac957118\") " Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.188456 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4ec92e11-a88b-4223-b2e8-24e660e3e120-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6686467b65-t5l98\" (UID: \"4ec92e11-a88b-4223-b2e8-24e660e3e120\") " pod="openshift-authentication/oauth-openshift-6686467b65-t5l98" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.189360 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4ec92e11-a88b-4223-b2e8-24e660e3e120-audit-dir\") pod \"oauth-openshift-6686467b65-t5l98\" (UID: \"4ec92e11-a88b-4223-b2e8-24e660e3e120\") " 
pod="openshift-authentication/oauth-openshift-6686467b65-t5l98" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.189436 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4ec92e11-a88b-4223-b2e8-24e660e3e120-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6686467b65-t5l98\" (UID: \"4ec92e11-a88b-4223-b2e8-24e660e3e120\") " pod="openshift-authentication/oauth-openshift-6686467b65-t5l98" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.189479 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4ec92e11-a88b-4223-b2e8-24e660e3e120-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6686467b65-t5l98\" (UID: \"4ec92e11-a88b-4223-b2e8-24e660e3e120\") " pod="openshift-authentication/oauth-openshift-6686467b65-t5l98" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.189513 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ec92e11-a88b-4223-b2e8-24e660e3e120-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6686467b65-t5l98\" (UID: \"4ec92e11-a88b-4223-b2e8-24e660e3e120\") " pod="openshift-authentication/oauth-openshift-6686467b65-t5l98" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.189554 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4ec92e11-a88b-4223-b2e8-24e660e3e120-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6686467b65-t5l98\" (UID: \"4ec92e11-a88b-4223-b2e8-24e660e3e120\") " pod="openshift-authentication/oauth-openshift-6686467b65-t5l98" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.188568 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f267ec6b-64da-4065-b3aa-2e66ac957118-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f267ec6b-64da-4065-b3aa-2e66ac957118" (UID: "f267ec6b-64da-4065-b3aa-2e66ac957118"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.189578 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f267ec6b-64da-4065-b3aa-2e66ac957118-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "f267ec6b-64da-4065-b3aa-2e66ac957118" (UID: "f267ec6b-64da-4065-b3aa-2e66ac957118"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.189726 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4ec92e11-a88b-4223-b2e8-24e660e3e120-audit-policies\") pod \"oauth-openshift-6686467b65-t5l98\" (UID: \"4ec92e11-a88b-4223-b2e8-24e660e3e120\") " pod="openshift-authentication/oauth-openshift-6686467b65-t5l98" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.189770 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4ec92e11-a88b-4223-b2e8-24e660e3e120-v4-0-config-user-template-error\") pod \"oauth-openshift-6686467b65-t5l98\" (UID: \"4ec92e11-a88b-4223-b2e8-24e660e3e120\") " pod="openshift-authentication/oauth-openshift-6686467b65-t5l98" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.189826 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4ec92e11-a88b-4223-b2e8-24e660e3e120-v4-0-config-user-template-login\") pod \"oauth-openshift-6686467b65-t5l98\" (UID: \"4ec92e11-a88b-4223-b2e8-24e660e3e120\") " pod="openshift-authentication/oauth-openshift-6686467b65-t5l98" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.189877 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4ec92e11-a88b-4223-b2e8-24e660e3e120-v4-0-config-system-router-certs\") pod \"oauth-openshift-6686467b65-t5l98\" (UID: \"4ec92e11-a88b-4223-b2e8-24e660e3e120\") " pod="openshift-authentication/oauth-openshift-6686467b65-t5l98" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.189909 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57j8m\" (UniqueName: \"kubernetes.io/projected/4ec92e11-a88b-4223-b2e8-24e660e3e120-kube-api-access-57j8m\") pod \"oauth-openshift-6686467b65-t5l98\" (UID: \"4ec92e11-a88b-4223-b2e8-24e660e3e120\") " pod="openshift-authentication/oauth-openshift-6686467b65-t5l98" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.189945 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4ec92e11-a88b-4223-b2e8-24e660e3e120-v4-0-config-system-service-ca\") pod \"oauth-openshift-6686467b65-t5l98\" (UID: \"4ec92e11-a88b-4223-b2e8-24e660e3e120\") " pod="openshift-authentication/oauth-openshift-6686467b65-t5l98" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.189956 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "f267ec6b-64da-4065-b3aa-2e66ac957118" (UID: "f267ec6b-64da-4065-b3aa-2e66ac957118"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.189982 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4ec92e11-a88b-4223-b2e8-24e660e3e120-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6686467b65-t5l98\" (UID: \"4ec92e11-a88b-4223-b2e8-24e660e3e120\") " pod="openshift-authentication/oauth-openshift-6686467b65-t5l98" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.190208 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "f267ec6b-64da-4065-b3aa-2e66ac957118" (UID: "f267ec6b-64da-4065-b3aa-2e66ac957118"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.190222 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4ec92e11-a88b-4223-b2e8-24e660e3e120-v4-0-config-system-session\") pod \"oauth-openshift-6686467b65-t5l98\" (UID: \"4ec92e11-a88b-4223-b2e8-24e660e3e120\") " pod="openshift-authentication/oauth-openshift-6686467b65-t5l98" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.190371 4922 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f267ec6b-64da-4065-b3aa-2e66ac957118-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.190404 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.190426 4922 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f267ec6b-64da-4065-b3aa-2e66ac957118-audit-dir\") on node \"crc\" DevicePath \"\"" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.190444 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.190870 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "f267ec6b-64da-4065-b3aa-2e66ac957118" (UID: "f267ec6b-64da-4065-b3aa-2e66ac957118"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.195390 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "f267ec6b-64da-4065-b3aa-2e66ac957118" (UID: "f267ec6b-64da-4065-b3aa-2e66ac957118"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.195666 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "f267ec6b-64da-4065-b3aa-2e66ac957118" (UID: "f267ec6b-64da-4065-b3aa-2e66ac957118"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.196227 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "f267ec6b-64da-4065-b3aa-2e66ac957118" (UID: "f267ec6b-64da-4065-b3aa-2e66ac957118"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.196686 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "f267ec6b-64da-4065-b3aa-2e66ac957118" (UID: "f267ec6b-64da-4065-b3aa-2e66ac957118"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.196993 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "f267ec6b-64da-4065-b3aa-2e66ac957118" (UID: "f267ec6b-64da-4065-b3aa-2e66ac957118"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.196998 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f267ec6b-64da-4065-b3aa-2e66ac957118-kube-api-access-pxxl2" (OuterVolumeSpecName: "kube-api-access-pxxl2") pod "f267ec6b-64da-4065-b3aa-2e66ac957118" (UID: "f267ec6b-64da-4065-b3aa-2e66ac957118"). InnerVolumeSpecName "kube-api-access-pxxl2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.197128 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "f267ec6b-64da-4065-b3aa-2e66ac957118" (UID: "f267ec6b-64da-4065-b3aa-2e66ac957118"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.197517 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "f267ec6b-64da-4065-b3aa-2e66ac957118" (UID: "f267ec6b-64da-4065-b3aa-2e66ac957118"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.197815 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "f267ec6b-64da-4065-b3aa-2e66ac957118" (UID: "f267ec6b-64da-4065-b3aa-2e66ac957118"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.287044 4922 generic.go:334] "Generic (PLEG): container finished" podID="f267ec6b-64da-4065-b3aa-2e66ac957118" containerID="2c9264bdaf83279110abdad498554b08d4b6c98fb291e930780fef392749b6e5" exitCode=0 Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.287126 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-7zzh7" event={"ID":"f267ec6b-64da-4065-b3aa-2e66ac957118","Type":"ContainerDied","Data":"2c9264bdaf83279110abdad498554b08d4b6c98fb291e930780fef392749b6e5"} Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.287188 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-7zzh7" event={"ID":"f267ec6b-64da-4065-b3aa-2e66ac957118","Type":"ContainerDied","Data":"b3b933d0f6880606c77fc6b3d42b6472ba327884ff46f457cf096806d13440d3"} Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.287217 4922 scope.go:117] "RemoveContainer" containerID="2c9264bdaf83279110abdad498554b08d4b6c98fb291e930780fef392749b6e5" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.287086 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-7zzh7" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.291625 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4ec92e11-a88b-4223-b2e8-24e660e3e120-v4-0-config-system-service-ca\") pod \"oauth-openshift-6686467b65-t5l98\" (UID: \"4ec92e11-a88b-4223-b2e8-24e660e3e120\") " pod="openshift-authentication/oauth-openshift-6686467b65-t5l98" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.291718 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4ec92e11-a88b-4223-b2e8-24e660e3e120-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6686467b65-t5l98\" (UID: \"4ec92e11-a88b-4223-b2e8-24e660e3e120\") " pod="openshift-authentication/oauth-openshift-6686467b65-t5l98" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.291804 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4ec92e11-a88b-4223-b2e8-24e660e3e120-v4-0-config-system-session\") pod \"oauth-openshift-6686467b65-t5l98\" (UID: \"4ec92e11-a88b-4223-b2e8-24e660e3e120\") " pod="openshift-authentication/oauth-openshift-6686467b65-t5l98" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.291889 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4ec92e11-a88b-4223-b2e8-24e660e3e120-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6686467b65-t5l98\" (UID: 
\"4ec92e11-a88b-4223-b2e8-24e660e3e120\") " pod="openshift-authentication/oauth-openshift-6686467b65-t5l98" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.291943 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4ec92e11-a88b-4223-b2e8-24e660e3e120-audit-dir\") pod \"oauth-openshift-6686467b65-t5l98\" (UID: \"4ec92e11-a88b-4223-b2e8-24e660e3e120\") " pod="openshift-authentication/oauth-openshift-6686467b65-t5l98" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.291983 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4ec92e11-a88b-4223-b2e8-24e660e3e120-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6686467b65-t5l98\" (UID: \"4ec92e11-a88b-4223-b2e8-24e660e3e120\") " pod="openshift-authentication/oauth-openshift-6686467b65-t5l98" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.292021 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4ec92e11-a88b-4223-b2e8-24e660e3e120-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6686467b65-t5l98\" (UID: \"4ec92e11-a88b-4223-b2e8-24e660e3e120\") " pod="openshift-authentication/oauth-openshift-6686467b65-t5l98" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.292054 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ec92e11-a88b-4223-b2e8-24e660e3e120-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6686467b65-t5l98\" (UID: \"4ec92e11-a88b-4223-b2e8-24e660e3e120\") " pod="openshift-authentication/oauth-openshift-6686467b65-t5l98" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.292090 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4ec92e11-a88b-4223-b2e8-24e660e3e120-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6686467b65-t5l98\" (UID: \"4ec92e11-a88b-4223-b2e8-24e660e3e120\") " pod="openshift-authentication/oauth-openshift-6686467b65-t5l98" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.292135 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4ec92e11-a88b-4223-b2e8-24e660e3e120-audit-policies\") pod \"oauth-openshift-6686467b65-t5l98\" (UID: \"4ec92e11-a88b-4223-b2e8-24e660e3e120\") " pod="openshift-authentication/oauth-openshift-6686467b65-t5l98" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.292149 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4ec92e11-a88b-4223-b2e8-24e660e3e120-audit-dir\") pod \"oauth-openshift-6686467b65-t5l98\" (UID: \"4ec92e11-a88b-4223-b2e8-24e660e3e120\") " pod="openshift-authentication/oauth-openshift-6686467b65-t5l98" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.292175 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4ec92e11-a88b-4223-b2e8-24e660e3e120-v4-0-config-user-template-error\") pod \"oauth-openshift-6686467b65-t5l98\" (UID: \"4ec92e11-a88b-4223-b2e8-24e660e3e120\") " 
pod="openshift-authentication/oauth-openshift-6686467b65-t5l98" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.292283 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4ec92e11-a88b-4223-b2e8-24e660e3e120-v4-0-config-user-template-login\") pod \"oauth-openshift-6686467b65-t5l98\" (UID: \"4ec92e11-a88b-4223-b2e8-24e660e3e120\") " pod="openshift-authentication/oauth-openshift-6686467b65-t5l98" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.292327 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4ec92e11-a88b-4223-b2e8-24e660e3e120-v4-0-config-system-router-certs\") pod \"oauth-openshift-6686467b65-t5l98\" (UID: \"4ec92e11-a88b-4223-b2e8-24e660e3e120\") " pod="openshift-authentication/oauth-openshift-6686467b65-t5l98" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.292373 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57j8m\" (UniqueName: \"kubernetes.io/projected/4ec92e11-a88b-4223-b2e8-24e660e3e120-kube-api-access-57j8m\") pod \"oauth-openshift-6686467b65-t5l98\" (UID: \"4ec92e11-a88b-4223-b2e8-24e660e3e120\") " pod="openshift-authentication/oauth-openshift-6686467b65-t5l98" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.292402 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4ec92e11-a88b-4223-b2e8-24e660e3e120-v4-0-config-system-service-ca\") pod \"oauth-openshift-6686467b65-t5l98\" (UID: \"4ec92e11-a88b-4223-b2e8-24e660e3e120\") " pod="openshift-authentication/oauth-openshift-6686467b65-t5l98" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.292586 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.293261 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ec92e11-a88b-4223-b2e8-24e660e3e120-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6686467b65-t5l98\" (UID: \"4ec92e11-a88b-4223-b2e8-24e660e3e120\") " pod="openshift-authentication/oauth-openshift-6686467b65-t5l98" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.293273 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4ec92e11-a88b-4223-b2e8-24e660e3e120-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6686467b65-t5l98\" (UID: \"4ec92e11-a88b-4223-b2e8-24e660e3e120\") " pod="openshift-authentication/oauth-openshift-6686467b65-t5l98" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.293752 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.293901 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.294009 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.294111 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.293815 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4ec92e11-a88b-4223-b2e8-24e660e3e120-audit-policies\") pod \"oauth-openshift-6686467b65-t5l98\" (UID: \"4ec92e11-a88b-4223-b2e8-24e660e3e120\") " pod="openshift-authentication/oauth-openshift-6686467b65-t5l98" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.294194 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxxl2\" (UniqueName: \"kubernetes.io/projected/f267ec6b-64da-4065-b3aa-2e66ac957118-kube-api-access-pxxl2\") on node \"crc\" DevicePath \"\"" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.294300 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.294334 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.294367 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.294396 4922 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f267ec6b-64da-4065-b3aa-2e66ac957118-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.296075 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4ec92e11-a88b-4223-b2e8-24e660e3e120-v4-0-config-user-template-error\") pod \"oauth-openshift-6686467b65-t5l98\" (UID: \"4ec92e11-a88b-4223-b2e8-24e660e3e120\") " pod="openshift-authentication/oauth-openshift-6686467b65-t5l98" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.296293 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4ec92e11-a88b-4223-b2e8-24e660e3e120-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6686467b65-t5l98\" (UID: \"4ec92e11-a88b-4223-b2e8-24e660e3e120\") " pod="openshift-authentication/oauth-openshift-6686467b65-t5l98" Nov 22 02:57:05 crc 
kubenswrapper[4922]: I1122 02:57:05.296631 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4ec92e11-a88b-4223-b2e8-24e660e3e120-v4-0-config-system-router-certs\") pod \"oauth-openshift-6686467b65-t5l98\" (UID: \"4ec92e11-a88b-4223-b2e8-24e660e3e120\") " pod="openshift-authentication/oauth-openshift-6686467b65-t5l98" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.296762 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4ec92e11-a88b-4223-b2e8-24e660e3e120-v4-0-config-user-template-login\") pod \"oauth-openshift-6686467b65-t5l98\" (UID: \"4ec92e11-a88b-4223-b2e8-24e660e3e120\") " pod="openshift-authentication/oauth-openshift-6686467b65-t5l98" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.297047 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4ec92e11-a88b-4223-b2e8-24e660e3e120-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6686467b65-t5l98\" (UID: \"4ec92e11-a88b-4223-b2e8-24e660e3e120\") " pod="openshift-authentication/oauth-openshift-6686467b65-t5l98" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.298600 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4ec92e11-a88b-4223-b2e8-24e660e3e120-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6686467b65-t5l98\" (UID: \"4ec92e11-a88b-4223-b2e8-24e660e3e120\") " pod="openshift-authentication/oauth-openshift-6686467b65-t5l98" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.299066 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4ec92e11-a88b-4223-b2e8-24e660e3e120-v4-0-config-system-session\") pod \"oauth-openshift-6686467b65-t5l98\" (UID: \"4ec92e11-a88b-4223-b2e8-24e660e3e120\") " pod="openshift-authentication/oauth-openshift-6686467b65-t5l98" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.301905 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4ec92e11-a88b-4223-b2e8-24e660e3e120-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6686467b65-t5l98\" (UID: \"4ec92e11-a88b-4223-b2e8-24e660e3e120\") " pod="openshift-authentication/oauth-openshift-6686467b65-t5l98" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.310918 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57j8m\" (UniqueName: \"kubernetes.io/projected/4ec92e11-a88b-4223-b2e8-24e660e3e120-kube-api-access-57j8m\") pod \"oauth-openshift-6686467b65-t5l98\" (UID: \"4ec92e11-a88b-4223-b2e8-24e660e3e120\") " pod="openshift-authentication/oauth-openshift-6686467b65-t5l98" Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.317584 4922 scope.go:117] "RemoveContainer" containerID="2c9264bdaf83279110abdad498554b08d4b6c98fb291e930780fef392749b6e5" Nov 22 02:57:05 crc kubenswrapper[4922]: E1122 02:57:05.318930 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c9264bdaf83279110abdad498554b08d4b6c98fb291e930780fef392749b6e5\": container with ID starting with 
Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.319081 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c9264bdaf83279110abdad498554b08d4b6c98fb291e930780fef392749b6e5"} err="failed to get container status \"2c9264bdaf83279110abdad498554b08d4b6c98fb291e930780fef392749b6e5\": rpc error: code = NotFound desc = could not find container \"2c9264bdaf83279110abdad498554b08d4b6c98fb291e930780fef392749b6e5\": container with ID starting with 2c9264bdaf83279110abdad498554b08d4b6c98fb291e930780fef392749b6e5 not found: ID does not exist"
Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.321628 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7zzh7"]
Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.324255 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7zzh7"]
Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.365169 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6686467b65-t5l98"
Nov 22 02:57:05 crc kubenswrapper[4922]: I1122 02:57:05.553490 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6686467b65-t5l98"]
Nov 22 02:57:06 crc kubenswrapper[4922]: I1122 02:57:06.297416 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6686467b65-t5l98" event={"ID":"4ec92e11-a88b-4223-b2e8-24e660e3e120","Type":"ContainerStarted","Data":"49adee008afb0e9cf088bcf8f9c863d7eacb329d0a7e00466328140bfb68d779"}
Nov 22 02:57:06 crc kubenswrapper[4922]: I1122 02:57:06.298074 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6686467b65-t5l98" event={"ID":"4ec92e11-a88b-4223-b2e8-24e660e3e120","Type":"ContainerStarted","Data":"d66bb205e7168586ea5236db0552ece47ec34015b9c266f4e98a02f849d5a270"}
Nov 22 02:57:06 crc kubenswrapper[4922]: I1122 02:57:06.298116 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6686467b65-t5l98"
Nov 22 02:57:06 crc kubenswrapper[4922]: I1122 02:57:06.329303 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6686467b65-t5l98" podStartSLOduration=27.32927922 podStartE2EDuration="27.32927922s" podCreationTimestamp="2025-11-22 02:56:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:57:06.327501339 +0000 UTC m=+262.366023261" watchObservedRunningTime="2025-11-22 02:57:06.32927922 +0000 UTC m=+262.367801102"
Nov 22 02:57:06 crc kubenswrapper[4922]: I1122 02:57:06.431170 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6686467b65-t5l98"
Nov 22 02:57:07 crc kubenswrapper[4922]: I1122 02:57:07.308606 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f267ec6b-64da-4065-b3aa-2e66ac957118" path="/var/lib/kubelet/pods/f267ec6b-64da-4065-b3aa-2e66ac957118/volumes"
Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.096839 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h8sq5"]
Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.097972 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-h8sq5" podUID="6d590121-2d31-483c-9c86-14b40c9d23ad" containerName="registry-server" containerID="cri-o://347cf45992cba1a0276f51e25d8edae115792e875f79f58a2a49dc82cd91fae7" gracePeriod=30
Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.104693 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-t48nh"]
Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.105275 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-t48nh" podUID="81c367c0-8b10-4ce9-aa76-290449c7df39" containerName="registry-server" containerID="cri-o://66f64925599505b38d59a69a9c727a3433429335a4596cae4d15bb1ad0af0994" gracePeriod=30
Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.115532 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cm895"]
Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.115891 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-cm895" podUID="156920b9-f91f-4053-be05-3be9c55f09b1" containerName="marketplace-operator" containerID="cri-o://4343efaf8b9adfbf4ae9a84598f357867c86b637dbdb7eabb66afbcbbfc57ab7" gracePeriod=30
Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.132697 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p5clf"]
Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.133119 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-p5clf" podUID="24fb7ac2-dd84-402b-9c03-b0c29174fa6c" containerName="registry-server" containerID="cri-o://763af7f08fda0b49cdab64db1c079b79698546641c3a94167177b765565f5d8d" gracePeriod=30
Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.144988 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fjwv8"]
Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.146079 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-fjwv8"
Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.152056 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t5m25"]
Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.152383 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-t5m25" podUID="ffc1fad9-16d3-4a1d-83b7-0ffa9796d506" containerName="registry-server" containerID="cri-o://68b21b04d55a18115bdd10e43258ea93509e8dd727fd564cbc02c2e44181b985" gracePeriod=30
Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.155610 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fjwv8"]
Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.236240 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2464e274-acb6-4ae6-aafb-c76c1a3a9ef0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-fjwv8\" (UID: \"2464e274-acb6-4ae6-aafb-c76c1a3a9ef0\") " pod="openshift-marketplace/marketplace-operator-79b997595-fjwv8"
Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.236310 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2464e274-acb6-4ae6-aafb-c76c1a3a9ef0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-fjwv8\" (UID: \"2464e274-acb6-4ae6-aafb-c76c1a3a9ef0\") " pod="openshift-marketplace/marketplace-operator-79b997595-fjwv8"
Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.236482 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6hqp\" (UniqueName: \"kubernetes.io/projected/2464e274-acb6-4ae6-aafb-c76c1a3a9ef0-kube-api-access-x6hqp\") pod \"marketplace-operator-79b997595-fjwv8\" (UID: \"2464e274-acb6-4ae6-aafb-c76c1a3a9ef0\") " pod="openshift-marketplace/marketplace-operator-79b997595-fjwv8"
Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.337810 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6hqp\" (UniqueName: \"kubernetes.io/projected/2464e274-acb6-4ae6-aafb-c76c1a3a9ef0-kube-api-access-x6hqp\") pod \"marketplace-operator-79b997595-fjwv8\" (UID: \"2464e274-acb6-4ae6-aafb-c76c1a3a9ef0\") " pod="openshift-marketplace/marketplace-operator-79b997595-fjwv8"
Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.337990 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2464e274-acb6-4ae6-aafb-c76c1a3a9ef0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-fjwv8\" (UID: \"2464e274-acb6-4ae6-aafb-c76c1a3a9ef0\") " pod="openshift-marketplace/marketplace-operator-79b997595-fjwv8"
Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.338058 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2464e274-acb6-4ae6-aafb-c76c1a3a9ef0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-fjwv8\" (UID: \"2464e274-acb6-4ae6-aafb-c76c1a3a9ef0\") " pod="openshift-marketplace/marketplace-operator-79b997595-fjwv8"
Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.340499 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2464e274-acb6-4ae6-aafb-c76c1a3a9ef0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-fjwv8\" (UID: \"2464e274-acb6-4ae6-aafb-c76c1a3a9ef0\") " pod="openshift-marketplace/marketplace-operator-79b997595-fjwv8"
Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.353469 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2464e274-acb6-4ae6-aafb-c76c1a3a9ef0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-fjwv8\" (UID: \"2464e274-acb6-4ae6-aafb-c76c1a3a9ef0\") " pod="openshift-marketplace/marketplace-operator-79b997595-fjwv8"
Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.361799 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6hqp\" (UniqueName: \"kubernetes.io/projected/2464e274-acb6-4ae6-aafb-c76c1a3a9ef0-kube-api-access-x6hqp\") pod \"marketplace-operator-79b997595-fjwv8\" (UID: \"2464e274-acb6-4ae6-aafb-c76c1a3a9ef0\") " pod="openshift-marketplace/marketplace-operator-79b997595-fjwv8"
Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.413929 4922 generic.go:334] "Generic (PLEG): container finished" podID="156920b9-f91f-4053-be05-3be9c55f09b1" containerID="4343efaf8b9adfbf4ae9a84598f357867c86b637dbdb7eabb66afbcbbfc57ab7" exitCode=0
Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.413991 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-cm895" event={"ID":"156920b9-f91f-4053-be05-3be9c55f09b1","Type":"ContainerDied","Data":"4343efaf8b9adfbf4ae9a84598f357867c86b637dbdb7eabb66afbcbbfc57ab7"}
Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.417319 4922 generic.go:334] "Generic (PLEG): container finished" podID="81c367c0-8b10-4ce9-aa76-290449c7df39" containerID="66f64925599505b38d59a69a9c727a3433429335a4596cae4d15bb1ad0af0994" exitCode=0
Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.417372 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t48nh" event={"ID":"81c367c0-8b10-4ce9-aa76-290449c7df39","Type":"ContainerDied","Data":"66f64925599505b38d59a69a9c727a3433429335a4596cae4d15bb1ad0af0994"}
Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.419888 4922 generic.go:334] "Generic (PLEG): container finished" podID="6d590121-2d31-483c-9c86-14b40c9d23ad" containerID="347cf45992cba1a0276f51e25d8edae115792e875f79f58a2a49dc82cd91fae7" exitCode=0
Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.419894 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h8sq5" event={"ID":"6d590121-2d31-483c-9c86-14b40c9d23ad","Type":"ContainerDied","Data":"347cf45992cba1a0276f51e25d8edae115792e875f79f58a2a49dc82cd91fae7"}
Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.427611 4922 generic.go:334] "Generic (PLEG): container finished" podID="24fb7ac2-dd84-402b-9c03-b0c29174fa6c" containerID="763af7f08fda0b49cdab64db1c079b79698546641c3a94167177b765565f5d8d" exitCode=0
Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.427852 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p5clf" event={"ID":"24fb7ac2-dd84-402b-9c03-b0c29174fa6c","Type":"ContainerDied","Data":"763af7f08fda0b49cdab64db1c079b79698546641c3a94167177b765565f5d8d"}
Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.431356 4922 generic.go:334] "Generic (PLEG): container finished" podID="ffc1fad9-16d3-4a1d-83b7-0ffa9796d506" containerID="68b21b04d55a18115bdd10e43258ea93509e8dd727fd564cbc02c2e44181b985" exitCode=0
Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.431419 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t5m25" event={"ID":"ffc1fad9-16d3-4a1d-83b7-0ffa9796d506","Type":"ContainerDied","Data":"68b21b04d55a18115bdd10e43258ea93509e8dd727fd564cbc02c2e44181b985"}
Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.542254 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-fjwv8"
Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.551183 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h8sq5"
Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.553196 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-cm895"
Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.559002 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t48nh"
Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.570918 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p5clf"
Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.587382 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t5m25"
Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.750017 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmwqz\" (UniqueName: \"kubernetes.io/projected/6d590121-2d31-483c-9c86-14b40c9d23ad-kube-api-access-dmwqz\") pod \"6d590121-2d31-483c-9c86-14b40c9d23ad\" (UID: \"6d590121-2d31-483c-9c86-14b40c9d23ad\") "
Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.750098 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whrr7\" (UniqueName: \"kubernetes.io/projected/ffc1fad9-16d3-4a1d-83b7-0ffa9796d506-kube-api-access-whrr7\") pod \"ffc1fad9-16d3-4a1d-83b7-0ffa9796d506\" (UID: \"ffc1fad9-16d3-4a1d-83b7-0ffa9796d506\") "
Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.750159 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26v49\" (UniqueName: \"kubernetes.io/projected/81c367c0-8b10-4ce9-aa76-290449c7df39-kube-api-access-26v49\") pod \"81c367c0-8b10-4ce9-aa76-290449c7df39\" (UID: \"81c367c0-8b10-4ce9-aa76-290449c7df39\") "
Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.750194 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81c367c0-8b10-4ce9-aa76-290449c7df39-utilities\") pod \"81c367c0-8b10-4ce9-aa76-290449c7df39\" (UID: \"81c367c0-8b10-4ce9-aa76-290449c7df39\") "
Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.750234 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bbpq\" (UniqueName: \"kubernetes.io/projected/156920b9-f91f-4053-be05-3be9c55f09b1-kube-api-access-2bbpq\") pod \"156920b9-f91f-4053-be05-3be9c55f09b1\" (UID: \"156920b9-f91f-4053-be05-3be9c55f09b1\") "
Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.750270 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/156920b9-f91f-4053-be05-3be9c55f09b1-marketplace-trusted-ca\") pod \"156920b9-f91f-4053-be05-3be9c55f09b1\" (UID: \"156920b9-f91f-4053-be05-3be9c55f09b1\") "
Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.750309 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffc1fad9-16d3-4a1d-83b7-0ffa9796d506-utilities\") pod \"ffc1fad9-16d3-4a1d-83b7-0ffa9796d506\" (UID: \"ffc1fad9-16d3-4a1d-83b7-0ffa9796d506\") "
Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.750333 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwhg7\" (UniqueName: \"kubernetes.io/projected/24fb7ac2-dd84-402b-9c03-b0c29174fa6c-kube-api-access-qwhg7\") pod \"24fb7ac2-dd84-402b-9c03-b0c29174fa6c\" (UID: \"24fb7ac2-dd84-402b-9c03-b0c29174fa6c\") "
Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.750359 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/156920b9-f91f-4053-be05-3be9c55f09b1-marketplace-operator-metrics\") pod \"156920b9-f91f-4053-be05-3be9c55f09b1\" (UID: \"156920b9-f91f-4053-be05-3be9c55f09b1\") "
Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.750388 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24fb7ac2-dd84-402b-9c03-b0c29174fa6c-utilities\") pod \"24fb7ac2-dd84-402b-9c03-b0c29174fa6c\" (UID: \"24fb7ac2-dd84-402b-9c03-b0c29174fa6c\") "
Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.750411 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d590121-2d31-483c-9c86-14b40c9d23ad-utilities\") pod \"6d590121-2d31-483c-9c86-14b40c9d23ad\" (UID: \"6d590121-2d31-483c-9c86-14b40c9d23ad\") "
Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.750454 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81c367c0-8b10-4ce9-aa76-290449c7df39-catalog-content\") pod \"81c367c0-8b10-4ce9-aa76-290449c7df39\" (UID: \"81c367c0-8b10-4ce9-aa76-290449c7df39\") "
Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.750479 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24fb7ac2-dd84-402b-9c03-b0c29174fa6c-catalog-content\") pod \"24fb7ac2-dd84-402b-9c03-b0c29174fa6c\" (UID: \"24fb7ac2-dd84-402b-9c03-b0c29174fa6c\") "
Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.750510 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffc1fad9-16d3-4a1d-83b7-0ffa9796d506-catalog-content\") pod \"ffc1fad9-16d3-4a1d-83b7-0ffa9796d506\" (UID: \"ffc1fad9-16d3-4a1d-83b7-0ffa9796d506\") "
Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.750550 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d590121-2d31-483c-9c86-14b40c9d23ad-catalog-content\") pod \"6d590121-2d31-483c-9c86-14b40c9d23ad\" (UID: \"6d590121-2d31-483c-9c86-14b40c9d23ad\") "
\"6d590121-2d31-483c-9c86-14b40c9d23ad\" (UID: \"6d590121-2d31-483c-9c86-14b40c9d23ad\") " Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.751600 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81c367c0-8b10-4ce9-aa76-290449c7df39-utilities" (OuterVolumeSpecName: "utilities") pod "81c367c0-8b10-4ce9-aa76-290449c7df39" (UID: "81c367c0-8b10-4ce9-aa76-290449c7df39"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.751778 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffc1fad9-16d3-4a1d-83b7-0ffa9796d506-utilities" (OuterVolumeSpecName: "utilities") pod "ffc1fad9-16d3-4a1d-83b7-0ffa9796d506" (UID: "ffc1fad9-16d3-4a1d-83b7-0ffa9796d506"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.752198 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/156920b9-f91f-4053-be05-3be9c55f09b1-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "156920b9-f91f-4053-be05-3be9c55f09b1" (UID: "156920b9-f91f-4053-be05-3be9c55f09b1"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.752305 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24fb7ac2-dd84-402b-9c03-b0c29174fa6c-utilities" (OuterVolumeSpecName: "utilities") pod "24fb7ac2-dd84-402b-9c03-b0c29174fa6c" (UID: "24fb7ac2-dd84-402b-9c03-b0c29174fa6c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.752440 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81c367c0-8b10-4ce9-aa76-290449c7df39-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.752462 4922 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/156920b9-f91f-4053-be05-3be9c55f09b1-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.752473 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffc1fad9-16d3-4a1d-83b7-0ffa9796d506-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.752482 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24fb7ac2-dd84-402b-9c03-b0c29174fa6c-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.752829 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d590121-2d31-483c-9c86-14b40c9d23ad-utilities" (OuterVolumeSpecName: "utilities") pod "6d590121-2d31-483c-9c86-14b40c9d23ad" (UID: "6d590121-2d31-483c-9c86-14b40c9d23ad"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.762796 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/156920b9-f91f-4053-be05-3be9c55f09b1-kube-api-access-2bbpq" (OuterVolumeSpecName: "kube-api-access-2bbpq") pod "156920b9-f91f-4053-be05-3be9c55f09b1" (UID: "156920b9-f91f-4053-be05-3be9c55f09b1"). InnerVolumeSpecName "kube-api-access-2bbpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.770852 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d590121-2d31-483c-9c86-14b40c9d23ad-kube-api-access-dmwqz" (OuterVolumeSpecName: "kube-api-access-dmwqz") pod "6d590121-2d31-483c-9c86-14b40c9d23ad" (UID: "6d590121-2d31-483c-9c86-14b40c9d23ad"). InnerVolumeSpecName "kube-api-access-dmwqz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.770821 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffc1fad9-16d3-4a1d-83b7-0ffa9796d506-kube-api-access-whrr7" (OuterVolumeSpecName: "kube-api-access-whrr7") pod "ffc1fad9-16d3-4a1d-83b7-0ffa9796d506" (UID: "ffc1fad9-16d3-4a1d-83b7-0ffa9796d506"). InnerVolumeSpecName "kube-api-access-whrr7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.771015 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81c367c0-8b10-4ce9-aa76-290449c7df39-kube-api-access-26v49" (OuterVolumeSpecName: "kube-api-access-26v49") pod "81c367c0-8b10-4ce9-aa76-290449c7df39" (UID: "81c367c0-8b10-4ce9-aa76-290449c7df39"). InnerVolumeSpecName "kube-api-access-26v49". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.771133 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/156920b9-f91f-4053-be05-3be9c55f09b1-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "156920b9-f91f-4053-be05-3be9c55f09b1" (UID: "156920b9-f91f-4053-be05-3be9c55f09b1"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.771370 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24fb7ac2-dd84-402b-9c03-b0c29174fa6c-kube-api-access-qwhg7" (OuterVolumeSpecName: "kube-api-access-qwhg7") pod "24fb7ac2-dd84-402b-9c03-b0c29174fa6c" (UID: "24fb7ac2-dd84-402b-9c03-b0c29174fa6c"). InnerVolumeSpecName "kube-api-access-qwhg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.784138 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24fb7ac2-dd84-402b-9c03-b0c29174fa6c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "24fb7ac2-dd84-402b-9c03-b0c29174fa6c" (UID: "24fb7ac2-dd84-402b-9c03-b0c29174fa6c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.794267 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fjwv8"] Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.824415 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81c367c0-8b10-4ce9-aa76-290449c7df39-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "81c367c0-8b10-4ce9-aa76-290449c7df39" (UID: "81c367c0-8b10-4ce9-aa76-290449c7df39"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.836757 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d590121-2d31-483c-9c86-14b40c9d23ad-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6d590121-2d31-483c-9c86-14b40c9d23ad" (UID: "6d590121-2d31-483c-9c86-14b40c9d23ad"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.853262 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d590121-2d31-483c-9c86-14b40c9d23ad-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.853294 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmwqz\" (UniqueName: \"kubernetes.io/projected/6d590121-2d31-483c-9c86-14b40c9d23ad-kube-api-access-dmwqz\") on node \"crc\" DevicePath \"\"" Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.853308 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whrr7\" (UniqueName: \"kubernetes.io/projected/ffc1fad9-16d3-4a1d-83b7-0ffa9796d506-kube-api-access-whrr7\") on node \"crc\" DevicePath \"\"" Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.853319 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26v49\" (UniqueName: \"kubernetes.io/projected/81c367c0-8b10-4ce9-aa76-290449c7df39-kube-api-access-26v49\") on node \"crc\" DevicePath \"\"" Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.853329 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bbpq\" (UniqueName: \"kubernetes.io/projected/156920b9-f91f-4053-be05-3be9c55f09b1-kube-api-access-2bbpq\") on node \"crc\" DevicePath \"\"" Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.853338 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwhg7\" (UniqueName: \"kubernetes.io/projected/24fb7ac2-dd84-402b-9c03-b0c29174fa6c-kube-api-access-qwhg7\") on node \"crc\" DevicePath \"\"" Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.853346 4922 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/156920b9-f91f-4053-be05-3be9c55f09b1-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.853356 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d590121-2d31-483c-9c86-14b40c9d23ad-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.853365 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/81c367c0-8b10-4ce9-aa76-290449c7df39-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.853374 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24fb7ac2-dd84-402b-9c03-b0c29174fa6c-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.874218 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffc1fad9-16d3-4a1d-83b7-0ffa9796d506-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ffc1fad9-16d3-4a1d-83b7-0ffa9796d506" (UID: "ffc1fad9-16d3-4a1d-83b7-0ffa9796d506"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 02:57:22 crc kubenswrapper[4922]: I1122 02:57:22.954115 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffc1fad9-16d3-4a1d-83b7-0ffa9796d506-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 02:57:23 crc kubenswrapper[4922]: I1122 02:57:23.438489 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p5clf" event={"ID":"24fb7ac2-dd84-402b-9c03-b0c29174fa6c","Type":"ContainerDied","Data":"cc382ecf9a92b22cca0a3f2688aea2bfc0395de880bdae43925ba3962a9c93dc"} Nov 22 02:57:23 crc kubenswrapper[4922]: I1122 02:57:23.438545 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p5clf" Nov 22 02:57:23 crc kubenswrapper[4922]: I1122 02:57:23.438566 4922 scope.go:117] "RemoveContainer" containerID="763af7f08fda0b49cdab64db1c079b79698546641c3a94167177b765565f5d8d" Nov 22 02:57:23 crc kubenswrapper[4922]: I1122 02:57:23.441882 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t5m25" event={"ID":"ffc1fad9-16d3-4a1d-83b7-0ffa9796d506","Type":"ContainerDied","Data":"627d4df432e58721e6155267a27eb70229dcbceddb7ff494826c8c25e6e37e5e"} Nov 22 02:57:23 crc kubenswrapper[4922]: I1122 02:57:23.442250 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t5m25" Nov 22 02:57:23 crc kubenswrapper[4922]: I1122 02:57:23.447484 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-cm895" event={"ID":"156920b9-f91f-4053-be05-3be9c55f09b1","Type":"ContainerDied","Data":"2d4ff02e389d8c499c92a70af4d463ed2adc19aabea98fdde7ad99fa15f8b7f6"} Nov 22 02:57:23 crc kubenswrapper[4922]: I1122 02:57:23.447596 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-cm895" Nov 22 02:57:23 crc kubenswrapper[4922]: I1122 02:57:23.461363 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t48nh" event={"ID":"81c367c0-8b10-4ce9-aa76-290449c7df39","Type":"ContainerDied","Data":"5c17d39ea7d0a421b8a86382a2730433358a35ae500d99138cf9baf7d6dc68e0"} Nov 22 02:57:23 crc kubenswrapper[4922]: I1122 02:57:23.461464 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-t48nh" Nov 22 02:57:23 crc kubenswrapper[4922]: I1122 02:57:23.463075 4922 scope.go:117] "RemoveContainer" containerID="94350a78648e6411b09238a3c99d2420901cde6b6f7fa6c980bc6bb18f75b91e" Nov 22 02:57:23 crc kubenswrapper[4922]: I1122 02:57:23.466397 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p5clf"] Nov 22 02:57:23 crc kubenswrapper[4922]: I1122 02:57:23.467118 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h8sq5" event={"ID":"6d590121-2d31-483c-9c86-14b40c9d23ad","Type":"ContainerDied","Data":"93a744045948568508b5ffe24d5609a53ea8c737ec83296a34ee7369725698c7"} Nov 22 02:57:23 crc kubenswrapper[4922]: I1122 02:57:23.468089 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h8sq5" Nov 22 02:57:23 crc kubenswrapper[4922]: I1122 02:57:23.469094 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-fjwv8" event={"ID":"2464e274-acb6-4ae6-aafb-c76c1a3a9ef0","Type":"ContainerStarted","Data":"67e43432588c1932bd0720424c9c71c52182aeb2a6c1d9ea7b1ae9acd965411c"} Nov 22 02:57:23 crc kubenswrapper[4922]: I1122 02:57:23.469120 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-fjwv8" event={"ID":"2464e274-acb6-4ae6-aafb-c76c1a3a9ef0","Type":"ContainerStarted","Data":"7e283addca6058927d2100a3e716f4f3bcef151351ad0742e15a833bf87eb068"} Nov 22 02:57:23 crc kubenswrapper[4922]: I1122 02:57:23.469788 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-fjwv8" Nov 22 02:57:23 crc kubenswrapper[4922]: I1122 02:57:23.474878 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-p5clf"] Nov 22 02:57:23 crc kubenswrapper[4922]: I1122 02:57:23.479122 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-fjwv8" Nov 22 02:57:23 crc kubenswrapper[4922]: I1122 02:57:23.483782 4922 scope.go:117] "RemoveContainer" containerID="c90d10e537cc2cd12c5df2446276db764c0f47653d5755dbc8cd030e22d29177" Nov 22 02:57:23 crc kubenswrapper[4922]: I1122 02:57:23.491478 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cm895"] Nov 22 02:57:23 crc kubenswrapper[4922]: I1122 02:57:23.497146 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cm895"] Nov 22 02:57:23 crc kubenswrapper[4922]: I1122 02:57:23.500631 4922 scope.go:117] "RemoveContainer" containerID="68b21b04d55a18115bdd10e43258ea93509e8dd727fd564cbc02c2e44181b985" Nov 22 02:57:23 crc kubenswrapper[4922]: I1122 02:57:23.509582 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t5m25"] Nov 22 02:57:23 crc kubenswrapper[4922]: I1122 02:57:23.513761 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-t5m25"] Nov 22 02:57:23 crc kubenswrapper[4922]: I1122 02:57:23.522962 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h8sq5"] Nov 22 02:57:23 crc kubenswrapper[4922]: I1122 02:57:23.524282 4922 scope.go:117] "RemoveContainer" 
containerID="0227aa3d9941ff7fed12eff7ecb69b695fbbf959faeb89867825b96241718203" Nov 22 02:57:23 crc kubenswrapper[4922]: I1122 02:57:23.534313 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-h8sq5"] Nov 22 02:57:23 crc kubenswrapper[4922]: I1122 02:57:23.541899 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-t48nh"] Nov 22 02:57:23 crc kubenswrapper[4922]: I1122 02:57:23.544962 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-t48nh"] Nov 22 02:57:23 crc kubenswrapper[4922]: I1122 02:57:23.552516 4922 scope.go:117] "RemoveContainer" containerID="740d53a0c2f2741103c5571d667f9a427af264181bcfe988cc46ad4e1531cb54" Nov 22 02:57:23 crc kubenswrapper[4922]: I1122 02:57:23.576608 4922 scope.go:117] "RemoveContainer" containerID="4343efaf8b9adfbf4ae9a84598f357867c86b637dbdb7eabb66afbcbbfc57ab7" Nov 22 02:57:23 crc kubenswrapper[4922]: I1122 02:57:23.608774 4922 scope.go:117] "RemoveContainer" containerID="66f64925599505b38d59a69a9c727a3433429335a4596cae4d15bb1ad0af0994" Nov 22 02:57:23 crc kubenswrapper[4922]: I1122 02:57:23.624790 4922 scope.go:117] "RemoveContainer" containerID="dc55d9561147430ee52666bab42be8e3638492f4904b58197387a51f52d1d698" Nov 22 02:57:23 crc kubenswrapper[4922]: I1122 02:57:23.643644 4922 scope.go:117] "RemoveContainer" containerID="66140b440ffe6cf4718bad228bf26dbc3dad92ec5b56c1a06b6349d953317f32" Nov 22 02:57:23 crc kubenswrapper[4922]: I1122 02:57:23.660815 4922 scope.go:117] "RemoveContainer" containerID="347cf45992cba1a0276f51e25d8edae115792e875f79f58a2a49dc82cd91fae7" Nov 22 02:57:23 crc kubenswrapper[4922]: I1122 02:57:23.674347 4922 scope.go:117] "RemoveContainer" containerID="ed691218f4a3e4c3427b3c4d6cc696a3a1b5f09311b9da52ee0e25009e0af5e6" Nov 22 02:57:23 crc kubenswrapper[4922]: I1122 02:57:23.687746 4922 scope.go:117] "RemoveContainer" containerID="0437c01f59fb1ca95f6f70e75a196ccc33af4fb651d9a7df01093d73e391a42c" Nov 22 02:57:24 crc kubenswrapper[4922]: I1122 02:57:24.315340 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-fjwv8" podStartSLOduration=2.315304537 podStartE2EDuration="2.315304537s" podCreationTimestamp="2025-11-22 02:57:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 02:57:23.557455614 +0000 UTC m=+279.595977506" watchObservedRunningTime="2025-11-22 02:57:24.315304537 +0000 UTC m=+280.353826469" Nov 22 02:57:24 crc kubenswrapper[4922]: I1122 02:57:24.320831 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pdqrh"] Nov 22 02:57:24 crc kubenswrapper[4922]: E1122 02:57:24.321395 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d590121-2d31-483c-9c86-14b40c9d23ad" containerName="extract-utilities" Nov 22 02:57:24 crc kubenswrapper[4922]: I1122 02:57:24.321441 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d590121-2d31-483c-9c86-14b40c9d23ad" containerName="extract-utilities" Nov 22 02:57:24 crc kubenswrapper[4922]: E1122 02:57:24.321477 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffc1fad9-16d3-4a1d-83b7-0ffa9796d506" containerName="registry-server" Nov 22 02:57:24 crc kubenswrapper[4922]: I1122 02:57:24.321505 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffc1fad9-16d3-4a1d-83b7-0ffa9796d506" 
containerName="registry-server" Nov 22 02:57:24 crc kubenswrapper[4922]: E1122 02:57:24.321536 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24fb7ac2-dd84-402b-9c03-b0c29174fa6c" containerName="registry-server" Nov 22 02:57:24 crc kubenswrapper[4922]: I1122 02:57:24.321553 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="24fb7ac2-dd84-402b-9c03-b0c29174fa6c" containerName="registry-server" Nov 22 02:57:24 crc kubenswrapper[4922]: E1122 02:57:24.321823 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="156920b9-f91f-4053-be05-3be9c55f09b1" containerName="marketplace-operator" Nov 22 02:57:24 crc kubenswrapper[4922]: I1122 02:57:24.321881 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="156920b9-f91f-4053-be05-3be9c55f09b1" containerName="marketplace-operator" Nov 22 02:57:24 crc kubenswrapper[4922]: E1122 02:57:24.321969 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffc1fad9-16d3-4a1d-83b7-0ffa9796d506" containerName="extract-utilities" Nov 22 02:57:24 crc kubenswrapper[4922]: I1122 02:57:24.321991 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffc1fad9-16d3-4a1d-83b7-0ffa9796d506" containerName="extract-utilities" Nov 22 02:57:24 crc kubenswrapper[4922]: E1122 02:57:24.322016 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24fb7ac2-dd84-402b-9c03-b0c29174fa6c" containerName="extract-content" Nov 22 02:57:24 crc kubenswrapper[4922]: I1122 02:57:24.322034 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="24fb7ac2-dd84-402b-9c03-b0c29174fa6c" containerName="extract-content" Nov 22 02:57:24 crc kubenswrapper[4922]: E1122 02:57:24.322054 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24fb7ac2-dd84-402b-9c03-b0c29174fa6c" containerName="extract-utilities" Nov 22 02:57:24 crc kubenswrapper[4922]: I1122 02:57:24.322068 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="24fb7ac2-dd84-402b-9c03-b0c29174fa6c" containerName="extract-utilities" Nov 22 02:57:24 crc kubenswrapper[4922]: E1122 02:57:24.322090 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81c367c0-8b10-4ce9-aa76-290449c7df39" containerName="extract-content" Nov 22 02:57:24 crc kubenswrapper[4922]: I1122 02:57:24.322103 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="81c367c0-8b10-4ce9-aa76-290449c7df39" containerName="extract-content" Nov 22 02:57:24 crc kubenswrapper[4922]: E1122 02:57:24.322118 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81c367c0-8b10-4ce9-aa76-290449c7df39" containerName="registry-server" Nov 22 02:57:24 crc kubenswrapper[4922]: I1122 02:57:24.322131 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="81c367c0-8b10-4ce9-aa76-290449c7df39" containerName="registry-server" Nov 22 02:57:24 crc kubenswrapper[4922]: E1122 02:57:24.322150 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d590121-2d31-483c-9c86-14b40c9d23ad" containerName="extract-content" Nov 22 02:57:24 crc kubenswrapper[4922]: I1122 02:57:24.322164 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d590121-2d31-483c-9c86-14b40c9d23ad" containerName="extract-content" Nov 22 02:57:24 crc kubenswrapper[4922]: E1122 02:57:24.322183 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81c367c0-8b10-4ce9-aa76-290449c7df39" containerName="extract-utilities" Nov 22 02:57:24 crc kubenswrapper[4922]: I1122 02:57:24.322224 4922 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="81c367c0-8b10-4ce9-aa76-290449c7df39" containerName="extract-utilities" Nov 22 02:57:24 crc kubenswrapper[4922]: E1122 02:57:24.322247 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffc1fad9-16d3-4a1d-83b7-0ffa9796d506" containerName="extract-content" Nov 22 02:57:24 crc kubenswrapper[4922]: I1122 02:57:24.322271 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffc1fad9-16d3-4a1d-83b7-0ffa9796d506" containerName="extract-content" Nov 22 02:57:24 crc kubenswrapper[4922]: E1122 02:57:24.322292 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d590121-2d31-483c-9c86-14b40c9d23ad" containerName="registry-server" Nov 22 02:57:24 crc kubenswrapper[4922]: I1122 02:57:24.322307 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d590121-2d31-483c-9c86-14b40c9d23ad" containerName="registry-server" Nov 22 02:57:24 crc kubenswrapper[4922]: I1122 02:57:24.322540 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="81c367c0-8b10-4ce9-aa76-290449c7df39" containerName="registry-server" Nov 22 02:57:24 crc kubenswrapper[4922]: I1122 02:57:24.322594 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="24fb7ac2-dd84-402b-9c03-b0c29174fa6c" containerName="registry-server" Nov 22 02:57:24 crc kubenswrapper[4922]: I1122 02:57:24.322612 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffc1fad9-16d3-4a1d-83b7-0ffa9796d506" containerName="registry-server" Nov 22 02:57:24 crc kubenswrapper[4922]: I1122 02:57:24.322635 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d590121-2d31-483c-9c86-14b40c9d23ad" containerName="registry-server" Nov 22 02:57:24 crc kubenswrapper[4922]: I1122 02:57:24.322655 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="156920b9-f91f-4053-be05-3be9c55f09b1" containerName="marketplace-operator" Nov 22 02:57:24 crc kubenswrapper[4922]: I1122 02:57:24.324291 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pdqrh" Nov 22 02:57:24 crc kubenswrapper[4922]: I1122 02:57:24.329229 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Nov 22 02:57:24 crc kubenswrapper[4922]: I1122 02:57:24.332281 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pdqrh"] Nov 22 02:57:24 crc kubenswrapper[4922]: I1122 02:57:24.470310 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z86vn\" (UniqueName: \"kubernetes.io/projected/5d6b66b3-7949-46a0-9242-2ce57ca56ecd-kube-api-access-z86vn\") pod \"redhat-marketplace-pdqrh\" (UID: \"5d6b66b3-7949-46a0-9242-2ce57ca56ecd\") " pod="openshift-marketplace/redhat-marketplace-pdqrh" Nov 22 02:57:24 crc kubenswrapper[4922]: I1122 02:57:24.470451 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d6b66b3-7949-46a0-9242-2ce57ca56ecd-catalog-content\") pod \"redhat-marketplace-pdqrh\" (UID: \"5d6b66b3-7949-46a0-9242-2ce57ca56ecd\") " pod="openshift-marketplace/redhat-marketplace-pdqrh" Nov 22 02:57:24 crc kubenswrapper[4922]: I1122 02:57:24.470563 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d6b66b3-7949-46a0-9242-2ce57ca56ecd-utilities\") pod \"redhat-marketplace-pdqrh\" (UID: \"5d6b66b3-7949-46a0-9242-2ce57ca56ecd\") " pod="openshift-marketplace/redhat-marketplace-pdqrh" Nov 22 02:57:24 crc kubenswrapper[4922]: I1122 02:57:24.516743 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rhmvs"] Nov 22 02:57:24 crc kubenswrapper[4922]: I1122 02:57:24.518341 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rhmvs" Nov 22 02:57:24 crc kubenswrapper[4922]: I1122 02:57:24.521676 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Nov 22 02:57:24 crc kubenswrapper[4922]: I1122 02:57:24.533744 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rhmvs"] Nov 22 02:57:24 crc kubenswrapper[4922]: I1122 02:57:24.573030 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z86vn\" (UniqueName: \"kubernetes.io/projected/5d6b66b3-7949-46a0-9242-2ce57ca56ecd-kube-api-access-z86vn\") pod \"redhat-marketplace-pdqrh\" (UID: \"5d6b66b3-7949-46a0-9242-2ce57ca56ecd\") " pod="openshift-marketplace/redhat-marketplace-pdqrh" Nov 22 02:57:24 crc kubenswrapper[4922]: I1122 02:57:24.573114 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d6b66b3-7949-46a0-9242-2ce57ca56ecd-catalog-content\") pod \"redhat-marketplace-pdqrh\" (UID: \"5d6b66b3-7949-46a0-9242-2ce57ca56ecd\") " pod="openshift-marketplace/redhat-marketplace-pdqrh" Nov 22 02:57:24 crc kubenswrapper[4922]: I1122 02:57:24.573622 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d6b66b3-7949-46a0-9242-2ce57ca56ecd-utilities\") pod \"redhat-marketplace-pdqrh\" (UID: \"5d6b66b3-7949-46a0-9242-2ce57ca56ecd\") " pod="openshift-marketplace/redhat-marketplace-pdqrh" Nov 22 02:57:24 crc kubenswrapper[4922]: I1122 02:57:24.574327 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d6b66b3-7949-46a0-9242-2ce57ca56ecd-utilities\") pod \"redhat-marketplace-pdqrh\" (UID: \"5d6b66b3-7949-46a0-9242-2ce57ca56ecd\") " pod="openshift-marketplace/redhat-marketplace-pdqrh" Nov 22 02:57:24 crc kubenswrapper[4922]: I1122 02:57:24.574412 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d6b66b3-7949-46a0-9242-2ce57ca56ecd-catalog-content\") pod \"redhat-marketplace-pdqrh\" (UID: \"5d6b66b3-7949-46a0-9242-2ce57ca56ecd\") " pod="openshift-marketplace/redhat-marketplace-pdqrh" Nov 22 02:57:24 crc kubenswrapper[4922]: I1122 02:57:24.601046 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z86vn\" (UniqueName: \"kubernetes.io/projected/5d6b66b3-7949-46a0-9242-2ce57ca56ecd-kube-api-access-z86vn\") pod \"redhat-marketplace-pdqrh\" (UID: \"5d6b66b3-7949-46a0-9242-2ce57ca56ecd\") " pod="openshift-marketplace/redhat-marketplace-pdqrh" Nov 22 02:57:24 crc kubenswrapper[4922]: I1122 02:57:24.646416 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pdqrh" Nov 22 02:57:24 crc kubenswrapper[4922]: I1122 02:57:24.674631 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01170b3c-6c7d-4aee-9016-518e2d155464-utilities\") pod \"redhat-operators-rhmvs\" (UID: \"01170b3c-6c7d-4aee-9016-518e2d155464\") " pod="openshift-marketplace/redhat-operators-rhmvs" Nov 22 02:57:24 crc kubenswrapper[4922]: I1122 02:57:24.674750 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm9gp\" (UniqueName: \"kubernetes.io/projected/01170b3c-6c7d-4aee-9016-518e2d155464-kube-api-access-xm9gp\") pod \"redhat-operators-rhmvs\" (UID: \"01170b3c-6c7d-4aee-9016-518e2d155464\") " pod="openshift-marketplace/redhat-operators-rhmvs" Nov 22 02:57:24 crc kubenswrapper[4922]: I1122 02:57:24.674825 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01170b3c-6c7d-4aee-9016-518e2d155464-catalog-content\") pod \"redhat-operators-rhmvs\" (UID: \"01170b3c-6c7d-4aee-9016-518e2d155464\") " pod="openshift-marketplace/redhat-operators-rhmvs" Nov 22 02:57:24 crc kubenswrapper[4922]: I1122 02:57:24.776043 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01170b3c-6c7d-4aee-9016-518e2d155464-catalog-content\") pod \"redhat-operators-rhmvs\" (UID: \"01170b3c-6c7d-4aee-9016-518e2d155464\") " pod="openshift-marketplace/redhat-operators-rhmvs" Nov 22 02:57:24 crc kubenswrapper[4922]: I1122 02:57:24.776640 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01170b3c-6c7d-4aee-9016-518e2d155464-utilities\") pod \"redhat-operators-rhmvs\" (UID: \"01170b3c-6c7d-4aee-9016-518e2d155464\") " pod="openshift-marketplace/redhat-operators-rhmvs" Nov 22 02:57:24 crc kubenswrapper[4922]: I1122 02:57:24.776692 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xm9gp\" (UniqueName: \"kubernetes.io/projected/01170b3c-6c7d-4aee-9016-518e2d155464-kube-api-access-xm9gp\") pod \"redhat-operators-rhmvs\" (UID: \"01170b3c-6c7d-4aee-9016-518e2d155464\") " pod="openshift-marketplace/redhat-operators-rhmvs" Nov 22 02:57:24 crc kubenswrapper[4922]: I1122 02:57:24.777308 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01170b3c-6c7d-4aee-9016-518e2d155464-utilities\") pod \"redhat-operators-rhmvs\" (UID: \"01170b3c-6c7d-4aee-9016-518e2d155464\") " pod="openshift-marketplace/redhat-operators-rhmvs" Nov 22 02:57:24 crc kubenswrapper[4922]: I1122 02:57:24.779615 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01170b3c-6c7d-4aee-9016-518e2d155464-catalog-content\") pod \"redhat-operators-rhmvs\" (UID: \"01170b3c-6c7d-4aee-9016-518e2d155464\") " pod="openshift-marketplace/redhat-operators-rhmvs" Nov 22 02:57:24 crc kubenswrapper[4922]: I1122 02:57:24.800433 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm9gp\" (UniqueName: \"kubernetes.io/projected/01170b3c-6c7d-4aee-9016-518e2d155464-kube-api-access-xm9gp\") pod \"redhat-operators-rhmvs\" (UID: 
\"01170b3c-6c7d-4aee-9016-518e2d155464\") " pod="openshift-marketplace/redhat-operators-rhmvs" Nov 22 02:57:24 crc kubenswrapper[4922]: I1122 02:57:24.843369 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pdqrh"] Nov 22 02:57:24 crc kubenswrapper[4922]: I1122 02:57:24.845881 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rhmvs" Nov 22 02:57:25 crc kubenswrapper[4922]: I1122 02:57:25.055585 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rhmvs"] Nov 22 02:57:25 crc kubenswrapper[4922]: W1122 02:57:25.063950 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01170b3c_6c7d_4aee_9016_518e2d155464.slice/crio-69864f600b953a016e3309e4a2590b63449e1395588982dd93a55dbee0ce7469 WatchSource:0}: Error finding container 69864f600b953a016e3309e4a2590b63449e1395588982dd93a55dbee0ce7469: Status 404 returned error can't find the container with id 69864f600b953a016e3309e4a2590b63449e1395588982dd93a55dbee0ce7469 Nov 22 02:57:25 crc kubenswrapper[4922]: I1122 02:57:25.309427 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="156920b9-f91f-4053-be05-3be9c55f09b1" path="/var/lib/kubelet/pods/156920b9-f91f-4053-be05-3be9c55f09b1/volumes" Nov 22 02:57:25 crc kubenswrapper[4922]: I1122 02:57:25.310221 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24fb7ac2-dd84-402b-9c03-b0c29174fa6c" path="/var/lib/kubelet/pods/24fb7ac2-dd84-402b-9c03-b0c29174fa6c/volumes" Nov 22 02:57:25 crc kubenswrapper[4922]: I1122 02:57:25.310857 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d590121-2d31-483c-9c86-14b40c9d23ad" path="/var/lib/kubelet/pods/6d590121-2d31-483c-9c86-14b40c9d23ad/volumes" Nov 22 02:57:25 crc kubenswrapper[4922]: I1122 02:57:25.311966 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81c367c0-8b10-4ce9-aa76-290449c7df39" path="/var/lib/kubelet/pods/81c367c0-8b10-4ce9-aa76-290449c7df39/volumes" Nov 22 02:57:25 crc kubenswrapper[4922]: I1122 02:57:25.312527 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffc1fad9-16d3-4a1d-83b7-0ffa9796d506" path="/var/lib/kubelet/pods/ffc1fad9-16d3-4a1d-83b7-0ffa9796d506/volumes" Nov 22 02:57:25 crc kubenswrapper[4922]: I1122 02:57:25.491098 4922 generic.go:334] "Generic (PLEG): container finished" podID="5d6b66b3-7949-46a0-9242-2ce57ca56ecd" containerID="7fafc93368dd79e6b31f17dc900958b5479e6422d03584128b8250f9716905aa" exitCode=0 Nov 22 02:57:25 crc kubenswrapper[4922]: I1122 02:57:25.491178 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pdqrh" event={"ID":"5d6b66b3-7949-46a0-9242-2ce57ca56ecd","Type":"ContainerDied","Data":"7fafc93368dd79e6b31f17dc900958b5479e6422d03584128b8250f9716905aa"} Nov 22 02:57:25 crc kubenswrapper[4922]: I1122 02:57:25.491214 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pdqrh" event={"ID":"5d6b66b3-7949-46a0-9242-2ce57ca56ecd","Type":"ContainerStarted","Data":"8302767971cd1ebbba6c397b4f9be2850e82de5580373c1355640fd0a117d671"} Nov 22 02:57:25 crc kubenswrapper[4922]: I1122 02:57:25.493929 4922 generic.go:334] "Generic (PLEG): container finished" podID="01170b3c-6c7d-4aee-9016-518e2d155464" 
containerID="3a91d43756bbb1b459226e077306f78a4bcbb1275e04819f8507d2c8f915c719" exitCode=0 Nov 22 02:57:25 crc kubenswrapper[4922]: I1122 02:57:25.494836 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rhmvs" event={"ID":"01170b3c-6c7d-4aee-9016-518e2d155464","Type":"ContainerDied","Data":"3a91d43756bbb1b459226e077306f78a4bcbb1275e04819f8507d2c8f915c719"} Nov 22 02:57:25 crc kubenswrapper[4922]: I1122 02:57:25.494895 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rhmvs" event={"ID":"01170b3c-6c7d-4aee-9016-518e2d155464","Type":"ContainerStarted","Data":"69864f600b953a016e3309e4a2590b63449e1395588982dd93a55dbee0ce7469"} Nov 22 02:57:26 crc kubenswrapper[4922]: I1122 02:57:26.717586 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-b5xxh"] Nov 22 02:57:26 crc kubenswrapper[4922]: I1122 02:57:26.719260 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b5xxh" Nov 22 02:57:26 crc kubenswrapper[4922]: I1122 02:57:26.722658 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Nov 22 02:57:26 crc kubenswrapper[4922]: I1122 02:57:26.730218 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b5xxh"] Nov 22 02:57:26 crc kubenswrapper[4922]: I1122 02:57:26.906428 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b1933fa-b2c4-4fbd-855a-d434ffdfd8fe-catalog-content\") pod \"certified-operators-b5xxh\" (UID: \"4b1933fa-b2c4-4fbd-855a-d434ffdfd8fe\") " pod="openshift-marketplace/certified-operators-b5xxh" Nov 22 02:57:26 crc kubenswrapper[4922]: I1122 02:57:26.906475 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b1933fa-b2c4-4fbd-855a-d434ffdfd8fe-utilities\") pod \"certified-operators-b5xxh\" (UID: \"4b1933fa-b2c4-4fbd-855a-d434ffdfd8fe\") " pod="openshift-marketplace/certified-operators-b5xxh" Nov 22 02:57:26 crc kubenswrapper[4922]: I1122 02:57:26.906547 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wttrc\" (UniqueName: \"kubernetes.io/projected/4b1933fa-b2c4-4fbd-855a-d434ffdfd8fe-kube-api-access-wttrc\") pod \"certified-operators-b5xxh\" (UID: \"4b1933fa-b2c4-4fbd-855a-d434ffdfd8fe\") " pod="openshift-marketplace/certified-operators-b5xxh" Nov 22 02:57:26 crc kubenswrapper[4922]: I1122 02:57:26.915143 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-llzn6"] Nov 22 02:57:26 crc kubenswrapper[4922]: I1122 02:57:26.916354 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-llzn6" Nov 22 02:57:26 crc kubenswrapper[4922]: I1122 02:57:26.920523 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Nov 22 02:57:26 crc kubenswrapper[4922]: I1122 02:57:26.921221 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-llzn6"] Nov 22 02:57:27 crc kubenswrapper[4922]: I1122 02:57:27.007362 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b1933fa-b2c4-4fbd-855a-d434ffdfd8fe-catalog-content\") pod \"certified-operators-b5xxh\" (UID: \"4b1933fa-b2c4-4fbd-855a-d434ffdfd8fe\") " pod="openshift-marketplace/certified-operators-b5xxh" Nov 22 02:57:27 crc kubenswrapper[4922]: I1122 02:57:27.007823 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b1933fa-b2c4-4fbd-855a-d434ffdfd8fe-catalog-content\") pod \"certified-operators-b5xxh\" (UID: \"4b1933fa-b2c4-4fbd-855a-d434ffdfd8fe\") " pod="openshift-marketplace/certified-operators-b5xxh" Nov 22 02:57:27 crc kubenswrapper[4922]: I1122 02:57:27.007899 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8263434-57ca-4230-850a-ae927db99cb8-catalog-content\") pod \"community-operators-llzn6\" (UID: \"b8263434-57ca-4230-850a-ae927db99cb8\") " pod="openshift-marketplace/community-operators-llzn6" Nov 22 02:57:27 crc kubenswrapper[4922]: I1122 02:57:27.007931 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b1933fa-b2c4-4fbd-855a-d434ffdfd8fe-utilities\") pod \"certified-operators-b5xxh\" (UID: \"4b1933fa-b2c4-4fbd-855a-d434ffdfd8fe\") " pod="openshift-marketplace/certified-operators-b5xxh" Nov 22 02:57:27 crc kubenswrapper[4922]: I1122 02:57:27.007961 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8263434-57ca-4230-850a-ae927db99cb8-utilities\") pod \"community-operators-llzn6\" (UID: \"b8263434-57ca-4230-850a-ae927db99cb8\") " pod="openshift-marketplace/community-operators-llzn6" Nov 22 02:57:27 crc kubenswrapper[4922]: I1122 02:57:27.007998 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wttrc\" (UniqueName: \"kubernetes.io/projected/4b1933fa-b2c4-4fbd-855a-d434ffdfd8fe-kube-api-access-wttrc\") pod \"certified-operators-b5xxh\" (UID: \"4b1933fa-b2c4-4fbd-855a-d434ffdfd8fe\") " pod="openshift-marketplace/certified-operators-b5xxh" Nov 22 02:57:27 crc kubenswrapper[4922]: I1122 02:57:27.008272 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b1933fa-b2c4-4fbd-855a-d434ffdfd8fe-utilities\") pod \"certified-operators-b5xxh\" (UID: \"4b1933fa-b2c4-4fbd-855a-d434ffdfd8fe\") " pod="openshift-marketplace/certified-operators-b5xxh" Nov 22 02:57:27 crc kubenswrapper[4922]: I1122 02:57:27.008365 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqxdp\" (UniqueName: \"kubernetes.io/projected/b8263434-57ca-4230-850a-ae927db99cb8-kube-api-access-hqxdp\") pod \"community-operators-llzn6\" (UID: 
\"b8263434-57ca-4230-850a-ae927db99cb8\") " pod="openshift-marketplace/community-operators-llzn6" Nov 22 02:57:27 crc kubenswrapper[4922]: I1122 02:57:27.030646 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wttrc\" (UniqueName: \"kubernetes.io/projected/4b1933fa-b2c4-4fbd-855a-d434ffdfd8fe-kube-api-access-wttrc\") pod \"certified-operators-b5xxh\" (UID: \"4b1933fa-b2c4-4fbd-855a-d434ffdfd8fe\") " pod="openshift-marketplace/certified-operators-b5xxh" Nov 22 02:57:27 crc kubenswrapper[4922]: I1122 02:57:27.037375 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b5xxh" Nov 22 02:57:27 crc kubenswrapper[4922]: I1122 02:57:27.110007 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqxdp\" (UniqueName: \"kubernetes.io/projected/b8263434-57ca-4230-850a-ae927db99cb8-kube-api-access-hqxdp\") pod \"community-operators-llzn6\" (UID: \"b8263434-57ca-4230-850a-ae927db99cb8\") " pod="openshift-marketplace/community-operators-llzn6" Nov 22 02:57:27 crc kubenswrapper[4922]: I1122 02:57:27.110080 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8263434-57ca-4230-850a-ae927db99cb8-catalog-content\") pod \"community-operators-llzn6\" (UID: \"b8263434-57ca-4230-850a-ae927db99cb8\") " pod="openshift-marketplace/community-operators-llzn6" Nov 22 02:57:27 crc kubenswrapper[4922]: I1122 02:57:27.110114 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8263434-57ca-4230-850a-ae927db99cb8-utilities\") pod \"community-operators-llzn6\" (UID: \"b8263434-57ca-4230-850a-ae927db99cb8\") " pod="openshift-marketplace/community-operators-llzn6" Nov 22 02:57:27 crc kubenswrapper[4922]: I1122 02:57:27.110501 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8263434-57ca-4230-850a-ae927db99cb8-utilities\") pod \"community-operators-llzn6\" (UID: \"b8263434-57ca-4230-850a-ae927db99cb8\") " pod="openshift-marketplace/community-operators-llzn6" Nov 22 02:57:27 crc kubenswrapper[4922]: I1122 02:57:27.110984 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8263434-57ca-4230-850a-ae927db99cb8-catalog-content\") pod \"community-operators-llzn6\" (UID: \"b8263434-57ca-4230-850a-ae927db99cb8\") " pod="openshift-marketplace/community-operators-llzn6" Nov 22 02:57:27 crc kubenswrapper[4922]: I1122 02:57:27.129542 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqxdp\" (UniqueName: \"kubernetes.io/projected/b8263434-57ca-4230-850a-ae927db99cb8-kube-api-access-hqxdp\") pod \"community-operators-llzn6\" (UID: \"b8263434-57ca-4230-850a-ae927db99cb8\") " pod="openshift-marketplace/community-operators-llzn6" Nov 22 02:57:27 crc kubenswrapper[4922]: I1122 02:57:27.250449 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-llzn6" Nov 22 02:57:27 crc kubenswrapper[4922]: I1122 02:57:27.452939 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-llzn6"] Nov 22 02:57:27 crc kubenswrapper[4922]: I1122 02:57:27.508990 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b5xxh"] Nov 22 02:57:27 crc kubenswrapper[4922]: I1122 02:57:27.523378 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rhmvs" event={"ID":"01170b3c-6c7d-4aee-9016-518e2d155464","Type":"ContainerStarted","Data":"cd381edc900b93624770089a8a58d44f72b467e7010f70f98bf5488079583df0"} Nov 22 02:57:27 crc kubenswrapper[4922]: I1122 02:57:27.526868 4922 generic.go:334] "Generic (PLEG): container finished" podID="5d6b66b3-7949-46a0-9242-2ce57ca56ecd" containerID="d0dc14284ac46d7ddf543e1c8a6fa41e5e41ae978decb3d7ed2eb191deccb07b" exitCode=0 Nov 22 02:57:27 crc kubenswrapper[4922]: I1122 02:57:27.526927 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pdqrh" event={"ID":"5d6b66b3-7949-46a0-9242-2ce57ca56ecd","Type":"ContainerDied","Data":"d0dc14284ac46d7ddf543e1c8a6fa41e5e41ae978decb3d7ed2eb191deccb07b"} Nov 22 02:57:27 crc kubenswrapper[4922]: I1122 02:57:27.529025 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-llzn6" event={"ID":"b8263434-57ca-4230-850a-ae927db99cb8","Type":"ContainerStarted","Data":"16ad86300403fb9734268c359d5e97e1ec9e45b7464a7e6b91a429902f241b93"} Nov 22 02:57:28 crc kubenswrapper[4922]: I1122 02:57:28.541099 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pdqrh" event={"ID":"5d6b66b3-7949-46a0-9242-2ce57ca56ecd","Type":"ContainerStarted","Data":"41769a0bbef469d3222b6ec3f8b88fb71f653ac9532e701c4c7e332d704cec9a"} Nov 22 02:57:28 crc kubenswrapper[4922]: I1122 02:57:28.543612 4922 generic.go:334] "Generic (PLEG): container finished" podID="4b1933fa-b2c4-4fbd-855a-d434ffdfd8fe" containerID="79a786557ab6fb0591a06f5e9427addc235c47ba0ee7e5df070ae1cfd950df50" exitCode=0 Nov 22 02:57:28 crc kubenswrapper[4922]: I1122 02:57:28.543749 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b5xxh" event={"ID":"4b1933fa-b2c4-4fbd-855a-d434ffdfd8fe","Type":"ContainerDied","Data":"79a786557ab6fb0591a06f5e9427addc235c47ba0ee7e5df070ae1cfd950df50"} Nov 22 02:57:28 crc kubenswrapper[4922]: I1122 02:57:28.543838 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b5xxh" event={"ID":"4b1933fa-b2c4-4fbd-855a-d434ffdfd8fe","Type":"ContainerStarted","Data":"7617bede455b59860f28c001c81fd73b840bfde8f4601d85bcefed51fe7ece5e"} Nov 22 02:57:28 crc kubenswrapper[4922]: I1122 02:57:28.546002 4922 generic.go:334] "Generic (PLEG): container finished" podID="b8263434-57ca-4230-850a-ae927db99cb8" containerID="edbfda14ee0162b8017725ba3ee49538f04d88f77b61c1865bf544744e11099d" exitCode=0 Nov 22 02:57:28 crc kubenswrapper[4922]: I1122 02:57:28.546047 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-llzn6" event={"ID":"b8263434-57ca-4230-850a-ae927db99cb8","Type":"ContainerDied","Data":"edbfda14ee0162b8017725ba3ee49538f04d88f77b61c1865bf544744e11099d"} Nov 22 02:57:28 crc kubenswrapper[4922]: I1122 02:57:28.549756 4922 generic.go:334] "Generic (PLEG): 
container finished" podID="01170b3c-6c7d-4aee-9016-518e2d155464" containerID="cd381edc900b93624770089a8a58d44f72b467e7010f70f98bf5488079583df0" exitCode=0 Nov 22 02:57:28 crc kubenswrapper[4922]: I1122 02:57:28.549802 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rhmvs" event={"ID":"01170b3c-6c7d-4aee-9016-518e2d155464","Type":"ContainerDied","Data":"cd381edc900b93624770089a8a58d44f72b467e7010f70f98bf5488079583df0"} Nov 22 02:57:28 crc kubenswrapper[4922]: I1122 02:57:28.563723 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pdqrh" podStartSLOduration=2.049546155 podStartE2EDuration="4.563691515s" podCreationTimestamp="2025-11-22 02:57:24 +0000 UTC" firstStartedPulling="2025-11-22 02:57:25.493533892 +0000 UTC m=+281.532055784" lastFinishedPulling="2025-11-22 02:57:28.007679252 +0000 UTC m=+284.046201144" observedRunningTime="2025-11-22 02:57:28.563125432 +0000 UTC m=+284.601647324" watchObservedRunningTime="2025-11-22 02:57:28.563691515 +0000 UTC m=+284.602213417" Nov 22 02:57:29 crc kubenswrapper[4922]: I1122 02:57:29.557143 4922 generic.go:334] "Generic (PLEG): container finished" podID="b8263434-57ca-4230-850a-ae927db99cb8" containerID="a191f4d392e0cbcb2b368dc8cd3788c136b032229bec79b5d240768c74268480" exitCode=0 Nov 22 02:57:29 crc kubenswrapper[4922]: I1122 02:57:29.557498 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-llzn6" event={"ID":"b8263434-57ca-4230-850a-ae927db99cb8","Type":"ContainerDied","Data":"a191f4d392e0cbcb2b368dc8cd3788c136b032229bec79b5d240768c74268480"} Nov 22 02:57:29 crc kubenswrapper[4922]: I1122 02:57:29.563196 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rhmvs" event={"ID":"01170b3c-6c7d-4aee-9016-518e2d155464","Type":"ContainerStarted","Data":"be2d4bc102f46da8ca2576117a7f469ad202a00e5aa2b3ded3114254ee778bff"} Nov 22 02:57:29 crc kubenswrapper[4922]: I1122 02:57:29.600738 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rhmvs" podStartSLOduration=2.102182004 podStartE2EDuration="5.600709867s" podCreationTimestamp="2025-11-22 02:57:24 +0000 UTC" firstStartedPulling="2025-11-22 02:57:25.496228416 +0000 UTC m=+281.534750308" lastFinishedPulling="2025-11-22 02:57:28.994756279 +0000 UTC m=+285.033278171" observedRunningTime="2025-11-22 02:57:29.59063189 +0000 UTC m=+285.629153792" watchObservedRunningTime="2025-11-22 02:57:29.600709867 +0000 UTC m=+285.639231759" Nov 22 02:57:30 crc kubenswrapper[4922]: I1122 02:57:30.574759 4922 generic.go:334] "Generic (PLEG): container finished" podID="4b1933fa-b2c4-4fbd-855a-d434ffdfd8fe" containerID="3bafa2c5745b8c197ed9530ed6cd117e982d5752873777324092d0356767e25a" exitCode=0 Nov 22 02:57:30 crc kubenswrapper[4922]: I1122 02:57:30.574863 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b5xxh" event={"ID":"4b1933fa-b2c4-4fbd-855a-d434ffdfd8fe","Type":"ContainerDied","Data":"3bafa2c5745b8c197ed9530ed6cd117e982d5752873777324092d0356767e25a"} Nov 22 02:57:30 crc kubenswrapper[4922]: I1122 02:57:30.577445 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-llzn6" event={"ID":"b8263434-57ca-4230-850a-ae927db99cb8","Type":"ContainerStarted","Data":"ef9e44ae128588d8a012d24dce2ffc2abca40bfe903f97748ace63829dc997d9"} Nov 22 02:57:30 crc 
kubenswrapper[4922]: I1122 02:57:30.640236 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-llzn6" podStartSLOduration=3.247176929 podStartE2EDuration="4.640199768s" podCreationTimestamp="2025-11-22 02:57:26 +0000 UTC" firstStartedPulling="2025-11-22 02:57:28.547230989 +0000 UTC m=+284.585752871" lastFinishedPulling="2025-11-22 02:57:29.940253818 +0000 UTC m=+285.978775710" observedRunningTime="2025-11-22 02:57:30.637632367 +0000 UTC m=+286.676154299" watchObservedRunningTime="2025-11-22 02:57:30.640199768 +0000 UTC m=+286.678721690" Nov 22 02:57:32 crc kubenswrapper[4922]: I1122 02:57:32.592180 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b5xxh" event={"ID":"4b1933fa-b2c4-4fbd-855a-d434ffdfd8fe","Type":"ContainerStarted","Data":"c89124247899cdada98b275f217ec1d7c3a6b3a8d3d7f2c2ffe778f6a8b0d097"} Nov 22 02:57:32 crc kubenswrapper[4922]: I1122 02:57:32.620817 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-b5xxh" podStartSLOduration=3.837173362 podStartE2EDuration="6.620794312s" podCreationTimestamp="2025-11-22 02:57:26 +0000 UTC" firstStartedPulling="2025-11-22 02:57:28.546310047 +0000 UTC m=+284.584831949" lastFinishedPulling="2025-11-22 02:57:31.329931007 +0000 UTC m=+287.368452899" observedRunningTime="2025-11-22 02:57:32.618455317 +0000 UTC m=+288.656977209" watchObservedRunningTime="2025-11-22 02:57:32.620794312 +0000 UTC m=+288.659316204" Nov 22 02:57:34 crc kubenswrapper[4922]: I1122 02:57:34.646803 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pdqrh" Nov 22 02:57:34 crc kubenswrapper[4922]: I1122 02:57:34.647028 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pdqrh" Nov 22 02:57:34 crc kubenswrapper[4922]: I1122 02:57:34.712936 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pdqrh" Nov 22 02:57:34 crc kubenswrapper[4922]: I1122 02:57:34.846160 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rhmvs" Nov 22 02:57:34 crc kubenswrapper[4922]: I1122 02:57:34.847026 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rhmvs" Nov 22 02:57:35 crc kubenswrapper[4922]: I1122 02:57:35.177008 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pdqrh" Nov 22 02:57:35 crc kubenswrapper[4922]: I1122 02:57:35.883481 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rhmvs" podUID="01170b3c-6c7d-4aee-9016-518e2d155464" containerName="registry-server" probeResult="failure" output=< Nov 22 02:57:35 crc kubenswrapper[4922]: timeout: failed to connect service ":50051" within 1s Nov 22 02:57:35 crc kubenswrapper[4922]: > Nov 22 02:57:37 crc kubenswrapper[4922]: I1122 02:57:37.038234 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-b5xxh" Nov 22 02:57:37 crc kubenswrapper[4922]: I1122 02:57:37.038287 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-b5xxh" Nov 22 02:57:37 crc kubenswrapper[4922]: I1122 02:57:37.101072 
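
Annotation: in the pod_startup_latency_tracker entries above, podStartSLOduration is podStartE2EDuration minus the image-pull window (lastFinishedPulling − firstStartedPulling); the SLO metric excludes pull time. Re-running that arithmetic on the community-operators-llzn6 timestamps logged above (sketch; the kubelet prints nanoseconds, which Python's datetime truncates to microseconds):

    from datetime import datetime

    def parse(ts):
        # "2025-11-22 02:57:28.547230989 +0000 UTC" -> aware datetime
        date, clock, off = ts.replace(" UTC", "").split()
        head, _, frac = clock.partition(".")
        clock = f"{head}.{(frac or '0')[:6].ljust(6, '0')}"  # ns -> us
        return datetime.strptime(f"{date} {clock} {off}", "%Y-%m-%d %H:%M:%S.%f %z")

    created    = parse("2025-11-22 02:57:26 +0000 UTC")
    first_pull = parse("2025-11-22 02:57:28.547230989 +0000 UTC")
    last_pull  = parse("2025-11-22 02:57:29.940253818 +0000 UTC")
    running    = parse("2025-11-22 02:57:30.640199768 +0000 UTC")

    e2e = (running - created).total_seconds()
    slo = e2e - (last_pull - first_pull).total_seconds()
    print(f"E2E={e2e:.6f}s SLO={slo:.6f}s")
    # E2E=4.640199s SLO=3.247176s -- the logged 4.640199768s / 3.247176929,
    # up to the truncated nanoseconds.
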
4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-b5xxh" Nov 22 02:57:37 crc kubenswrapper[4922]: I1122 02:57:37.186013 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-b5xxh" Nov 22 02:57:37 crc kubenswrapper[4922]: I1122 02:57:37.251517 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-llzn6" Nov 22 02:57:37 crc kubenswrapper[4922]: I1122 02:57:37.251692 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-llzn6" Nov 22 02:57:37 crc kubenswrapper[4922]: I1122 02:57:37.312235 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-llzn6" Nov 22 02:57:38 crc kubenswrapper[4922]: I1122 02:57:38.201327 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-llzn6" Nov 22 02:57:44 crc kubenswrapper[4922]: I1122 02:57:44.915960 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rhmvs" Nov 22 02:57:44 crc kubenswrapper[4922]: I1122 02:57:44.991540 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rhmvs" Nov 22 02:58:11 crc kubenswrapper[4922]: I1122 02:58:11.109472 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 02:58:11 crc kubenswrapper[4922]: I1122 02:58:11.110379 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 02:58:41 crc kubenswrapper[4922]: I1122 02:58:41.110058 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 02:58:41 crc kubenswrapper[4922]: I1122 02:58:41.110959 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 02:59:11 crc kubenswrapper[4922]: I1122 02:59:11.110587 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 02:59:11 crc kubenswrapper[4922]: I1122 02:59:11.112986 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" 
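
Annotation: the startup-probe failure at 02:57:35 above (timeout: failed to connect service ":50051" within 1s) comes from the registry-server health check, which reports "started" by 02:57:44 once the catalog has loaded. A rough stand-in for its failure mode, assuming only that the check must reach :50051 inside the same 1 s budget (the real probe speaks the gRPC health protocol, not bare TCP):

    import socket

    # Succeed iff a connection to :50051 opens within the 1 s budget the log
    # shows; this reproduces only the connect-level failure seen at 02:57:35.
    def port_open(host="127.0.0.1", port=50051, timeout=1.0):
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    print("started" if port_open() else "unhealthy")
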
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 02:59:11 crc kubenswrapper[4922]: I1122 02:59:11.113089 4922 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" Nov 22 02:59:11 crc kubenswrapper[4922]: I1122 02:59:11.114188 4922 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5b8b749cd737c0f52e1c945ff82e807bb12860b0ce616473b37087ca3334ae08"} pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 02:59:11 crc kubenswrapper[4922]: I1122 02:59:11.114334 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" containerID="cri-o://5b8b749cd737c0f52e1c945ff82e807bb12860b0ce616473b37087ca3334ae08" gracePeriod=600 Nov 22 02:59:11 crc kubenswrapper[4922]: I1122 02:59:11.885683 4922 generic.go:334] "Generic (PLEG): container finished" podID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerID="5b8b749cd737c0f52e1c945ff82e807bb12860b0ce616473b37087ca3334ae08" exitCode=0 Nov 22 02:59:11 crc kubenswrapper[4922]: I1122 02:59:11.885805 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" event={"ID":"402683b1-a29f-4a79-a36c-daf6e8068d0d","Type":"ContainerDied","Data":"5b8b749cd737c0f52e1c945ff82e807bb12860b0ce616473b37087ca3334ae08"} Nov 22 02:59:11 crc kubenswrapper[4922]: I1122 02:59:11.886749 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" event={"ID":"402683b1-a29f-4a79-a36c-daf6e8068d0d","Type":"ContainerStarted","Data":"04bb324b7ce09f87535d27cd3f61d191662bc844fa3e27e67f963dd00d9c92fb"} Nov 22 02:59:11 crc kubenswrapper[4922]: I1122 02:59:11.886802 4922 scope.go:117] "RemoveContainer" containerID="d254b4b131f36e4aae4019c29e8c53f3f8df991b85eb6f943fb6d2e9d79552f6" Nov 22 03:00:00 crc kubenswrapper[4922]: I1122 03:00:00.166787 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396340-k9lhq"] Nov 22 03:00:00 crc kubenswrapper[4922]: I1122 03:00:00.171560 4922 util.go:30] "No sandbox for pod can be found. 
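
Annotation: the restart decision above is threshold-driven. The liveness failures at 02:58:11 and 02:58:41 only log "Probe failed"; the third straight failure at 02:59:11 also flips the probe result to unhealthy and kills the container with gracePeriod=600. That is consistent with a 30 s probe period and failureThreshold=3, which are inferences from the log, not values read from the pod spec. A sketch of the decision logic under that assumption:

    # Assumed from the log: probes every 30 s, unhealthy after 3 straight failures.
    def liveness(results, failure_threshold=3):
        streak = 0
        for ok in results:
            streak = 0 if ok else streak + 1
            if streak >= failure_threshold:
                yield "kill and restart"  # kubelet then kills with the pod's grace period
                streak = 0
            else:
                yield "no action"

    print(list(liveness([False, False, False])))
    # -> ['no action', 'no action', 'kill and restart'], matching 02:58:11,
    #    02:58:41 (logged only) and 02:59:11 (restart) above.
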
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396340-k9lhq" Nov 22 03:00:00 crc kubenswrapper[4922]: I1122 03:00:00.175623 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 22 03:00:00 crc kubenswrapper[4922]: I1122 03:00:00.176283 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 22 03:00:00 crc kubenswrapper[4922]: I1122 03:00:00.193686 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396340-k9lhq"] Nov 22 03:00:00 crc kubenswrapper[4922]: I1122 03:00:00.212337 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c756ee80-4d4a-4877-8849-7f5cc170ecf9-secret-volume\") pod \"collect-profiles-29396340-k9lhq\" (UID: \"c756ee80-4d4a-4877-8849-7f5cc170ecf9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396340-k9lhq" Nov 22 03:00:00 crc kubenswrapper[4922]: I1122 03:00:00.212738 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c756ee80-4d4a-4877-8849-7f5cc170ecf9-config-volume\") pod \"collect-profiles-29396340-k9lhq\" (UID: \"c756ee80-4d4a-4877-8849-7f5cc170ecf9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396340-k9lhq" Nov 22 03:00:00 crc kubenswrapper[4922]: I1122 03:00:00.213144 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw29d\" (UniqueName: \"kubernetes.io/projected/c756ee80-4d4a-4877-8849-7f5cc170ecf9-kube-api-access-pw29d\") pod \"collect-profiles-29396340-k9lhq\" (UID: \"c756ee80-4d4a-4877-8849-7f5cc170ecf9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396340-k9lhq" Nov 22 03:00:00 crc kubenswrapper[4922]: I1122 03:00:00.314482 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pw29d\" (UniqueName: \"kubernetes.io/projected/c756ee80-4d4a-4877-8849-7f5cc170ecf9-kube-api-access-pw29d\") pod \"collect-profiles-29396340-k9lhq\" (UID: \"c756ee80-4d4a-4877-8849-7f5cc170ecf9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396340-k9lhq" Nov 22 03:00:00 crc kubenswrapper[4922]: I1122 03:00:00.314573 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c756ee80-4d4a-4877-8849-7f5cc170ecf9-secret-volume\") pod \"collect-profiles-29396340-k9lhq\" (UID: \"c756ee80-4d4a-4877-8849-7f5cc170ecf9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396340-k9lhq" Nov 22 03:00:00 crc kubenswrapper[4922]: I1122 03:00:00.314623 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c756ee80-4d4a-4877-8849-7f5cc170ecf9-config-volume\") pod \"collect-profiles-29396340-k9lhq\" (UID: \"c756ee80-4d4a-4877-8849-7f5cc170ecf9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396340-k9lhq" Nov 22 03:00:00 crc kubenswrapper[4922]: I1122 03:00:00.316793 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c756ee80-4d4a-4877-8849-7f5cc170ecf9-config-volume\") pod 
\"collect-profiles-29396340-k9lhq\" (UID: \"c756ee80-4d4a-4877-8849-7f5cc170ecf9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396340-k9lhq" Nov 22 03:00:00 crc kubenswrapper[4922]: I1122 03:00:00.329215 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c756ee80-4d4a-4877-8849-7f5cc170ecf9-secret-volume\") pod \"collect-profiles-29396340-k9lhq\" (UID: \"c756ee80-4d4a-4877-8849-7f5cc170ecf9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396340-k9lhq" Nov 22 03:00:00 crc kubenswrapper[4922]: I1122 03:00:00.347188 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw29d\" (UniqueName: \"kubernetes.io/projected/c756ee80-4d4a-4877-8849-7f5cc170ecf9-kube-api-access-pw29d\") pod \"collect-profiles-29396340-k9lhq\" (UID: \"c756ee80-4d4a-4877-8849-7f5cc170ecf9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396340-k9lhq" Nov 22 03:00:00 crc kubenswrapper[4922]: I1122 03:00:00.503436 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396340-k9lhq" Nov 22 03:00:00 crc kubenswrapper[4922]: I1122 03:00:00.808199 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396340-k9lhq"] Nov 22 03:00:01 crc kubenswrapper[4922]: I1122 03:00:01.257291 4922 generic.go:334] "Generic (PLEG): container finished" podID="c756ee80-4d4a-4877-8849-7f5cc170ecf9" containerID="0f35149bf9822ee3e83f466c6f558c730a8bf202250ee0aa16a8401eccac4f43" exitCode=0 Nov 22 03:00:01 crc kubenswrapper[4922]: I1122 03:00:01.257583 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396340-k9lhq" event={"ID":"c756ee80-4d4a-4877-8849-7f5cc170ecf9","Type":"ContainerDied","Data":"0f35149bf9822ee3e83f466c6f558c730a8bf202250ee0aa16a8401eccac4f43"} Nov 22 03:00:01 crc kubenswrapper[4922]: I1122 03:00:01.257834 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396340-k9lhq" event={"ID":"c756ee80-4d4a-4877-8849-7f5cc170ecf9","Type":"ContainerStarted","Data":"8a0b78204f2197449672bc968b4b9bccbd704333b2866c18411a0b50656b2f75"} Nov 22 03:00:02 crc kubenswrapper[4922]: I1122 03:00:02.629348 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396340-k9lhq" Nov 22 03:00:02 crc kubenswrapper[4922]: I1122 03:00:02.762647 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c756ee80-4d4a-4877-8849-7f5cc170ecf9-config-volume\") pod \"c756ee80-4d4a-4877-8849-7f5cc170ecf9\" (UID: \"c756ee80-4d4a-4877-8849-7f5cc170ecf9\") " Nov 22 03:00:02 crc kubenswrapper[4922]: I1122 03:00:02.762911 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pw29d\" (UniqueName: \"kubernetes.io/projected/c756ee80-4d4a-4877-8849-7f5cc170ecf9-kube-api-access-pw29d\") pod \"c756ee80-4d4a-4877-8849-7f5cc170ecf9\" (UID: \"c756ee80-4d4a-4877-8849-7f5cc170ecf9\") " Nov 22 03:00:02 crc kubenswrapper[4922]: I1122 03:00:02.762991 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c756ee80-4d4a-4877-8849-7f5cc170ecf9-secret-volume\") pod \"c756ee80-4d4a-4877-8849-7f5cc170ecf9\" (UID: \"c756ee80-4d4a-4877-8849-7f5cc170ecf9\") " Nov 22 03:00:02 crc kubenswrapper[4922]: I1122 03:00:02.764465 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c756ee80-4d4a-4877-8849-7f5cc170ecf9-config-volume" (OuterVolumeSpecName: "config-volume") pod "c756ee80-4d4a-4877-8849-7f5cc170ecf9" (UID: "c756ee80-4d4a-4877-8849-7f5cc170ecf9"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:00:02 crc kubenswrapper[4922]: I1122 03:00:02.786940 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c756ee80-4d4a-4877-8849-7f5cc170ecf9-kube-api-access-pw29d" (OuterVolumeSpecName: "kube-api-access-pw29d") pod "c756ee80-4d4a-4877-8849-7f5cc170ecf9" (UID: "c756ee80-4d4a-4877-8849-7f5cc170ecf9"). InnerVolumeSpecName "kube-api-access-pw29d". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:00:02 crc kubenswrapper[4922]: I1122 03:00:02.786963 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c756ee80-4d4a-4877-8849-7f5cc170ecf9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c756ee80-4d4a-4877-8849-7f5cc170ecf9" (UID: "c756ee80-4d4a-4877-8849-7f5cc170ecf9"). InnerVolumeSpecName "secret-volume". 
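
Annotation: the reconciler entries above walk each collect-profiles volume through the same mount phases (VerifyControllerAttachedVolume, then MountVolume started, then MountVolume.SetUp succeeded) and, once the job finishes, the mirror-image unmount phases. A small extractor for those transitions (sketch; it assumes one journal entry per line, and the markers are copied verbatim from the entries above):

    import re

    # Reconciler phase markers, copied from the entries above, mapped to
    # short labels; VOL pulls the volume name that follows each marker.
    PHASES = {
        "operationExecutor.VerifyControllerAttachedVolume started": "attach-verified",
        "operationExecutor.MountVolume started": "mount-started",
        "MountVolume.SetUp succeeded": "mounted",
        "operationExecutor.UnmountVolume started": "unmount-started",
        "UnmountVolume.TearDown succeeded": "torn-down",
    }
    VOL = re.compile(r'volume \\?"(?P<name>[^"\\]+)\\?"')  # tolerates escaped quotes

    def volume_phases(lines):
        for line in lines:
            for marker, phase in PHASES.items():
                if marker in line:
                    m = VOL.search(line)
                    if m:
                        yield m.group("name"), phase

    sample = r'reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" ..."'
    print(list(volume_phases([sample])))  # [('secret-volume', 'mount-started')]
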
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:00:02 crc kubenswrapper[4922]: I1122 03:00:02.865003 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pw29d\" (UniqueName: \"kubernetes.io/projected/c756ee80-4d4a-4877-8849-7f5cc170ecf9-kube-api-access-pw29d\") on node \"crc\" DevicePath \"\"" Nov 22 03:00:02 crc kubenswrapper[4922]: I1122 03:00:02.865058 4922 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c756ee80-4d4a-4877-8849-7f5cc170ecf9-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 22 03:00:02 crc kubenswrapper[4922]: I1122 03:00:02.865075 4922 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c756ee80-4d4a-4877-8849-7f5cc170ecf9-config-volume\") on node \"crc\" DevicePath \"\"" Nov 22 03:00:03 crc kubenswrapper[4922]: I1122 03:00:03.275407 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396340-k9lhq" event={"ID":"c756ee80-4d4a-4877-8849-7f5cc170ecf9","Type":"ContainerDied","Data":"8a0b78204f2197449672bc968b4b9bccbd704333b2866c18411a0b50656b2f75"} Nov 22 03:00:03 crc kubenswrapper[4922]: I1122 03:00:03.275475 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a0b78204f2197449672bc968b4b9bccbd704333b2866c18411a0b50656b2f75" Nov 22 03:00:03 crc kubenswrapper[4922]: I1122 03:00:03.275507 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396340-k9lhq" Nov 22 03:01:10 crc kubenswrapper[4922]: I1122 03:01:10.098015 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-lxrm5"] Nov 22 03:01:10 crc kubenswrapper[4922]: E1122 03:01:10.099246 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c756ee80-4d4a-4877-8849-7f5cc170ecf9" containerName="collect-profiles" Nov 22 03:01:10 crc kubenswrapper[4922]: I1122 03:01:10.099265 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="c756ee80-4d4a-4877-8849-7f5cc170ecf9" containerName="collect-profiles" Nov 22 03:01:10 crc kubenswrapper[4922]: I1122 03:01:10.099386 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="c756ee80-4d4a-4877-8849-7f5cc170ecf9" containerName="collect-profiles" Nov 22 03:01:10 crc kubenswrapper[4922]: I1122 03:01:10.099991 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-lxrm5" Nov 22 03:01:10 crc kubenswrapper[4922]: I1122 03:01:10.112027 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-lxrm5"] Nov 22 03:01:10 crc kubenswrapper[4922]: I1122 03:01:10.249435 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nr99\" (UniqueName: \"kubernetes.io/projected/85ddc006-3f33-4ba1-9c4c-7ba9e9f0e3f8-kube-api-access-2nr99\") pod \"image-registry-66df7c8f76-lxrm5\" (UID: \"85ddc006-3f33-4ba1-9c4c-7ba9e9f0e3f8\") " pod="openshift-image-registry/image-registry-66df7c8f76-lxrm5" Nov 22 03:01:10 crc kubenswrapper[4922]: I1122 03:01:10.249491 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/85ddc006-3f33-4ba1-9c4c-7ba9e9f0e3f8-bound-sa-token\") pod \"image-registry-66df7c8f76-lxrm5\" (UID: \"85ddc006-3f33-4ba1-9c4c-7ba9e9f0e3f8\") " pod="openshift-image-registry/image-registry-66df7c8f76-lxrm5" Nov 22 03:01:10 crc kubenswrapper[4922]: I1122 03:01:10.249519 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/85ddc006-3f33-4ba1-9c4c-7ba9e9f0e3f8-registry-certificates\") pod \"image-registry-66df7c8f76-lxrm5\" (UID: \"85ddc006-3f33-4ba1-9c4c-7ba9e9f0e3f8\") " pod="openshift-image-registry/image-registry-66df7c8f76-lxrm5" Nov 22 03:01:10 crc kubenswrapper[4922]: I1122 03:01:10.249603 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/85ddc006-3f33-4ba1-9c4c-7ba9e9f0e3f8-ca-trust-extracted\") pod \"image-registry-66df7c8f76-lxrm5\" (UID: \"85ddc006-3f33-4ba1-9c4c-7ba9e9f0e3f8\") " pod="openshift-image-registry/image-registry-66df7c8f76-lxrm5" Nov 22 03:01:10 crc kubenswrapper[4922]: I1122 03:01:10.249670 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-lxrm5\" (UID: \"85ddc006-3f33-4ba1-9c4c-7ba9e9f0e3f8\") " pod="openshift-image-registry/image-registry-66df7c8f76-lxrm5" Nov 22 03:01:10 crc kubenswrapper[4922]: I1122 03:01:10.249705 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/85ddc006-3f33-4ba1-9c4c-7ba9e9f0e3f8-trusted-ca\") pod \"image-registry-66df7c8f76-lxrm5\" (UID: \"85ddc006-3f33-4ba1-9c4c-7ba9e9f0e3f8\") " pod="openshift-image-registry/image-registry-66df7c8f76-lxrm5" Nov 22 03:01:10 crc kubenswrapper[4922]: I1122 03:01:10.249734 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/85ddc006-3f33-4ba1-9c4c-7ba9e9f0e3f8-registry-tls\") pod \"image-registry-66df7c8f76-lxrm5\" (UID: \"85ddc006-3f33-4ba1-9c4c-7ba9e9f0e3f8\") " pod="openshift-image-registry/image-registry-66df7c8f76-lxrm5" Nov 22 03:01:10 crc kubenswrapper[4922]: I1122 03:01:10.249781 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/85ddc006-3f33-4ba1-9c4c-7ba9e9f0e3f8-installation-pull-secrets\") pod \"image-registry-66df7c8f76-lxrm5\" (UID: \"85ddc006-3f33-4ba1-9c4c-7ba9e9f0e3f8\") " pod="openshift-image-registry/image-registry-66df7c8f76-lxrm5" Nov 22 03:01:10 crc kubenswrapper[4922]: I1122 03:01:10.275070 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-lxrm5\" (UID: \"85ddc006-3f33-4ba1-9c4c-7ba9e9f0e3f8\") " pod="openshift-image-registry/image-registry-66df7c8f76-lxrm5" Nov 22 03:01:10 crc kubenswrapper[4922]: I1122 03:01:10.352181 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/85ddc006-3f33-4ba1-9c4c-7ba9e9f0e3f8-installation-pull-secrets\") pod \"image-registry-66df7c8f76-lxrm5\" (UID: \"85ddc006-3f33-4ba1-9c4c-7ba9e9f0e3f8\") " pod="openshift-image-registry/image-registry-66df7c8f76-lxrm5" Nov 22 03:01:10 crc kubenswrapper[4922]: I1122 03:01:10.352333 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nr99\" (UniqueName: \"kubernetes.io/projected/85ddc006-3f33-4ba1-9c4c-7ba9e9f0e3f8-kube-api-access-2nr99\") pod \"image-registry-66df7c8f76-lxrm5\" (UID: \"85ddc006-3f33-4ba1-9c4c-7ba9e9f0e3f8\") " pod="openshift-image-registry/image-registry-66df7c8f76-lxrm5" Nov 22 03:01:10 crc kubenswrapper[4922]: I1122 03:01:10.352398 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/85ddc006-3f33-4ba1-9c4c-7ba9e9f0e3f8-bound-sa-token\") pod \"image-registry-66df7c8f76-lxrm5\" (UID: \"85ddc006-3f33-4ba1-9c4c-7ba9e9f0e3f8\") " pod="openshift-image-registry/image-registry-66df7c8f76-lxrm5" Nov 22 03:01:10 crc kubenswrapper[4922]: I1122 03:01:10.352472 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/85ddc006-3f33-4ba1-9c4c-7ba9e9f0e3f8-registry-certificates\") pod \"image-registry-66df7c8f76-lxrm5\" (UID: \"85ddc006-3f33-4ba1-9c4c-7ba9e9f0e3f8\") " pod="openshift-image-registry/image-registry-66df7c8f76-lxrm5" Nov 22 03:01:10 crc kubenswrapper[4922]: I1122 03:01:10.352565 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/85ddc006-3f33-4ba1-9c4c-7ba9e9f0e3f8-ca-trust-extracted\") pod \"image-registry-66df7c8f76-lxrm5\" (UID: \"85ddc006-3f33-4ba1-9c4c-7ba9e9f0e3f8\") " pod="openshift-image-registry/image-registry-66df7c8f76-lxrm5" Nov 22 03:01:10 crc kubenswrapper[4922]: I1122 03:01:10.352649 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/85ddc006-3f33-4ba1-9c4c-7ba9e9f0e3f8-trusted-ca\") pod \"image-registry-66df7c8f76-lxrm5\" (UID: \"85ddc006-3f33-4ba1-9c4c-7ba9e9f0e3f8\") " pod="openshift-image-registry/image-registry-66df7c8f76-lxrm5" Nov 22 03:01:10 crc kubenswrapper[4922]: I1122 03:01:10.352737 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/85ddc006-3f33-4ba1-9c4c-7ba9e9f0e3f8-registry-tls\") pod \"image-registry-66df7c8f76-lxrm5\" (UID: \"85ddc006-3f33-4ba1-9c4c-7ba9e9f0e3f8\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-lxrm5" Nov 22 03:01:10 crc kubenswrapper[4922]: I1122 03:01:10.353793 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/85ddc006-3f33-4ba1-9c4c-7ba9e9f0e3f8-ca-trust-extracted\") pod \"image-registry-66df7c8f76-lxrm5\" (UID: \"85ddc006-3f33-4ba1-9c4c-7ba9e9f0e3f8\") " pod="openshift-image-registry/image-registry-66df7c8f76-lxrm5" Nov 22 03:01:10 crc kubenswrapper[4922]: I1122 03:01:10.354764 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/85ddc006-3f33-4ba1-9c4c-7ba9e9f0e3f8-trusted-ca\") pod \"image-registry-66df7c8f76-lxrm5\" (UID: \"85ddc006-3f33-4ba1-9c4c-7ba9e9f0e3f8\") " pod="openshift-image-registry/image-registry-66df7c8f76-lxrm5" Nov 22 03:01:10 crc kubenswrapper[4922]: I1122 03:01:10.354818 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/85ddc006-3f33-4ba1-9c4c-7ba9e9f0e3f8-registry-certificates\") pod \"image-registry-66df7c8f76-lxrm5\" (UID: \"85ddc006-3f33-4ba1-9c4c-7ba9e9f0e3f8\") " pod="openshift-image-registry/image-registry-66df7c8f76-lxrm5" Nov 22 03:01:10 crc kubenswrapper[4922]: I1122 03:01:10.361094 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/85ddc006-3f33-4ba1-9c4c-7ba9e9f0e3f8-registry-tls\") pod \"image-registry-66df7c8f76-lxrm5\" (UID: \"85ddc006-3f33-4ba1-9c4c-7ba9e9f0e3f8\") " pod="openshift-image-registry/image-registry-66df7c8f76-lxrm5" Nov 22 03:01:10 crc kubenswrapper[4922]: I1122 03:01:10.374073 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/85ddc006-3f33-4ba1-9c4c-7ba9e9f0e3f8-installation-pull-secrets\") pod \"image-registry-66df7c8f76-lxrm5\" (UID: \"85ddc006-3f33-4ba1-9c4c-7ba9e9f0e3f8\") " pod="openshift-image-registry/image-registry-66df7c8f76-lxrm5" Nov 22 03:01:10 crc kubenswrapper[4922]: I1122 03:01:10.379204 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nr99\" (UniqueName: \"kubernetes.io/projected/85ddc006-3f33-4ba1-9c4c-7ba9e9f0e3f8-kube-api-access-2nr99\") pod \"image-registry-66df7c8f76-lxrm5\" (UID: \"85ddc006-3f33-4ba1-9c4c-7ba9e9f0e3f8\") " pod="openshift-image-registry/image-registry-66df7c8f76-lxrm5" Nov 22 03:01:10 crc kubenswrapper[4922]: I1122 03:01:10.385527 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/85ddc006-3f33-4ba1-9c4c-7ba9e9f0e3f8-bound-sa-token\") pod \"image-registry-66df7c8f76-lxrm5\" (UID: \"85ddc006-3f33-4ba1-9c4c-7ba9e9f0e3f8\") " pod="openshift-image-registry/image-registry-66df7c8f76-lxrm5" Nov 22 03:01:10 crc kubenswrapper[4922]: I1122 03:01:10.420816 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-lxrm5" Nov 22 03:01:10 crc kubenswrapper[4922]: I1122 03:01:10.707100 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-lxrm5"] Nov 22 03:01:10 crc kubenswrapper[4922]: I1122 03:01:10.801235 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-lxrm5" event={"ID":"85ddc006-3f33-4ba1-9c4c-7ba9e9f0e3f8","Type":"ContainerStarted","Data":"be9bd109a93b4e808a16f44f73ff20338435a13df4a8f7b8dc8829c807f6812d"} Nov 22 03:01:11 crc kubenswrapper[4922]: I1122 03:01:11.109945 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 03:01:11 crc kubenswrapper[4922]: I1122 03:01:11.110063 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 03:01:11 crc kubenswrapper[4922]: I1122 03:01:11.809130 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-lxrm5" event={"ID":"85ddc006-3f33-4ba1-9c4c-7ba9e9f0e3f8","Type":"ContainerStarted","Data":"5babb40b15536d1c512c1e935ed2b48acfdaaef2ba3881d726cd8495b8201e35"} Nov 22 03:01:11 crc kubenswrapper[4922]: I1122 03:01:11.809775 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-lxrm5" Nov 22 03:01:11 crc kubenswrapper[4922]: I1122 03:01:11.838483 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-lxrm5" podStartSLOduration=1.838461804 podStartE2EDuration="1.838461804s" podCreationTimestamp="2025-11-22 03:01:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:01:11.835552853 +0000 UTC m=+507.874074745" watchObservedRunningTime="2025-11-22 03:01:11.838461804 +0000 UTC m=+507.876983696" Nov 22 03:01:30 crc kubenswrapper[4922]: I1122 03:01:30.433179 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-lxrm5" Nov 22 03:01:30 crc kubenswrapper[4922]: I1122 03:01:30.509969 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-5hhlh"] Nov 22 03:01:41 crc kubenswrapper[4922]: I1122 03:01:41.110015 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 03:01:41 crc kubenswrapper[4922]: I1122 03:01:41.110769 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Nov 22 03:01:55 crc kubenswrapper[4922]: I1122 03:01:55.563352 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" podUID="14638a05-727a-441a-88f2-f9750aa17a39" containerName="registry" containerID="cri-o://b50b0239e7aa515074294a49f98c23e1c0701b80c27f05f9002add1d1733683f" gracePeriod=30 Nov 22 03:01:56 crc kubenswrapper[4922]: I1122 03:01:56.008816 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 03:01:56 crc kubenswrapper[4922]: I1122 03:01:56.092574 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/14638a05-727a-441a-88f2-f9750aa17a39-ca-trust-extracted\") pod \"14638a05-727a-441a-88f2-f9750aa17a39\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " Nov 22 03:01:56 crc kubenswrapper[4922]: I1122 03:01:56.092665 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/14638a05-727a-441a-88f2-f9750aa17a39-bound-sa-token\") pod \"14638a05-727a-441a-88f2-f9750aa17a39\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " Nov 22 03:01:56 crc kubenswrapper[4922]: I1122 03:01:56.092734 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/14638a05-727a-441a-88f2-f9750aa17a39-installation-pull-secrets\") pod \"14638a05-727a-441a-88f2-f9750aa17a39\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " Nov 22 03:01:56 crc kubenswrapper[4922]: I1122 03:01:56.092775 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2r6j\" (UniqueName: \"kubernetes.io/projected/14638a05-727a-441a-88f2-f9750aa17a39-kube-api-access-x2r6j\") pod \"14638a05-727a-441a-88f2-f9750aa17a39\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " Nov 22 03:01:56 crc kubenswrapper[4922]: I1122 03:01:56.092817 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/14638a05-727a-441a-88f2-f9750aa17a39-registry-certificates\") pod \"14638a05-727a-441a-88f2-f9750aa17a39\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " Nov 22 03:01:56 crc kubenswrapper[4922]: I1122 03:01:56.093159 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14638a05-727a-441a-88f2-f9750aa17a39-registry-tls\") pod \"14638a05-727a-441a-88f2-f9750aa17a39\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " Nov 22 03:01:56 crc kubenswrapper[4922]: I1122 03:01:56.093469 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"14638a05-727a-441a-88f2-f9750aa17a39\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " Nov 22 03:01:56 crc kubenswrapper[4922]: I1122 03:01:56.093558 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/14638a05-727a-441a-88f2-f9750aa17a39-trusted-ca\") pod \"14638a05-727a-441a-88f2-f9750aa17a39\" (UID: \"14638a05-727a-441a-88f2-f9750aa17a39\") " Nov 22 03:01:56 crc 
kubenswrapper[4922]: I1122 03:01:56.095243 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14638a05-727a-441a-88f2-f9750aa17a39-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "14638a05-727a-441a-88f2-f9750aa17a39" (UID: "14638a05-727a-441a-88f2-f9750aa17a39"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:01:56 crc kubenswrapper[4922]: I1122 03:01:56.097137 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14638a05-727a-441a-88f2-f9750aa17a39-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "14638a05-727a-441a-88f2-f9750aa17a39" (UID: "14638a05-727a-441a-88f2-f9750aa17a39"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:01:56 crc kubenswrapper[4922]: I1122 03:01:56.104634 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14638a05-727a-441a-88f2-f9750aa17a39-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "14638a05-727a-441a-88f2-f9750aa17a39" (UID: "14638a05-727a-441a-88f2-f9750aa17a39"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:01:56 crc kubenswrapper[4922]: I1122 03:01:56.105265 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14638a05-727a-441a-88f2-f9750aa17a39-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "14638a05-727a-441a-88f2-f9750aa17a39" (UID: "14638a05-727a-441a-88f2-f9750aa17a39"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:01:56 crc kubenswrapper[4922]: I1122 03:01:56.105340 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14638a05-727a-441a-88f2-f9750aa17a39-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "14638a05-727a-441a-88f2-f9750aa17a39" (UID: "14638a05-727a-441a-88f2-f9750aa17a39"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:01:56 crc kubenswrapper[4922]: I1122 03:01:56.106420 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14638a05-727a-441a-88f2-f9750aa17a39-kube-api-access-x2r6j" (OuterVolumeSpecName: "kube-api-access-x2r6j") pod "14638a05-727a-441a-88f2-f9750aa17a39" (UID: "14638a05-727a-441a-88f2-f9750aa17a39"). InnerVolumeSpecName "kube-api-access-x2r6j". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:01:56 crc kubenswrapper[4922]: I1122 03:01:56.115203 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "14638a05-727a-441a-88f2-f9750aa17a39" (UID: "14638a05-727a-441a-88f2-f9750aa17a39"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 22 03:01:56 crc kubenswrapper[4922]: I1122 03:01:56.128449 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14638a05-727a-441a-88f2-f9750aa17a39-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "14638a05-727a-441a-88f2-f9750aa17a39" (UID: "14638a05-727a-441a-88f2-f9750aa17a39"). 
InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:01:56 crc kubenswrapper[4922]: I1122 03:01:56.139772 4922 generic.go:334] "Generic (PLEG): container finished" podID="14638a05-727a-441a-88f2-f9750aa17a39" containerID="b50b0239e7aa515074294a49f98c23e1c0701b80c27f05f9002add1d1733683f" exitCode=0 Nov 22 03:01:56 crc kubenswrapper[4922]: I1122 03:01:56.139869 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" event={"ID":"14638a05-727a-441a-88f2-f9750aa17a39","Type":"ContainerDied","Data":"b50b0239e7aa515074294a49f98c23e1c0701b80c27f05f9002add1d1733683f"} Nov 22 03:01:56 crc kubenswrapper[4922]: I1122 03:01:56.139922 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" event={"ID":"14638a05-727a-441a-88f2-f9750aa17a39","Type":"ContainerDied","Data":"cf78329aa32e055fc46de43fdfc0c9dc23b3d4b1ab33ecb87d23043a2f3a4c5c"} Nov 22 03:01:56 crc kubenswrapper[4922]: I1122 03:01:56.139953 4922 scope.go:117] "RemoveContainer" containerID="b50b0239e7aa515074294a49f98c23e1c0701b80c27f05f9002add1d1733683f" Nov 22 03:01:56 crc kubenswrapper[4922]: I1122 03:01:56.140128 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-5hhlh" Nov 22 03:01:56 crc kubenswrapper[4922]: I1122 03:01:56.169556 4922 scope.go:117] "RemoveContainer" containerID="b50b0239e7aa515074294a49f98c23e1c0701b80c27f05f9002add1d1733683f" Nov 22 03:01:56 crc kubenswrapper[4922]: E1122 03:01:56.170150 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b50b0239e7aa515074294a49f98c23e1c0701b80c27f05f9002add1d1733683f\": container with ID starting with b50b0239e7aa515074294a49f98c23e1c0701b80c27f05f9002add1d1733683f not found: ID does not exist" containerID="b50b0239e7aa515074294a49f98c23e1c0701b80c27f05f9002add1d1733683f" Nov 22 03:01:56 crc kubenswrapper[4922]: I1122 03:01:56.170200 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b50b0239e7aa515074294a49f98c23e1c0701b80c27f05f9002add1d1733683f"} err="failed to get container status \"b50b0239e7aa515074294a49f98c23e1c0701b80c27f05f9002add1d1733683f\": rpc error: code = NotFound desc = could not find container \"b50b0239e7aa515074294a49f98c23e1c0701b80c27f05f9002add1d1733683f\": container with ID starting with b50b0239e7aa515074294a49f98c23e1c0701b80c27f05f9002add1d1733683f not found: ID does not exist" Nov 22 03:01:56 crc kubenswrapper[4922]: I1122 03:01:56.189377 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-5hhlh"] Nov 22 03:01:56 crc kubenswrapper[4922]: I1122 03:01:56.195176 4922 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14638a05-727a-441a-88f2-f9750aa17a39-registry-tls\") on node \"crc\" DevicePath \"\"" Nov 22 03:01:56 crc kubenswrapper[4922]: I1122 03:01:56.195199 4922 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/14638a05-727a-441a-88f2-f9750aa17a39-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 22 03:01:56 crc kubenswrapper[4922]: I1122 03:01:56.195210 4922 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: 
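
Annotation: the NotFound errors above are a benign race, not a cleanup failure. The registry container was already removed when the second RemoveContainer pass ran, so the status lookup fails with "ID does not exist" and the kubelet logs the error and moves on: the goal state (container absent) is reached either way. The underlying pattern is idempotent deletion, sketched below (NotFoundError and the runtime handle are stand-ins, not the CRI API):

    class NotFoundError(Exception):
        """Stand-in for the runtime's NotFound status code."""

    def remove_container(runtime, container_id):
        # Treat "already gone" as success, mirroring how the kubelet above
        # logs the NotFound and continues instead of failing the cleanup.
        try:
            runtime.remove(container_id)
        except NotFoundError:
            pass
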
\"kubernetes.io/empty-dir/14638a05-727a-441a-88f2-f9750aa17a39-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Nov 22 03:01:56 crc kubenswrapper[4922]: I1122 03:01:56.195220 4922 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/14638a05-727a-441a-88f2-f9750aa17a39-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Nov 22 03:01:56 crc kubenswrapper[4922]: I1122 03:01:56.195232 4922 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/14638a05-727a-441a-88f2-f9750aa17a39-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 22 03:01:56 crc kubenswrapper[4922]: I1122 03:01:56.195244 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2r6j\" (UniqueName: \"kubernetes.io/projected/14638a05-727a-441a-88f2-f9750aa17a39-kube-api-access-x2r6j\") on node \"crc\" DevicePath \"\"" Nov 22 03:01:56 crc kubenswrapper[4922]: I1122 03:01:56.195253 4922 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/14638a05-727a-441a-88f2-f9750aa17a39-registry-certificates\") on node \"crc\" DevicePath \"\"" Nov 22 03:01:56 crc kubenswrapper[4922]: I1122 03:01:56.196161 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-5hhlh"] Nov 22 03:01:57 crc kubenswrapper[4922]: I1122 03:01:57.315396 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14638a05-727a-441a-88f2-f9750aa17a39" path="/var/lib/kubelet/pods/14638a05-727a-441a-88f2-f9750aa17a39/volumes" Nov 22 03:02:11 crc kubenswrapper[4922]: I1122 03:02:11.109913 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 03:02:11 crc kubenswrapper[4922]: I1122 03:02:11.111057 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 03:02:11 crc kubenswrapper[4922]: I1122 03:02:11.111179 4922 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" Nov 22 03:02:11 crc kubenswrapper[4922]: I1122 03:02:11.122585 4922 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"04bb324b7ce09f87535d27cd3f61d191662bc844fa3e27e67f963dd00d9c92fb"} pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 03:02:11 crc kubenswrapper[4922]: I1122 03:02:11.122755 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" containerID="cri-o://04bb324b7ce09f87535d27cd3f61d191662bc844fa3e27e67f963dd00d9c92fb" gracePeriod=600 Nov 22 03:02:11 crc kubenswrapper[4922]: I1122 03:02:11.300175 4922 generic.go:334] "Generic (PLEG): 
container finished" podID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerID="04bb324b7ce09f87535d27cd3f61d191662bc844fa3e27e67f963dd00d9c92fb" exitCode=0 Nov 22 03:02:11 crc kubenswrapper[4922]: I1122 03:02:11.312659 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" event={"ID":"402683b1-a29f-4a79-a36c-daf6e8068d0d","Type":"ContainerDied","Data":"04bb324b7ce09f87535d27cd3f61d191662bc844fa3e27e67f963dd00d9c92fb"} Nov 22 03:02:11 crc kubenswrapper[4922]: I1122 03:02:11.312746 4922 scope.go:117] "RemoveContainer" containerID="5b8b749cd737c0f52e1c945ff82e807bb12860b0ce616473b37087ca3334ae08" Nov 22 03:02:12 crc kubenswrapper[4922]: I1122 03:02:12.312266 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" event={"ID":"402683b1-a29f-4a79-a36c-daf6e8068d0d","Type":"ContainerStarted","Data":"3adf25f358b8b1181f3ac3e402fdb1299c491a8f833369cdc996bbafe94e841c"} Nov 22 03:02:25 crc kubenswrapper[4922]: I1122 03:02:25.509558 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-gndzc"] Nov 22 03:02:25 crc kubenswrapper[4922]: E1122 03:02:25.510711 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14638a05-727a-441a-88f2-f9750aa17a39" containerName="registry" Nov 22 03:02:25 crc kubenswrapper[4922]: I1122 03:02:25.510732 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="14638a05-727a-441a-88f2-f9750aa17a39" containerName="registry" Nov 22 03:02:25 crc kubenswrapper[4922]: I1122 03:02:25.510897 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="14638a05-727a-441a-88f2-f9750aa17a39" containerName="registry" Nov 22 03:02:25 crc kubenswrapper[4922]: I1122 03:02:25.511414 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-gndzc" Nov 22 03:02:25 crc kubenswrapper[4922]: I1122 03:02:25.513583 4922 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-d7r4c" Nov 22 03:02:25 crc kubenswrapper[4922]: I1122 03:02:25.513980 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Nov 22 03:02:25 crc kubenswrapper[4922]: I1122 03:02:25.516960 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Nov 22 03:02:25 crc kubenswrapper[4922]: I1122 03:02:25.522748 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-gndzc"] Nov 22 03:02:25 crc kubenswrapper[4922]: I1122 03:02:25.534086 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-zm8rg"] Nov 22 03:02:25 crc kubenswrapper[4922]: I1122 03:02:25.540014 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-zm8rg" Nov 22 03:02:25 crc kubenswrapper[4922]: I1122 03:02:25.544716 4922 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-lwdtz" Nov 22 03:02:25 crc kubenswrapper[4922]: I1122 03:02:25.553313 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-x94nm"] Nov 22 03:02:25 crc kubenswrapper[4922]: I1122 03:02:25.554399 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-x94nm" Nov 22 03:02:25 crc kubenswrapper[4922]: I1122 03:02:25.556494 4922 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-b2x44" Nov 22 03:02:25 crc kubenswrapper[4922]: I1122 03:02:25.582153 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qsj6\" (UniqueName: \"kubernetes.io/projected/45c21c53-9955-4d09-8b9c-668a96ecab5a-kube-api-access-4qsj6\") pod \"cert-manager-5b446d88c5-zm8rg\" (UID: \"45c21c53-9955-4d09-8b9c-668a96ecab5a\") " pod="cert-manager/cert-manager-5b446d88c5-zm8rg" Nov 22 03:02:25 crc kubenswrapper[4922]: I1122 03:02:25.603231 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-zm8rg"] Nov 22 03:02:25 crc kubenswrapper[4922]: I1122 03:02:25.612172 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-x94nm"] Nov 22 03:02:25 crc kubenswrapper[4922]: I1122 03:02:25.683158 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qsj6\" (UniqueName: \"kubernetes.io/projected/45c21c53-9955-4d09-8b9c-668a96ecab5a-kube-api-access-4qsj6\") pod \"cert-manager-5b446d88c5-zm8rg\" (UID: \"45c21c53-9955-4d09-8b9c-668a96ecab5a\") " pod="cert-manager/cert-manager-5b446d88c5-zm8rg" Nov 22 03:02:25 crc kubenswrapper[4922]: I1122 03:02:25.683218 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgkfd\" (UniqueName: \"kubernetes.io/projected/451f8b82-3178-4e4c-b134-32bea43520e0-kube-api-access-qgkfd\") pod \"cert-manager-webhook-5655c58dd6-x94nm\" (UID: \"451f8b82-3178-4e4c-b134-32bea43520e0\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-x94nm" Nov 22 03:02:25 crc kubenswrapper[4922]: I1122 03:02:25.683241 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5swv\" (UniqueName: \"kubernetes.io/projected/b14967da-15c3-4419-93fb-5bbc85265835-kube-api-access-q5swv\") pod \"cert-manager-cainjector-7f985d654d-gndzc\" (UID: \"b14967da-15c3-4419-93fb-5bbc85265835\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-gndzc" Nov 22 03:02:25 crc kubenswrapper[4922]: I1122 03:02:25.717130 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qsj6\" (UniqueName: \"kubernetes.io/projected/45c21c53-9955-4d09-8b9c-668a96ecab5a-kube-api-access-4qsj6\") pod \"cert-manager-5b446d88c5-zm8rg\" (UID: \"45c21c53-9955-4d09-8b9c-668a96ecab5a\") " pod="cert-manager/cert-manager-5b446d88c5-zm8rg" Nov 22 03:02:25 crc kubenswrapper[4922]: I1122 03:02:25.784386 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgkfd\" (UniqueName: \"kubernetes.io/projected/451f8b82-3178-4e4c-b134-32bea43520e0-kube-api-access-qgkfd\") pod \"cert-manager-webhook-5655c58dd6-x94nm\" (UID: \"451f8b82-3178-4e4c-b134-32bea43520e0\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-x94nm" Nov 22 03:02:25 crc kubenswrapper[4922]: I1122 03:02:25.784562 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5swv\" (UniqueName: \"kubernetes.io/projected/b14967da-15c3-4419-93fb-5bbc85265835-kube-api-access-q5swv\") pod \"cert-manager-cainjector-7f985d654d-gndzc\" (UID: \"b14967da-15c3-4419-93fb-5bbc85265835\") " 
pod="cert-manager/cert-manager-cainjector-7f985d654d-gndzc" Nov 22 03:02:25 crc kubenswrapper[4922]: I1122 03:02:25.802033 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5swv\" (UniqueName: \"kubernetes.io/projected/b14967da-15c3-4419-93fb-5bbc85265835-kube-api-access-q5swv\") pod \"cert-manager-cainjector-7f985d654d-gndzc\" (UID: \"b14967da-15c3-4419-93fb-5bbc85265835\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-gndzc" Nov 22 03:02:25 crc kubenswrapper[4922]: I1122 03:02:25.806976 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgkfd\" (UniqueName: \"kubernetes.io/projected/451f8b82-3178-4e4c-b134-32bea43520e0-kube-api-access-qgkfd\") pod \"cert-manager-webhook-5655c58dd6-x94nm\" (UID: \"451f8b82-3178-4e4c-b134-32bea43520e0\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-x94nm" Nov 22 03:02:25 crc kubenswrapper[4922]: I1122 03:02:25.836228 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-gndzc" Nov 22 03:02:25 crc kubenswrapper[4922]: I1122 03:02:25.870380 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-zm8rg" Nov 22 03:02:25 crc kubenswrapper[4922]: I1122 03:02:25.886965 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-x94nm" Nov 22 03:02:26 crc kubenswrapper[4922]: I1122 03:02:26.135282 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-zm8rg"] Nov 22 03:02:26 crc kubenswrapper[4922]: I1122 03:02:26.146612 4922 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 03:02:26 crc kubenswrapper[4922]: I1122 03:02:26.378895 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-gndzc"] Nov 22 03:02:26 crc kubenswrapper[4922]: W1122 03:02:26.388424 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb14967da_15c3_4419_93fb_5bbc85265835.slice/crio-edc6ad005fef0a1bfb947100d95372659f058cc37f50558f52a2cd26bfa5d2c4 WatchSource:0}: Error finding container edc6ad005fef0a1bfb947100d95372659f058cc37f50558f52a2cd26bfa5d2c4: Status 404 returned error can't find the container with id edc6ad005fef0a1bfb947100d95372659f058cc37f50558f52a2cd26bfa5d2c4 Nov 22 03:02:26 crc kubenswrapper[4922]: I1122 03:02:26.391215 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-x94nm"] Nov 22 03:02:26 crc kubenswrapper[4922]: W1122 03:02:26.392653 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod451f8b82_3178_4e4c_b134_32bea43520e0.slice/crio-fbc03487b7fe4925c8135b51b74f593adc1277ec807826f26094643b0c4c521f WatchSource:0}: Error finding container fbc03487b7fe4925c8135b51b74f593adc1277ec807826f26094643b0c4c521f: Status 404 returned error can't find the container with id fbc03487b7fe4925c8135b51b74f593adc1277ec807826f26094643b0c4c521f Nov 22 03:02:26 crc kubenswrapper[4922]: I1122 03:02:26.421826 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-zm8rg" 
event={"ID":"45c21c53-9955-4d09-8b9c-668a96ecab5a","Type":"ContainerStarted","Data":"18c764bc9a8d148eb45eccd5c069c6fe65ed7674c0cf13ec7468b27f21d99d7f"} Nov 22 03:02:26 crc kubenswrapper[4922]: I1122 03:02:26.424080 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-gndzc" event={"ID":"b14967da-15c3-4419-93fb-5bbc85265835","Type":"ContainerStarted","Data":"edc6ad005fef0a1bfb947100d95372659f058cc37f50558f52a2cd26bfa5d2c4"} Nov 22 03:02:26 crc kubenswrapper[4922]: I1122 03:02:26.425218 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-x94nm" event={"ID":"451f8b82-3178-4e4c-b134-32bea43520e0","Type":"ContainerStarted","Data":"fbc03487b7fe4925c8135b51b74f593adc1277ec807826f26094643b0c4c521f"} Nov 22 03:02:30 crc kubenswrapper[4922]: I1122 03:02:30.476149 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-x94nm" Nov 22 03:02:30 crc kubenswrapper[4922]: I1122 03:02:30.499990 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-x94nm" podStartSLOduration=1.7488525830000001 podStartE2EDuration="5.499859283s" podCreationTimestamp="2025-11-22 03:02:25 +0000 UTC" firstStartedPulling="2025-11-22 03:02:26.397651267 +0000 UTC m=+582.436173179" lastFinishedPulling="2025-11-22 03:02:30.148657977 +0000 UTC m=+586.187179879" observedRunningTime="2025-11-22 03:02:30.497765283 +0000 UTC m=+586.536287205" watchObservedRunningTime="2025-11-22 03:02:30.499859283 +0000 UTC m=+586.538381185" Nov 22 03:02:31 crc kubenswrapper[4922]: I1122 03:02:31.483377 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-gndzc" event={"ID":"b14967da-15c3-4419-93fb-5bbc85265835","Type":"ContainerStarted","Data":"dffbb068e832fc4bff07e410ee8ba999ffe794fddfb7a1990f12f33929e3b909"} Nov 22 03:02:31 crc kubenswrapper[4922]: I1122 03:02:31.486021 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-x94nm" event={"ID":"451f8b82-3178-4e4c-b134-32bea43520e0","Type":"ContainerStarted","Data":"52838c71e3bc13e8a25af125b781a5ff67211c8b4179e448a051a4be89469c86"} Nov 22 03:02:31 crc kubenswrapper[4922]: I1122 03:02:31.488550 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-zm8rg" event={"ID":"45c21c53-9955-4d09-8b9c-668a96ecab5a","Type":"ContainerStarted","Data":"83c47be8408eadaffe218ae1b29c3db06eb981ec651f639e024ebb533df0ab8f"} Nov 22 03:02:31 crc kubenswrapper[4922]: I1122 03:02:31.507392 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-gndzc" podStartSLOduration=2.653215013 podStartE2EDuration="6.507358923s" podCreationTimestamp="2025-11-22 03:02:25 +0000 UTC" firstStartedPulling="2025-11-22 03:02:26.391366486 +0000 UTC m=+582.429888368" lastFinishedPulling="2025-11-22 03:02:30.245510346 +0000 UTC m=+586.284032278" observedRunningTime="2025-11-22 03:02:31.506103462 +0000 UTC m=+587.544625394" watchObservedRunningTime="2025-11-22 03:02:31.507358923 +0000 UTC m=+587.545880855" Nov 22 03:02:31 crc kubenswrapper[4922]: I1122 03:02:31.534036 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-zm8rg" podStartSLOduration=2.530363007 podStartE2EDuration="6.533999133s" podCreationTimestamp="2025-11-22 03:02:25 +0000 
UTC" firstStartedPulling="2025-11-22 03:02:26.146081987 +0000 UTC m=+582.184603889" lastFinishedPulling="2025-11-22 03:02:30.149718113 +0000 UTC m=+586.188240015" observedRunningTime="2025-11-22 03:02:31.526443602 +0000 UTC m=+587.564965534" watchObservedRunningTime="2025-11-22 03:02:31.533999133 +0000 UTC m=+587.572521085" Nov 22 03:02:35 crc kubenswrapper[4922]: I1122 03:02:35.891304 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-x94nm" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.151192 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-c7wvg"] Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.152111 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" podUID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerName="ovn-controller" containerID="cri-o://18ceb46477edc2737cbdc850d6387ba175a07e85e69c988f91fc93def44fa1bf" gracePeriod=30 Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.152205 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" podUID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerName="nbdb" containerID="cri-o://091e9164e566847ab0605abc3b8f32027bcccff041c48fbf251d28690dcc132d" gracePeriod=30 Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.152347 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" podUID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerName="northd" containerID="cri-o://5603a0776e71297b57ae6a438254d6d8abb317252617fbe3b28707e9f96424d7" gracePeriod=30 Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.152431 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" podUID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://cd701fe5abb8c240fb633df69fbda2e7fb1ac4c76195dae8dfd5b1db811c1d5f" gracePeriod=30 Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.152494 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" podUID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerName="kube-rbac-proxy-node" containerID="cri-o://f9e0fc46c7e9bbed62a72d78126ea7ffcb4de50dc048c8c0fd90bd0b0d8c718f" gracePeriod=30 Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.152571 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" podUID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerName="ovn-acl-logging" containerID="cri-o://7854642894dce2837e59481c400522a24a2e66c3f9eb521a370e85b203288af9" gracePeriod=30 Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.153034 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" podUID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerName="sbdb" containerID="cri-o://183f71d77b7a1768eee27187f0c1e4a893f5661fa42cf671332e081118ae7dea" gracePeriod=30 Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.230758 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" podUID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerName="ovnkube-controller" 
containerID="cri-o://d98e05ccaa5607d33059defd7c078c35da084cd8b2bcfc6c3decc87d3023fbba" gracePeriod=30 Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.517589 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c7wvg_c2a6bcd8-bb13-463b-b112-0df3cf90b5f7/ovnkube-controller/3.log" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.525121 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c7wvg_c2a6bcd8-bb13-463b-b112-0df3cf90b5f7/ovn-acl-logging/0.log" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.526285 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c7wvg_c2a6bcd8-bb13-463b-b112-0df3cf90b5f7/ovn-controller/0.log" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.526829 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.533754 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d4gbc_954bb7b8-d710-4e1a-973e-78c04e685f30/kube-multus/2.log" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.534722 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d4gbc_954bb7b8-d710-4e1a-973e-78c04e685f30/kube-multus/1.log" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.534803 4922 generic.go:334] "Generic (PLEG): container finished" podID="954bb7b8-d710-4e1a-973e-78c04e685f30" containerID="48c067afb41b86f3c7a21f1573ef0c9a87523ea9d2bc5bd76723c492a588b7a7" exitCode=2 Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.534900 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d4gbc" event={"ID":"954bb7b8-d710-4e1a-973e-78c04e685f30","Type":"ContainerDied","Data":"48c067afb41b86f3c7a21f1573ef0c9a87523ea9d2bc5bd76723c492a588b7a7"} Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.534988 4922 scope.go:117] "RemoveContainer" containerID="00bf6c08a8a7b113c82bd6c0f6de3dde945667c2f025d7dc3a84aaff43b989d5" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.535780 4922 scope.go:117] "RemoveContainer" containerID="48c067afb41b86f3c7a21f1573ef0c9a87523ea9d2bc5bd76723c492a588b7a7" Nov 22 03:02:36 crc kubenswrapper[4922]: E1122 03:02:36.536142 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-d4gbc_openshift-multus(954bb7b8-d710-4e1a-973e-78c04e685f30)\"" pod="openshift-multus/multus-d4gbc" podUID="954bb7b8-d710-4e1a-973e-78c04e685f30" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.540099 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c7wvg_c2a6bcd8-bb13-463b-b112-0df3cf90b5f7/ovnkube-controller/3.log" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.545359 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c7wvg_c2a6bcd8-bb13-463b-b112-0df3cf90b5f7/ovn-acl-logging/0.log" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.546204 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c7wvg_c2a6bcd8-bb13-463b-b112-0df3cf90b5f7/ovn-controller/0.log" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.546826 4922 generic.go:334] "Generic (PLEG): container 
finished" podID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerID="d98e05ccaa5607d33059defd7c078c35da084cd8b2bcfc6c3decc87d3023fbba" exitCode=0 Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.546918 4922 generic.go:334] "Generic (PLEG): container finished" podID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerID="183f71d77b7a1768eee27187f0c1e4a893f5661fa42cf671332e081118ae7dea" exitCode=0 Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.546909 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" event={"ID":"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7","Type":"ContainerDied","Data":"d98e05ccaa5607d33059defd7c078c35da084cd8b2bcfc6c3decc87d3023fbba"} Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.546988 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.547007 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" event={"ID":"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7","Type":"ContainerDied","Data":"183f71d77b7a1768eee27187f0c1e4a893f5661fa42cf671332e081118ae7dea"} Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.547030 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" event={"ID":"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7","Type":"ContainerDied","Data":"091e9164e566847ab0605abc3b8f32027bcccff041c48fbf251d28690dcc132d"} Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.546942 4922 generic.go:334] "Generic (PLEG): container finished" podID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerID="091e9164e566847ab0605abc3b8f32027bcccff041c48fbf251d28690dcc132d" exitCode=0 Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.547099 4922 generic.go:334] "Generic (PLEG): container finished" podID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerID="5603a0776e71297b57ae6a438254d6d8abb317252617fbe3b28707e9f96424d7" exitCode=0 Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.547129 4922 generic.go:334] "Generic (PLEG): container finished" podID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerID="cd701fe5abb8c240fb633df69fbda2e7fb1ac4c76195dae8dfd5b1db811c1d5f" exitCode=0 Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.547148 4922 generic.go:334] "Generic (PLEG): container finished" podID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerID="f9e0fc46c7e9bbed62a72d78126ea7ffcb4de50dc048c8c0fd90bd0b0d8c718f" exitCode=0 Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.547184 4922 generic.go:334] "Generic (PLEG): container finished" podID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerID="7854642894dce2837e59481c400522a24a2e66c3f9eb521a370e85b203288af9" exitCode=143 Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.547209 4922 generic.go:334] "Generic (PLEG): container finished" podID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerID="18ceb46477edc2737cbdc850d6387ba175a07e85e69c988f91fc93def44fa1bf" exitCode=143 Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.547137 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" event={"ID":"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7","Type":"ContainerDied","Data":"5603a0776e71297b57ae6a438254d6d8abb317252617fbe3b28707e9f96424d7"} Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.547292 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" event={"ID":"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7","Type":"ContainerDied","Data":"cd701fe5abb8c240fb633df69fbda2e7fb1ac4c76195dae8dfd5b1db811c1d5f"} Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.547330 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" event={"ID":"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7","Type":"ContainerDied","Data":"f9e0fc46c7e9bbed62a72d78126ea7ffcb4de50dc048c8c0fd90bd0b0d8c718f"} Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.547360 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d98e05ccaa5607d33059defd7c078c35da084cd8b2bcfc6c3decc87d3023fbba"} Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.547386 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"31c2ef05aecb9ad4cf8ad1f7c2ddc7726e5b94201af8425c07707fedea3d20aa"} Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.547401 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"183f71d77b7a1768eee27187f0c1e4a893f5661fa42cf671332e081118ae7dea"} Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.547416 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"091e9164e566847ab0605abc3b8f32027bcccff041c48fbf251d28690dcc132d"} Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.547433 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5603a0776e71297b57ae6a438254d6d8abb317252617fbe3b28707e9f96424d7"} Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.547450 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cd701fe5abb8c240fb633df69fbda2e7fb1ac4c76195dae8dfd5b1db811c1d5f"} Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.547466 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f9e0fc46c7e9bbed62a72d78126ea7ffcb4de50dc048c8c0fd90bd0b0d8c718f"} Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.547481 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7854642894dce2837e59481c400522a24a2e66c3f9eb521a370e85b203288af9"} Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.547496 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"18ceb46477edc2737cbdc850d6387ba175a07e85e69c988f91fc93def44fa1bf"} Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.547510 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2"} Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.547531 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" event={"ID":"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7","Type":"ContainerDied","Data":"7854642894dce2837e59481c400522a24a2e66c3f9eb521a370e85b203288af9"} Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.547555 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"d98e05ccaa5607d33059defd7c078c35da084cd8b2bcfc6c3decc87d3023fbba"} Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.547600 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"31c2ef05aecb9ad4cf8ad1f7c2ddc7726e5b94201af8425c07707fedea3d20aa"} Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.547618 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"183f71d77b7a1768eee27187f0c1e4a893f5661fa42cf671332e081118ae7dea"} Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.547634 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"091e9164e566847ab0605abc3b8f32027bcccff041c48fbf251d28690dcc132d"} Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.547650 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5603a0776e71297b57ae6a438254d6d8abb317252617fbe3b28707e9f96424d7"} Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.547664 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cd701fe5abb8c240fb633df69fbda2e7fb1ac4c76195dae8dfd5b1db811c1d5f"} Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.547680 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f9e0fc46c7e9bbed62a72d78126ea7ffcb4de50dc048c8c0fd90bd0b0d8c718f"} Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.547694 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7854642894dce2837e59481c400522a24a2e66c3f9eb521a370e85b203288af9"} Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.547708 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"18ceb46477edc2737cbdc850d6387ba175a07e85e69c988f91fc93def44fa1bf"} Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.547723 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2"} Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.547746 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" event={"ID":"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7","Type":"ContainerDied","Data":"18ceb46477edc2737cbdc850d6387ba175a07e85e69c988f91fc93def44fa1bf"} Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.547769 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d98e05ccaa5607d33059defd7c078c35da084cd8b2bcfc6c3decc87d3023fbba"} Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.547786 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"31c2ef05aecb9ad4cf8ad1f7c2ddc7726e5b94201af8425c07707fedea3d20aa"} Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.547839 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"183f71d77b7a1768eee27187f0c1e4a893f5661fa42cf671332e081118ae7dea"} Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.547889 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"091e9164e566847ab0605abc3b8f32027bcccff041c48fbf251d28690dcc132d"} Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.547905 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5603a0776e71297b57ae6a438254d6d8abb317252617fbe3b28707e9f96424d7"} Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.547920 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cd701fe5abb8c240fb633df69fbda2e7fb1ac4c76195dae8dfd5b1db811c1d5f"} Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.547933 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f9e0fc46c7e9bbed62a72d78126ea7ffcb4de50dc048c8c0fd90bd0b0d8c718f"} Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.547947 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7854642894dce2837e59481c400522a24a2e66c3f9eb521a370e85b203288af9"} Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.547961 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"18ceb46477edc2737cbdc850d6387ba175a07e85e69c988f91fc93def44fa1bf"} Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.547976 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2"} Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.547998 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c7wvg" event={"ID":"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7","Type":"ContainerDied","Data":"a7aace6d771bcddf4bbb75a7470e9ac820327bbfe0fb2b922a7aea241f3a641c"} Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.548023 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d98e05ccaa5607d33059defd7c078c35da084cd8b2bcfc6c3decc87d3023fbba"} Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.548040 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"31c2ef05aecb9ad4cf8ad1f7c2ddc7726e5b94201af8425c07707fedea3d20aa"} Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.548054 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"183f71d77b7a1768eee27187f0c1e4a893f5661fa42cf671332e081118ae7dea"} Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.548068 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"091e9164e566847ab0605abc3b8f32027bcccff041c48fbf251d28690dcc132d"} Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.548081 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5603a0776e71297b57ae6a438254d6d8abb317252617fbe3b28707e9f96424d7"} Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.548095 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cd701fe5abb8c240fb633df69fbda2e7fb1ac4c76195dae8dfd5b1db811c1d5f"} Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.548110 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"f9e0fc46c7e9bbed62a72d78126ea7ffcb4de50dc048c8c0fd90bd0b0d8c718f"} Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.548123 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7854642894dce2837e59481c400522a24a2e66c3f9eb521a370e85b203288af9"} Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.548137 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"18ceb46477edc2737cbdc850d6387ba175a07e85e69c988f91fc93def44fa1bf"} Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.548151 4922 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2"} Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.580045 4922 scope.go:117] "RemoveContainer" containerID="d98e05ccaa5607d33059defd7c078c35da084cd8b2bcfc6c3decc87d3023fbba" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.612955 4922 scope.go:117] "RemoveContainer" containerID="31c2ef05aecb9ad4cf8ad1f7c2ddc7726e5b94201af8425c07707fedea3d20aa" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.620209 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-run-ovn\") pod \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.620339 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-ovn-node-metrics-cert\") pod \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.620393 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.620331 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" (UID: "c2a6bcd8-bb13-463b-b112-0df3cf90b5f7"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.620488 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-host-run-ovn-kubernetes\") pod \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.620522 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-node-log\") pod \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.620571 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-run-systemd\") pod \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.620596 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" (UID: "c2a6bcd8-bb13-463b-b112-0df3cf90b5f7"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.620580 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" (UID: "c2a6bcd8-bb13-463b-b112-0df3cf90b5f7"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.620585 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-node-log" (OuterVolumeSpecName: "node-log") pod "c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" (UID: "c2a6bcd8-bb13-463b-b112-0df3cf90b5f7"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.620612 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-ovnkube-script-lib\") pod \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.620866 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-log-socket\") pod \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.620918 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-var-lib-openvswitch\") pod \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.620980 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-run-openvswitch\") pod \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.621014 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-etc-openvswitch\") pod \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.621061 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgzfn\" (UniqueName: \"kubernetes.io/projected/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-kube-api-access-zgzfn\") pod \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.621096 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-host-cni-bin\") pod \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.621128 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-systemd-units\") pod \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.621174 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-env-overrides\") pod \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.621235 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-host-kubelet\") pod 
\"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.621266 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-host-slash\") pod \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.621315 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-host-cni-netd\") pod \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.621349 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-ovnkube-config\") pod \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.621379 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-host-run-netns\") pod \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\" (UID: \"c2a6bcd8-bb13-463b-b112-0df3cf90b5f7\") " Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.621513 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" (UID: "c2a6bcd8-bb13-463b-b112-0df3cf90b5f7"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.621617 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" (UID: "c2a6bcd8-bb13-463b-b112-0df3cf90b5f7"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.621674 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-log-socket" (OuterVolumeSpecName: "log-socket") pod "c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" (UID: "c2a6bcd8-bb13-463b-b112-0df3cf90b5f7"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.621716 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" (UID: "c2a6bcd8-bb13-463b-b112-0df3cf90b5f7"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.621756 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" (UID: "c2a6bcd8-bb13-463b-b112-0df3cf90b5f7"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.621790 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" (UID: "c2a6bcd8-bb13-463b-b112-0df3cf90b5f7"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.622158 4922 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.622200 4922 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.622222 4922 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-node-log\") on node \"crc\" DevicePath \"\"" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.622241 4922 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.622263 4922 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-log-socket\") on node \"crc\" DevicePath \"\"" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.622281 4922 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.622300 4922 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-run-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.622317 4922 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.622335 4922 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-host-run-netns\") on node \"crc\" DevicePath \"\"" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.622353 4922 reconciler_common.go:293] "Volume detached for volume 
\"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-run-ovn\") on node \"crc\" DevicePath \"\"" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.622945 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" (UID: "c2a6bcd8-bb13-463b-b112-0df3cf90b5f7"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.623005 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" (UID: "c2a6bcd8-bb13-463b-b112-0df3cf90b5f7"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.623037 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" (UID: "c2a6bcd8-bb13-463b-b112-0df3cf90b5f7"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.623085 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" (UID: "c2a6bcd8-bb13-463b-b112-0df3cf90b5f7"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.623164 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-host-slash" (OuterVolumeSpecName: "host-slash") pod "c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" (UID: "c2a6bcd8-bb13-463b-b112-0df3cf90b5f7"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.623631 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" (UID: "c2a6bcd8-bb13-463b-b112-0df3cf90b5f7"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.624005 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vlxn4"] Nov 22 03:02:36 crc kubenswrapper[4922]: E1122 03:02:36.624307 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerName="kube-rbac-proxy-ovn-metrics" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.624330 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerName="kube-rbac-proxy-ovn-metrics" Nov 22 03:02:36 crc kubenswrapper[4922]: E1122 03:02:36.624343 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerName="kube-rbac-proxy-node" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.624352 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerName="kube-rbac-proxy-node" Nov 22 03:02:36 crc kubenswrapper[4922]: E1122 03:02:36.624369 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerName="ovn-acl-logging" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.624378 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerName="ovn-acl-logging" Nov 22 03:02:36 crc kubenswrapper[4922]: E1122 03:02:36.624391 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerName="kubecfg-setup" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.624399 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerName="kubecfg-setup" Nov 22 03:02:36 crc kubenswrapper[4922]: E1122 03:02:36.624408 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerName="ovnkube-controller" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.624416 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerName="ovnkube-controller" Nov 22 03:02:36 crc kubenswrapper[4922]: E1122 03:02:36.624424 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerName="sbdb" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.624431 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerName="sbdb" Nov 22 03:02:36 crc kubenswrapper[4922]: E1122 03:02:36.624449 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerName="ovnkube-controller" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.624456 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerName="ovnkube-controller" Nov 22 03:02:36 crc kubenswrapper[4922]: E1122 03:02:36.624465 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerName="ovnkube-controller" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.624473 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerName="ovnkube-controller" Nov 22 03:02:36 crc kubenswrapper[4922]: E1122 03:02:36.624487 4922 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerName="northd" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.624527 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerName="northd" Nov 22 03:02:36 crc kubenswrapper[4922]: E1122 03:02:36.624539 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerName="nbdb" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.624547 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerName="nbdb" Nov 22 03:02:36 crc kubenswrapper[4922]: E1122 03:02:36.624557 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerName="ovn-controller" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.624568 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerName="ovn-controller" Nov 22 03:02:36 crc kubenswrapper[4922]: E1122 03:02:36.624579 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerName="ovnkube-controller" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.624588 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerName="ovnkube-controller" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.624731 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerName="nbdb" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.624745 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerName="ovnkube-controller" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.624753 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerName="northd" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.624761 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerName="ovnkube-controller" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.624768 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerName="kube-rbac-proxy-ovn-metrics" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.624778 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerName="ovnkube-controller" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.624787 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerName="sbdb" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.624798 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerName="ovnkube-controller" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.624806 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerName="ovn-controller" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.624814 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerName="kube-rbac-proxy-node" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.624824 4922 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerName="ovn-acl-logging" Nov 22 03:02:36 crc kubenswrapper[4922]: E1122 03:02:36.624969 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerName="ovnkube-controller" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.624983 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerName="ovnkube-controller" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.625117 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" containerName="ovnkube-controller" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.625683 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" (UID: "c2a6bcd8-bb13-463b-b112-0df3cf90b5f7"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.628658 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" (UID: "c2a6bcd8-bb13-463b-b112-0df3cf90b5f7"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.628812 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.630035 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-kube-api-access-zgzfn" (OuterVolumeSpecName: "kube-api-access-zgzfn") pod "c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" (UID: "c2a6bcd8-bb13-463b-b112-0df3cf90b5f7"). InnerVolumeSpecName "kube-api-access-zgzfn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.646285 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" (UID: "c2a6bcd8-bb13-463b-b112-0df3cf90b5f7"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.649695 4922 scope.go:117] "RemoveContainer" containerID="183f71d77b7a1768eee27187f0c1e4a893f5661fa42cf671332e081118ae7dea" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.683319 4922 scope.go:117] "RemoveContainer" containerID="091e9164e566847ab0605abc3b8f32027bcccff041c48fbf251d28690dcc132d" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.700365 4922 scope.go:117] "RemoveContainer" containerID="5603a0776e71297b57ae6a438254d6d8abb317252617fbe3b28707e9f96424d7" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.717602 4922 scope.go:117] "RemoveContainer" containerID="cd701fe5abb8c240fb633df69fbda2e7fb1ac4c76195dae8dfd5b1db811c1d5f" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.724284 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-host-run-ovn-kubernetes\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.724326 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-etc-openvswitch\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.724356 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-ovn-node-metrics-cert\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.724386 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-run-systemd\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.724428 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-log-socket\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.724447 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-host-slash\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.724467 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-run-openvswitch\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.724494 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.724522 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-run-ovn\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.724541 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-node-log\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.724558 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-host-cni-netd\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.724582 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-var-lib-openvswitch\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.724607 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-host-kubelet\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.724635 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-ovnkube-script-lib\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.724685 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-systemd-units\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.724712 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-env-overrides\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.724739 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6vlf\" (UniqueName: \"kubernetes.io/projected/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-kube-api-access-x6vlf\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.724758 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-host-cni-bin\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.724778 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-ovnkube-config\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.724888 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-host-run-netns\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.724973 4922 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.724991 4922 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-run-systemd\") on node \"crc\" DevicePath \"\"" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.725003 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgzfn\" (UniqueName: \"kubernetes.io/projected/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-kube-api-access-zgzfn\") on node \"crc\" DevicePath \"\"" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.725015 4922 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-host-cni-bin\") on node \"crc\" DevicePath \"\"" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.725025 4922 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-systemd-units\") on node \"crc\" DevicePath \"\"" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.725036 4922 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.725045 4922 reconciler_common.go:293] "Volume 
detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-host-kubelet\") on node \"crc\" DevicePath \"\"" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.725052 4922 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-host-slash\") on node \"crc\" DevicePath \"\"" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.725060 4922 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-host-cni-netd\") on node \"crc\" DevicePath \"\"" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.725068 4922 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.746232 4922 scope.go:117] "RemoveContainer" containerID="f9e0fc46c7e9bbed62a72d78126ea7ffcb4de50dc048c8c0fd90bd0b0d8c718f" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.769968 4922 scope.go:117] "RemoveContainer" containerID="7854642894dce2837e59481c400522a24a2e66c3f9eb521a370e85b203288af9" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.791332 4922 scope.go:117] "RemoveContainer" containerID="18ceb46477edc2737cbdc850d6387ba175a07e85e69c988f91fc93def44fa1bf" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.812122 4922 scope.go:117] "RemoveContainer" containerID="4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.826029 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-host-run-netns\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.826085 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-host-run-ovn-kubernetes\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.826125 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-etc-openvswitch\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.826156 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-ovn-node-metrics-cert\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.826184 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-run-systemd\") pod \"ovnkube-node-vlxn4\" (UID: 
\"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.826213 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-log-socket\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.826207 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-host-run-netns\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.826247 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-host-slash\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.826278 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-run-openvswitch\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.826315 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.826364 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-run-systemd\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.826387 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-run-ovn\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.826403 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-etc-openvswitch\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.826434 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-host-run-ovn-kubernetes\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.826425 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-node-log\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.826546 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.826553 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-run-ovn\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.826594 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-host-cni-netd\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.826615 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-host-slash\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.826614 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-run-openvswitch\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.826666 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-node-log\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.826694 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-var-lib-openvswitch\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.826736 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-var-lib-openvswitch\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.826697 4922 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-host-cni-netd\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.826760 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-log-socket\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.826839 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-host-kubelet\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.826953 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-host-kubelet\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.826989 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-ovnkube-script-lib\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.827067 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-systemd-units\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.827111 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-env-overrides\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.827176 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-systemd-units\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.827263 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6vlf\" (UniqueName: \"kubernetes.io/projected/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-kube-api-access-x6vlf\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.827305 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-host-cni-bin\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.827384 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-ovnkube-config\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.827474 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-host-cni-bin\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.828668 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-env-overrides\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.828681 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-ovnkube-script-lib\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.831597 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-ovnkube-config\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.834965 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-ovn-node-metrics-cert\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.838272 4922 scope.go:117] "RemoveContainer" containerID="d98e05ccaa5607d33059defd7c078c35da084cd8b2bcfc6c3decc87d3023fbba" Nov 22 03:02:36 crc kubenswrapper[4922]: E1122 03:02:36.839553 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d98e05ccaa5607d33059defd7c078c35da084cd8b2bcfc6c3decc87d3023fbba\": container with ID starting with d98e05ccaa5607d33059defd7c078c35da084cd8b2bcfc6c3decc87d3023fbba not found: ID does not exist" containerID="d98e05ccaa5607d33059defd7c078c35da084cd8b2bcfc6c3decc87d3023fbba" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.839602 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d98e05ccaa5607d33059defd7c078c35da084cd8b2bcfc6c3decc87d3023fbba"} err="failed to get container status \"d98e05ccaa5607d33059defd7c078c35da084cd8b2bcfc6c3decc87d3023fbba\": rpc error: code = NotFound desc = could not find container 
\"d98e05ccaa5607d33059defd7c078c35da084cd8b2bcfc6c3decc87d3023fbba\": container with ID starting with d98e05ccaa5607d33059defd7c078c35da084cd8b2bcfc6c3decc87d3023fbba not found: ID does not exist" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.839635 4922 scope.go:117] "RemoveContainer" containerID="31c2ef05aecb9ad4cf8ad1f7c2ddc7726e5b94201af8425c07707fedea3d20aa" Nov 22 03:02:36 crc kubenswrapper[4922]: E1122 03:02:36.841171 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31c2ef05aecb9ad4cf8ad1f7c2ddc7726e5b94201af8425c07707fedea3d20aa\": container with ID starting with 31c2ef05aecb9ad4cf8ad1f7c2ddc7726e5b94201af8425c07707fedea3d20aa not found: ID does not exist" containerID="31c2ef05aecb9ad4cf8ad1f7c2ddc7726e5b94201af8425c07707fedea3d20aa" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.841223 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31c2ef05aecb9ad4cf8ad1f7c2ddc7726e5b94201af8425c07707fedea3d20aa"} err="failed to get container status \"31c2ef05aecb9ad4cf8ad1f7c2ddc7726e5b94201af8425c07707fedea3d20aa\": rpc error: code = NotFound desc = could not find container \"31c2ef05aecb9ad4cf8ad1f7c2ddc7726e5b94201af8425c07707fedea3d20aa\": container with ID starting with 31c2ef05aecb9ad4cf8ad1f7c2ddc7726e5b94201af8425c07707fedea3d20aa not found: ID does not exist" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.841268 4922 scope.go:117] "RemoveContainer" containerID="183f71d77b7a1768eee27187f0c1e4a893f5661fa42cf671332e081118ae7dea" Nov 22 03:02:36 crc kubenswrapper[4922]: E1122 03:02:36.842235 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"183f71d77b7a1768eee27187f0c1e4a893f5661fa42cf671332e081118ae7dea\": container with ID starting with 183f71d77b7a1768eee27187f0c1e4a893f5661fa42cf671332e081118ae7dea not found: ID does not exist" containerID="183f71d77b7a1768eee27187f0c1e4a893f5661fa42cf671332e081118ae7dea" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.842273 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"183f71d77b7a1768eee27187f0c1e4a893f5661fa42cf671332e081118ae7dea"} err="failed to get container status \"183f71d77b7a1768eee27187f0c1e4a893f5661fa42cf671332e081118ae7dea\": rpc error: code = NotFound desc = could not find container \"183f71d77b7a1768eee27187f0c1e4a893f5661fa42cf671332e081118ae7dea\": container with ID starting with 183f71d77b7a1768eee27187f0c1e4a893f5661fa42cf671332e081118ae7dea not found: ID does not exist" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.842296 4922 scope.go:117] "RemoveContainer" containerID="091e9164e566847ab0605abc3b8f32027bcccff041c48fbf251d28690dcc132d" Nov 22 03:02:36 crc kubenswrapper[4922]: E1122 03:02:36.843641 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"091e9164e566847ab0605abc3b8f32027bcccff041c48fbf251d28690dcc132d\": container with ID starting with 091e9164e566847ab0605abc3b8f32027bcccff041c48fbf251d28690dcc132d not found: ID does not exist" containerID="091e9164e566847ab0605abc3b8f32027bcccff041c48fbf251d28690dcc132d" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.843770 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"091e9164e566847ab0605abc3b8f32027bcccff041c48fbf251d28690dcc132d"} 
err="failed to get container status \"091e9164e566847ab0605abc3b8f32027bcccff041c48fbf251d28690dcc132d\": rpc error: code = NotFound desc = could not find container \"091e9164e566847ab0605abc3b8f32027bcccff041c48fbf251d28690dcc132d\": container with ID starting with 091e9164e566847ab0605abc3b8f32027bcccff041c48fbf251d28690dcc132d not found: ID does not exist" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.844091 4922 scope.go:117] "RemoveContainer" containerID="5603a0776e71297b57ae6a438254d6d8abb317252617fbe3b28707e9f96424d7" Nov 22 03:02:36 crc kubenswrapper[4922]: E1122 03:02:36.845403 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5603a0776e71297b57ae6a438254d6d8abb317252617fbe3b28707e9f96424d7\": container with ID starting with 5603a0776e71297b57ae6a438254d6d8abb317252617fbe3b28707e9f96424d7 not found: ID does not exist" containerID="5603a0776e71297b57ae6a438254d6d8abb317252617fbe3b28707e9f96424d7" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.845436 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5603a0776e71297b57ae6a438254d6d8abb317252617fbe3b28707e9f96424d7"} err="failed to get container status \"5603a0776e71297b57ae6a438254d6d8abb317252617fbe3b28707e9f96424d7\": rpc error: code = NotFound desc = could not find container \"5603a0776e71297b57ae6a438254d6d8abb317252617fbe3b28707e9f96424d7\": container with ID starting with 5603a0776e71297b57ae6a438254d6d8abb317252617fbe3b28707e9f96424d7 not found: ID does not exist" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.845453 4922 scope.go:117] "RemoveContainer" containerID="cd701fe5abb8c240fb633df69fbda2e7fb1ac4c76195dae8dfd5b1db811c1d5f" Nov 22 03:02:36 crc kubenswrapper[4922]: E1122 03:02:36.845958 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd701fe5abb8c240fb633df69fbda2e7fb1ac4c76195dae8dfd5b1db811c1d5f\": container with ID starting with cd701fe5abb8c240fb633df69fbda2e7fb1ac4c76195dae8dfd5b1db811c1d5f not found: ID does not exist" containerID="cd701fe5abb8c240fb633df69fbda2e7fb1ac4c76195dae8dfd5b1db811c1d5f" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.845999 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd701fe5abb8c240fb633df69fbda2e7fb1ac4c76195dae8dfd5b1db811c1d5f"} err="failed to get container status \"cd701fe5abb8c240fb633df69fbda2e7fb1ac4c76195dae8dfd5b1db811c1d5f\": rpc error: code = NotFound desc = could not find container \"cd701fe5abb8c240fb633df69fbda2e7fb1ac4c76195dae8dfd5b1db811c1d5f\": container with ID starting with cd701fe5abb8c240fb633df69fbda2e7fb1ac4c76195dae8dfd5b1db811c1d5f not found: ID does not exist" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.846022 4922 scope.go:117] "RemoveContainer" containerID="f9e0fc46c7e9bbed62a72d78126ea7ffcb4de50dc048c8c0fd90bd0b0d8c718f" Nov 22 03:02:36 crc kubenswrapper[4922]: E1122 03:02:36.846402 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9e0fc46c7e9bbed62a72d78126ea7ffcb4de50dc048c8c0fd90bd0b0d8c718f\": container with ID starting with f9e0fc46c7e9bbed62a72d78126ea7ffcb4de50dc048c8c0fd90bd0b0d8c718f not found: ID does not exist" containerID="f9e0fc46c7e9bbed62a72d78126ea7ffcb4de50dc048c8c0fd90bd0b0d8c718f" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.846458 4922 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9e0fc46c7e9bbed62a72d78126ea7ffcb4de50dc048c8c0fd90bd0b0d8c718f"} err="failed to get container status \"f9e0fc46c7e9bbed62a72d78126ea7ffcb4de50dc048c8c0fd90bd0b0d8c718f\": rpc error: code = NotFound desc = could not find container \"f9e0fc46c7e9bbed62a72d78126ea7ffcb4de50dc048c8c0fd90bd0b0d8c718f\": container with ID starting with f9e0fc46c7e9bbed62a72d78126ea7ffcb4de50dc048c8c0fd90bd0b0d8c718f not found: ID does not exist" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.846505 4922 scope.go:117] "RemoveContainer" containerID="7854642894dce2837e59481c400522a24a2e66c3f9eb521a370e85b203288af9" Nov 22 03:02:36 crc kubenswrapper[4922]: E1122 03:02:36.847105 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7854642894dce2837e59481c400522a24a2e66c3f9eb521a370e85b203288af9\": container with ID starting with 7854642894dce2837e59481c400522a24a2e66c3f9eb521a370e85b203288af9 not found: ID does not exist" containerID="7854642894dce2837e59481c400522a24a2e66c3f9eb521a370e85b203288af9" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.847163 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7854642894dce2837e59481c400522a24a2e66c3f9eb521a370e85b203288af9"} err="failed to get container status \"7854642894dce2837e59481c400522a24a2e66c3f9eb521a370e85b203288af9\": rpc error: code = NotFound desc = could not find container \"7854642894dce2837e59481c400522a24a2e66c3f9eb521a370e85b203288af9\": container with ID starting with 7854642894dce2837e59481c400522a24a2e66c3f9eb521a370e85b203288af9 not found: ID does not exist" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.847192 4922 scope.go:117] "RemoveContainer" containerID="18ceb46477edc2737cbdc850d6387ba175a07e85e69c988f91fc93def44fa1bf" Nov 22 03:02:36 crc kubenswrapper[4922]: E1122 03:02:36.847720 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18ceb46477edc2737cbdc850d6387ba175a07e85e69c988f91fc93def44fa1bf\": container with ID starting with 18ceb46477edc2737cbdc850d6387ba175a07e85e69c988f91fc93def44fa1bf not found: ID does not exist" containerID="18ceb46477edc2737cbdc850d6387ba175a07e85e69c988f91fc93def44fa1bf" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.847750 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18ceb46477edc2737cbdc850d6387ba175a07e85e69c988f91fc93def44fa1bf"} err="failed to get container status \"18ceb46477edc2737cbdc850d6387ba175a07e85e69c988f91fc93def44fa1bf\": rpc error: code = NotFound desc = could not find container \"18ceb46477edc2737cbdc850d6387ba175a07e85e69c988f91fc93def44fa1bf\": container with ID starting with 18ceb46477edc2737cbdc850d6387ba175a07e85e69c988f91fc93def44fa1bf not found: ID does not exist" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.847775 4922 scope.go:117] "RemoveContainer" containerID="4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2" Nov 22 03:02:36 crc kubenswrapper[4922]: E1122 03:02:36.848217 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\": container with ID starting with 4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2 not found: ID does 
not exist" containerID="4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.848257 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2"} err="failed to get container status \"4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\": rpc error: code = NotFound desc = could not find container \"4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\": container with ID starting with 4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2 not found: ID does not exist" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.848277 4922 scope.go:117] "RemoveContainer" containerID="d98e05ccaa5607d33059defd7c078c35da084cd8b2bcfc6c3decc87d3023fbba" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.848805 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d98e05ccaa5607d33059defd7c078c35da084cd8b2bcfc6c3decc87d3023fbba"} err="failed to get container status \"d98e05ccaa5607d33059defd7c078c35da084cd8b2bcfc6c3decc87d3023fbba\": rpc error: code = NotFound desc = could not find container \"d98e05ccaa5607d33059defd7c078c35da084cd8b2bcfc6c3decc87d3023fbba\": container with ID starting with d98e05ccaa5607d33059defd7c078c35da084cd8b2bcfc6c3decc87d3023fbba not found: ID does not exist" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.848884 4922 scope.go:117] "RemoveContainer" containerID="31c2ef05aecb9ad4cf8ad1f7c2ddc7726e5b94201af8425c07707fedea3d20aa" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.849310 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31c2ef05aecb9ad4cf8ad1f7c2ddc7726e5b94201af8425c07707fedea3d20aa"} err="failed to get container status \"31c2ef05aecb9ad4cf8ad1f7c2ddc7726e5b94201af8425c07707fedea3d20aa\": rpc error: code = NotFound desc = could not find container \"31c2ef05aecb9ad4cf8ad1f7c2ddc7726e5b94201af8425c07707fedea3d20aa\": container with ID starting with 31c2ef05aecb9ad4cf8ad1f7c2ddc7726e5b94201af8425c07707fedea3d20aa not found: ID does not exist" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.849347 4922 scope.go:117] "RemoveContainer" containerID="183f71d77b7a1768eee27187f0c1e4a893f5661fa42cf671332e081118ae7dea" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.850266 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"183f71d77b7a1768eee27187f0c1e4a893f5661fa42cf671332e081118ae7dea"} err="failed to get container status \"183f71d77b7a1768eee27187f0c1e4a893f5661fa42cf671332e081118ae7dea\": rpc error: code = NotFound desc = could not find container \"183f71d77b7a1768eee27187f0c1e4a893f5661fa42cf671332e081118ae7dea\": container with ID starting with 183f71d77b7a1768eee27187f0c1e4a893f5661fa42cf671332e081118ae7dea not found: ID does not exist" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.850296 4922 scope.go:117] "RemoveContainer" containerID="091e9164e566847ab0605abc3b8f32027bcccff041c48fbf251d28690dcc132d" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.850742 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"091e9164e566847ab0605abc3b8f32027bcccff041c48fbf251d28690dcc132d"} err="failed to get container status \"091e9164e566847ab0605abc3b8f32027bcccff041c48fbf251d28690dcc132d\": rpc error: code = NotFound desc = could 
not find container \"091e9164e566847ab0605abc3b8f32027bcccff041c48fbf251d28690dcc132d\": container with ID starting with 091e9164e566847ab0605abc3b8f32027bcccff041c48fbf251d28690dcc132d not found: ID does not exist" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.850871 4922 scope.go:117] "RemoveContainer" containerID="5603a0776e71297b57ae6a438254d6d8abb317252617fbe3b28707e9f96424d7" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.851660 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5603a0776e71297b57ae6a438254d6d8abb317252617fbe3b28707e9f96424d7"} err="failed to get container status \"5603a0776e71297b57ae6a438254d6d8abb317252617fbe3b28707e9f96424d7\": rpc error: code = NotFound desc = could not find container \"5603a0776e71297b57ae6a438254d6d8abb317252617fbe3b28707e9f96424d7\": container with ID starting with 5603a0776e71297b57ae6a438254d6d8abb317252617fbe3b28707e9f96424d7 not found: ID does not exist" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.851696 4922 scope.go:117] "RemoveContainer" containerID="cd701fe5abb8c240fb633df69fbda2e7fb1ac4c76195dae8dfd5b1db811c1d5f" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.852100 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd701fe5abb8c240fb633df69fbda2e7fb1ac4c76195dae8dfd5b1db811c1d5f"} err="failed to get container status \"cd701fe5abb8c240fb633df69fbda2e7fb1ac4c76195dae8dfd5b1db811c1d5f\": rpc error: code = NotFound desc = could not find container \"cd701fe5abb8c240fb633df69fbda2e7fb1ac4c76195dae8dfd5b1db811c1d5f\": container with ID starting with cd701fe5abb8c240fb633df69fbda2e7fb1ac4c76195dae8dfd5b1db811c1d5f not found: ID does not exist" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.852139 4922 scope.go:117] "RemoveContainer" containerID="f9e0fc46c7e9bbed62a72d78126ea7ffcb4de50dc048c8c0fd90bd0b0d8c718f" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.852537 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9e0fc46c7e9bbed62a72d78126ea7ffcb4de50dc048c8c0fd90bd0b0d8c718f"} err="failed to get container status \"f9e0fc46c7e9bbed62a72d78126ea7ffcb4de50dc048c8c0fd90bd0b0d8c718f\": rpc error: code = NotFound desc = could not find container \"f9e0fc46c7e9bbed62a72d78126ea7ffcb4de50dc048c8c0fd90bd0b0d8c718f\": container with ID starting with f9e0fc46c7e9bbed62a72d78126ea7ffcb4de50dc048c8c0fd90bd0b0d8c718f not found: ID does not exist" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.852611 4922 scope.go:117] "RemoveContainer" containerID="7854642894dce2837e59481c400522a24a2e66c3f9eb521a370e85b203288af9" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.853199 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7854642894dce2837e59481c400522a24a2e66c3f9eb521a370e85b203288af9"} err="failed to get container status \"7854642894dce2837e59481c400522a24a2e66c3f9eb521a370e85b203288af9\": rpc error: code = NotFound desc = could not find container \"7854642894dce2837e59481c400522a24a2e66c3f9eb521a370e85b203288af9\": container with ID starting with 7854642894dce2837e59481c400522a24a2e66c3f9eb521a370e85b203288af9 not found: ID does not exist" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.853233 4922 scope.go:117] "RemoveContainer" containerID="18ceb46477edc2737cbdc850d6387ba175a07e85e69c988f91fc93def44fa1bf" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.853745 4922 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18ceb46477edc2737cbdc850d6387ba175a07e85e69c988f91fc93def44fa1bf"} err="failed to get container status \"18ceb46477edc2737cbdc850d6387ba175a07e85e69c988f91fc93def44fa1bf\": rpc error: code = NotFound desc = could not find container \"18ceb46477edc2737cbdc850d6387ba175a07e85e69c988f91fc93def44fa1bf\": container with ID starting with 18ceb46477edc2737cbdc850d6387ba175a07e85e69c988f91fc93def44fa1bf not found: ID does not exist" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.853811 4922 scope.go:117] "RemoveContainer" containerID="4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.854180 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2"} err="failed to get container status \"4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\": rpc error: code = NotFound desc = could not find container \"4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\": container with ID starting with 4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2 not found: ID does not exist" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.854215 4922 scope.go:117] "RemoveContainer" containerID="d98e05ccaa5607d33059defd7c078c35da084cd8b2bcfc6c3decc87d3023fbba" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.854478 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d98e05ccaa5607d33059defd7c078c35da084cd8b2bcfc6c3decc87d3023fbba"} err="failed to get container status \"d98e05ccaa5607d33059defd7c078c35da084cd8b2bcfc6c3decc87d3023fbba\": rpc error: code = NotFound desc = could not find container \"d98e05ccaa5607d33059defd7c078c35da084cd8b2bcfc6c3decc87d3023fbba\": container with ID starting with d98e05ccaa5607d33059defd7c078c35da084cd8b2bcfc6c3decc87d3023fbba not found: ID does not exist" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.854511 4922 scope.go:117] "RemoveContainer" containerID="31c2ef05aecb9ad4cf8ad1f7c2ddc7726e5b94201af8425c07707fedea3d20aa" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.854809 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31c2ef05aecb9ad4cf8ad1f7c2ddc7726e5b94201af8425c07707fedea3d20aa"} err="failed to get container status \"31c2ef05aecb9ad4cf8ad1f7c2ddc7726e5b94201af8425c07707fedea3d20aa\": rpc error: code = NotFound desc = could not find container \"31c2ef05aecb9ad4cf8ad1f7c2ddc7726e5b94201af8425c07707fedea3d20aa\": container with ID starting with 31c2ef05aecb9ad4cf8ad1f7c2ddc7726e5b94201af8425c07707fedea3d20aa not found: ID does not exist" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.854859 4922 scope.go:117] "RemoveContainer" containerID="183f71d77b7a1768eee27187f0c1e4a893f5661fa42cf671332e081118ae7dea" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.855095 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"183f71d77b7a1768eee27187f0c1e4a893f5661fa42cf671332e081118ae7dea"} err="failed to get container status \"183f71d77b7a1768eee27187f0c1e4a893f5661fa42cf671332e081118ae7dea\": rpc error: code = NotFound desc = could not find container \"183f71d77b7a1768eee27187f0c1e4a893f5661fa42cf671332e081118ae7dea\": container with ID starting with 
183f71d77b7a1768eee27187f0c1e4a893f5661fa42cf671332e081118ae7dea not found: ID does not exist" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.855121 4922 scope.go:117] "RemoveContainer" containerID="091e9164e566847ab0605abc3b8f32027bcccff041c48fbf251d28690dcc132d" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.855395 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"091e9164e566847ab0605abc3b8f32027bcccff041c48fbf251d28690dcc132d"} err="failed to get container status \"091e9164e566847ab0605abc3b8f32027bcccff041c48fbf251d28690dcc132d\": rpc error: code = NotFound desc = could not find container \"091e9164e566847ab0605abc3b8f32027bcccff041c48fbf251d28690dcc132d\": container with ID starting with 091e9164e566847ab0605abc3b8f32027bcccff041c48fbf251d28690dcc132d not found: ID does not exist" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.855420 4922 scope.go:117] "RemoveContainer" containerID="5603a0776e71297b57ae6a438254d6d8abb317252617fbe3b28707e9f96424d7" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.855698 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5603a0776e71297b57ae6a438254d6d8abb317252617fbe3b28707e9f96424d7"} err="failed to get container status \"5603a0776e71297b57ae6a438254d6d8abb317252617fbe3b28707e9f96424d7\": rpc error: code = NotFound desc = could not find container \"5603a0776e71297b57ae6a438254d6d8abb317252617fbe3b28707e9f96424d7\": container with ID starting with 5603a0776e71297b57ae6a438254d6d8abb317252617fbe3b28707e9f96424d7 not found: ID does not exist" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.855730 4922 scope.go:117] "RemoveContainer" containerID="cd701fe5abb8c240fb633df69fbda2e7fb1ac4c76195dae8dfd5b1db811c1d5f" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.856083 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd701fe5abb8c240fb633df69fbda2e7fb1ac4c76195dae8dfd5b1db811c1d5f"} err="failed to get container status \"cd701fe5abb8c240fb633df69fbda2e7fb1ac4c76195dae8dfd5b1db811c1d5f\": rpc error: code = NotFound desc = could not find container \"cd701fe5abb8c240fb633df69fbda2e7fb1ac4c76195dae8dfd5b1db811c1d5f\": container with ID starting with cd701fe5abb8c240fb633df69fbda2e7fb1ac4c76195dae8dfd5b1db811c1d5f not found: ID does not exist" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.856126 4922 scope.go:117] "RemoveContainer" containerID="f9e0fc46c7e9bbed62a72d78126ea7ffcb4de50dc048c8c0fd90bd0b0d8c718f" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.856437 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9e0fc46c7e9bbed62a72d78126ea7ffcb4de50dc048c8c0fd90bd0b0d8c718f"} err="failed to get container status \"f9e0fc46c7e9bbed62a72d78126ea7ffcb4de50dc048c8c0fd90bd0b0d8c718f\": rpc error: code = NotFound desc = could not find container \"f9e0fc46c7e9bbed62a72d78126ea7ffcb4de50dc048c8c0fd90bd0b0d8c718f\": container with ID starting with f9e0fc46c7e9bbed62a72d78126ea7ffcb4de50dc048c8c0fd90bd0b0d8c718f not found: ID does not exist" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.856471 4922 scope.go:117] "RemoveContainer" containerID="7854642894dce2837e59481c400522a24a2e66c3f9eb521a370e85b203288af9" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.856502 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6vlf\" (UniqueName: 
\"kubernetes.io/projected/5ae1edd6-6517-4fa6-8d5c-3aa64f62f387-kube-api-access-x6vlf\") pod \"ovnkube-node-vlxn4\" (UID: \"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.856788 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7854642894dce2837e59481c400522a24a2e66c3f9eb521a370e85b203288af9"} err="failed to get container status \"7854642894dce2837e59481c400522a24a2e66c3f9eb521a370e85b203288af9\": rpc error: code = NotFound desc = could not find container \"7854642894dce2837e59481c400522a24a2e66c3f9eb521a370e85b203288af9\": container with ID starting with 7854642894dce2837e59481c400522a24a2e66c3f9eb521a370e85b203288af9 not found: ID does not exist" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.856850 4922 scope.go:117] "RemoveContainer" containerID="18ceb46477edc2737cbdc850d6387ba175a07e85e69c988f91fc93def44fa1bf" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.857404 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18ceb46477edc2737cbdc850d6387ba175a07e85e69c988f91fc93def44fa1bf"} err="failed to get container status \"18ceb46477edc2737cbdc850d6387ba175a07e85e69c988f91fc93def44fa1bf\": rpc error: code = NotFound desc = could not find container \"18ceb46477edc2737cbdc850d6387ba175a07e85e69c988f91fc93def44fa1bf\": container with ID starting with 18ceb46477edc2737cbdc850d6387ba175a07e85e69c988f91fc93def44fa1bf not found: ID does not exist" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.857431 4922 scope.go:117] "RemoveContainer" containerID="4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.858883 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2"} err="failed to get container status \"4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\": rpc error: code = NotFound desc = could not find container \"4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\": container with ID starting with 4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2 not found: ID does not exist" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.858916 4922 scope.go:117] "RemoveContainer" containerID="d98e05ccaa5607d33059defd7c078c35da084cd8b2bcfc6c3decc87d3023fbba" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.859359 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d98e05ccaa5607d33059defd7c078c35da084cd8b2bcfc6c3decc87d3023fbba"} err="failed to get container status \"d98e05ccaa5607d33059defd7c078c35da084cd8b2bcfc6c3decc87d3023fbba\": rpc error: code = NotFound desc = could not find container \"d98e05ccaa5607d33059defd7c078c35da084cd8b2bcfc6c3decc87d3023fbba\": container with ID starting with d98e05ccaa5607d33059defd7c078c35da084cd8b2bcfc6c3decc87d3023fbba not found: ID does not exist" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.859390 4922 scope.go:117] "RemoveContainer" containerID="31c2ef05aecb9ad4cf8ad1f7c2ddc7726e5b94201af8425c07707fedea3d20aa" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.859780 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31c2ef05aecb9ad4cf8ad1f7c2ddc7726e5b94201af8425c07707fedea3d20aa"} err="failed to get 
container status \"31c2ef05aecb9ad4cf8ad1f7c2ddc7726e5b94201af8425c07707fedea3d20aa\": rpc error: code = NotFound desc = could not find container \"31c2ef05aecb9ad4cf8ad1f7c2ddc7726e5b94201af8425c07707fedea3d20aa\": container with ID starting with 31c2ef05aecb9ad4cf8ad1f7c2ddc7726e5b94201af8425c07707fedea3d20aa not found: ID does not exist" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.859814 4922 scope.go:117] "RemoveContainer" containerID="183f71d77b7a1768eee27187f0c1e4a893f5661fa42cf671332e081118ae7dea" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.860151 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"183f71d77b7a1768eee27187f0c1e4a893f5661fa42cf671332e081118ae7dea"} err="failed to get container status \"183f71d77b7a1768eee27187f0c1e4a893f5661fa42cf671332e081118ae7dea\": rpc error: code = NotFound desc = could not find container \"183f71d77b7a1768eee27187f0c1e4a893f5661fa42cf671332e081118ae7dea\": container with ID starting with 183f71d77b7a1768eee27187f0c1e4a893f5661fa42cf671332e081118ae7dea not found: ID does not exist" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.860218 4922 scope.go:117] "RemoveContainer" containerID="091e9164e566847ab0605abc3b8f32027bcccff041c48fbf251d28690dcc132d" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.860714 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"091e9164e566847ab0605abc3b8f32027bcccff041c48fbf251d28690dcc132d"} err="failed to get container status \"091e9164e566847ab0605abc3b8f32027bcccff041c48fbf251d28690dcc132d\": rpc error: code = NotFound desc = could not find container \"091e9164e566847ab0605abc3b8f32027bcccff041c48fbf251d28690dcc132d\": container with ID starting with 091e9164e566847ab0605abc3b8f32027bcccff041c48fbf251d28690dcc132d not found: ID does not exist" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.860746 4922 scope.go:117] "RemoveContainer" containerID="5603a0776e71297b57ae6a438254d6d8abb317252617fbe3b28707e9f96424d7" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.861113 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5603a0776e71297b57ae6a438254d6d8abb317252617fbe3b28707e9f96424d7"} err="failed to get container status \"5603a0776e71297b57ae6a438254d6d8abb317252617fbe3b28707e9f96424d7\": rpc error: code = NotFound desc = could not find container \"5603a0776e71297b57ae6a438254d6d8abb317252617fbe3b28707e9f96424d7\": container with ID starting with 5603a0776e71297b57ae6a438254d6d8abb317252617fbe3b28707e9f96424d7 not found: ID does not exist" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.861150 4922 scope.go:117] "RemoveContainer" containerID="cd701fe5abb8c240fb633df69fbda2e7fb1ac4c76195dae8dfd5b1db811c1d5f" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.861475 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd701fe5abb8c240fb633df69fbda2e7fb1ac4c76195dae8dfd5b1db811c1d5f"} err="failed to get container status \"cd701fe5abb8c240fb633df69fbda2e7fb1ac4c76195dae8dfd5b1db811c1d5f\": rpc error: code = NotFound desc = could not find container \"cd701fe5abb8c240fb633df69fbda2e7fb1ac4c76195dae8dfd5b1db811c1d5f\": container with ID starting with cd701fe5abb8c240fb633df69fbda2e7fb1ac4c76195dae8dfd5b1db811c1d5f not found: ID does not exist" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.861500 4922 scope.go:117] "RemoveContainer" 
containerID="f9e0fc46c7e9bbed62a72d78126ea7ffcb4de50dc048c8c0fd90bd0b0d8c718f" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.861973 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9e0fc46c7e9bbed62a72d78126ea7ffcb4de50dc048c8c0fd90bd0b0d8c718f"} err="failed to get container status \"f9e0fc46c7e9bbed62a72d78126ea7ffcb4de50dc048c8c0fd90bd0b0d8c718f\": rpc error: code = NotFound desc = could not find container \"f9e0fc46c7e9bbed62a72d78126ea7ffcb4de50dc048c8c0fd90bd0b0d8c718f\": container with ID starting with f9e0fc46c7e9bbed62a72d78126ea7ffcb4de50dc048c8c0fd90bd0b0d8c718f not found: ID does not exist" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.862015 4922 scope.go:117] "RemoveContainer" containerID="7854642894dce2837e59481c400522a24a2e66c3f9eb521a370e85b203288af9" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.863049 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7854642894dce2837e59481c400522a24a2e66c3f9eb521a370e85b203288af9"} err="failed to get container status \"7854642894dce2837e59481c400522a24a2e66c3f9eb521a370e85b203288af9\": rpc error: code = NotFound desc = could not find container \"7854642894dce2837e59481c400522a24a2e66c3f9eb521a370e85b203288af9\": container with ID starting with 7854642894dce2837e59481c400522a24a2e66c3f9eb521a370e85b203288af9 not found: ID does not exist" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.863079 4922 scope.go:117] "RemoveContainer" containerID="18ceb46477edc2737cbdc850d6387ba175a07e85e69c988f91fc93def44fa1bf" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.863436 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18ceb46477edc2737cbdc850d6387ba175a07e85e69c988f91fc93def44fa1bf"} err="failed to get container status \"18ceb46477edc2737cbdc850d6387ba175a07e85e69c988f91fc93def44fa1bf\": rpc error: code = NotFound desc = could not find container \"18ceb46477edc2737cbdc850d6387ba175a07e85e69c988f91fc93def44fa1bf\": container with ID starting with 18ceb46477edc2737cbdc850d6387ba175a07e85e69c988f91fc93def44fa1bf not found: ID does not exist" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.863473 4922 scope.go:117] "RemoveContainer" containerID="4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.863866 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2"} err="failed to get container status \"4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\": rpc error: code = NotFound desc = could not find container \"4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2\": container with ID starting with 4ae9afe0ea38ca15dcacd27f8a0eeb5459e7aad26e96aad8c046f60427fa5fe2 not found: ID does not exist" Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.886543 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-c7wvg"] Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.896021 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-c7wvg"] Nov 22 03:02:36 crc kubenswrapper[4922]: I1122 03:02:36.944516 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:37 crc kubenswrapper[4922]: I1122 03:02:37.332053 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2a6bcd8-bb13-463b-b112-0df3cf90b5f7" path="/var/lib/kubelet/pods/c2a6bcd8-bb13-463b-b112-0df3cf90b5f7/volumes" Nov 22 03:02:37 crc kubenswrapper[4922]: I1122 03:02:37.560982 4922 generic.go:334] "Generic (PLEG): container finished" podID="5ae1edd6-6517-4fa6-8d5c-3aa64f62f387" containerID="89fb0420dfc32f07b95b80ff683562ed557e19ed09ecf6a458c81eace51393f7" exitCode=0 Nov 22 03:02:37 crc kubenswrapper[4922]: I1122 03:02:37.561143 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" event={"ID":"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387","Type":"ContainerDied","Data":"89fb0420dfc32f07b95b80ff683562ed557e19ed09ecf6a458c81eace51393f7"} Nov 22 03:02:37 crc kubenswrapper[4922]: I1122 03:02:37.561194 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" event={"ID":"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387","Type":"ContainerStarted","Data":"d4cd1ffc79246f23afd183d68e032908057a14aca9afc542c326c9893dabc74c"} Nov 22 03:02:37 crc kubenswrapper[4922]: I1122 03:02:37.564224 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d4gbc_954bb7b8-d710-4e1a-973e-78c04e685f30/kube-multus/2.log" Nov 22 03:02:38 crc kubenswrapper[4922]: I1122 03:02:38.576796 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" event={"ID":"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387","Type":"ContainerStarted","Data":"b50fde560af5c477cd8363c0cd79ccf4fb69108beb1e54d4e1fb2766f9e42f7f"} Nov 22 03:02:38 crc kubenswrapper[4922]: I1122 03:02:38.577424 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" event={"ID":"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387","Type":"ContainerStarted","Data":"3782d83e0aff6ab8ee31a76f5e6770bc9bd933aa709441a64c7366e6529c2089"} Nov 22 03:02:38 crc kubenswrapper[4922]: I1122 03:02:38.577441 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" event={"ID":"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387","Type":"ContainerStarted","Data":"1c8399b4f5623b794df95a2500ba3acd28b41508bdd6409a4e119375388b5a8a"} Nov 22 03:02:38 crc kubenswrapper[4922]: I1122 03:02:38.577453 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" event={"ID":"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387","Type":"ContainerStarted","Data":"9b1658119cf3c148f51f52acc4425b2906e4a9a6fea3f29a4f033ed686e1bc0a"} Nov 22 03:02:38 crc kubenswrapper[4922]: I1122 03:02:38.577466 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" event={"ID":"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387","Type":"ContainerStarted","Data":"9e95ec67babe447ce949a3e1366fc6ad898917a08214f8551e8e73b9da9f026d"} Nov 22 03:02:38 crc kubenswrapper[4922]: I1122 03:02:38.577481 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" event={"ID":"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387","Type":"ContainerStarted","Data":"a83ddb9121de105a300d76c93868315e0b6aaf4fbe11e76cbfaa9ce49fc64a80"} Nov 22 03:02:41 crc kubenswrapper[4922]: I1122 03:02:41.611275 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" 
event={"ID":"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387","Type":"ContainerStarted","Data":"a5b111d0f5fc2efda8f54c94c55e91c7fde9e36e860001c514a480c7de7f1733"} Nov 22 03:02:43 crc kubenswrapper[4922]: I1122 03:02:43.635982 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" event={"ID":"5ae1edd6-6517-4fa6-8d5c-3aa64f62f387","Type":"ContainerStarted","Data":"6cc0d4dd253039f905655d4491785d353d263967c510ee9cc448884078c8e94c"} Nov 22 03:02:43 crc kubenswrapper[4922]: I1122 03:02:43.636828 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:43 crc kubenswrapper[4922]: I1122 03:02:43.681776 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" podStartSLOduration=7.681755549 podStartE2EDuration="7.681755549s" podCreationTimestamp="2025-11-22 03:02:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:02:43.679052034 +0000 UTC m=+599.717573936" watchObservedRunningTime="2025-11-22 03:02:43.681755549 +0000 UTC m=+599.720277441" Nov 22 03:02:43 crc kubenswrapper[4922]: I1122 03:02:43.689480 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:44 crc kubenswrapper[4922]: I1122 03:02:44.644345 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:44 crc kubenswrapper[4922]: I1122 03:02:44.644423 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:44 crc kubenswrapper[4922]: I1122 03:02:44.691641 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:02:49 crc kubenswrapper[4922]: I1122 03:02:49.300915 4922 scope.go:117] "RemoveContainer" containerID="48c067afb41b86f3c7a21f1573ef0c9a87523ea9d2bc5bd76723c492a588b7a7" Nov 22 03:02:49 crc kubenswrapper[4922]: E1122 03:02:49.302460 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-d4gbc_openshift-multus(954bb7b8-d710-4e1a-973e-78c04e685f30)\"" pod="openshift-multus/multus-d4gbc" podUID="954bb7b8-d710-4e1a-973e-78c04e685f30" Nov 22 03:03:01 crc kubenswrapper[4922]: I1122 03:03:01.301834 4922 scope.go:117] "RemoveContainer" containerID="48c067afb41b86f3c7a21f1573ef0c9a87523ea9d2bc5bd76723c492a588b7a7" Nov 22 03:03:01 crc kubenswrapper[4922]: I1122 03:03:01.778754 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d4gbc_954bb7b8-d710-4e1a-973e-78c04e685f30/kube-multus/2.log" Nov 22 03:03:01 crc kubenswrapper[4922]: I1122 03:03:01.779303 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d4gbc" event={"ID":"954bb7b8-d710-4e1a-973e-78c04e685f30","Type":"ContainerStarted","Data":"2ff9bfd938205f3d46910468770bc9b3df0dcd0c84dff199d894b495280e7ae9"} Nov 22 03:03:06 crc kubenswrapper[4922]: I1122 03:03:06.986541 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vlxn4" Nov 22 03:03:18 crc kubenswrapper[4922]: I1122 03:03:18.810606 4922 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e24wcx"] Nov 22 03:03:18 crc kubenswrapper[4922]: I1122 03:03:18.812572 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e24wcx" Nov 22 03:03:18 crc kubenswrapper[4922]: I1122 03:03:18.817043 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Nov 22 03:03:18 crc kubenswrapper[4922]: I1122 03:03:18.832485 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e24wcx"] Nov 22 03:03:18 crc kubenswrapper[4922]: I1122 03:03:18.948763 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68xzx\" (UniqueName: \"kubernetes.io/projected/54673aca-5f82-42ac-91d8-036b789061dc-kube-api-access-68xzx\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e24wcx\" (UID: \"54673aca-5f82-42ac-91d8-036b789061dc\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e24wcx" Nov 22 03:03:18 crc kubenswrapper[4922]: I1122 03:03:18.949491 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/54673aca-5f82-42ac-91d8-036b789061dc-util\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e24wcx\" (UID: \"54673aca-5f82-42ac-91d8-036b789061dc\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e24wcx" Nov 22 03:03:18 crc kubenswrapper[4922]: I1122 03:03:18.949546 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/54673aca-5f82-42ac-91d8-036b789061dc-bundle\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e24wcx\" (UID: \"54673aca-5f82-42ac-91d8-036b789061dc\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e24wcx" Nov 22 03:03:19 crc kubenswrapper[4922]: I1122 03:03:19.050733 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68xzx\" (UniqueName: \"kubernetes.io/projected/54673aca-5f82-42ac-91d8-036b789061dc-kube-api-access-68xzx\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e24wcx\" (UID: \"54673aca-5f82-42ac-91d8-036b789061dc\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e24wcx" Nov 22 03:03:19 crc kubenswrapper[4922]: I1122 03:03:19.050820 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/54673aca-5f82-42ac-91d8-036b789061dc-util\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e24wcx\" (UID: \"54673aca-5f82-42ac-91d8-036b789061dc\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e24wcx" Nov 22 03:03:19 crc kubenswrapper[4922]: I1122 03:03:19.050930 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/54673aca-5f82-42ac-91d8-036b789061dc-bundle\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e24wcx\" (UID: \"54673aca-5f82-42ac-91d8-036b789061dc\") " 
pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e24wcx" Nov 22 03:03:19 crc kubenswrapper[4922]: I1122 03:03:19.051742 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/54673aca-5f82-42ac-91d8-036b789061dc-bundle\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e24wcx\" (UID: \"54673aca-5f82-42ac-91d8-036b789061dc\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e24wcx" Nov 22 03:03:19 crc kubenswrapper[4922]: I1122 03:03:19.052243 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/54673aca-5f82-42ac-91d8-036b789061dc-util\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e24wcx\" (UID: \"54673aca-5f82-42ac-91d8-036b789061dc\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e24wcx" Nov 22 03:03:19 crc kubenswrapper[4922]: I1122 03:03:19.086936 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68xzx\" (UniqueName: \"kubernetes.io/projected/54673aca-5f82-42ac-91d8-036b789061dc-kube-api-access-68xzx\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e24wcx\" (UID: \"54673aca-5f82-42ac-91d8-036b789061dc\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e24wcx" Nov 22 03:03:19 crc kubenswrapper[4922]: I1122 03:03:19.141581 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e24wcx" Nov 22 03:03:19 crc kubenswrapper[4922]: I1122 03:03:19.674622 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e24wcx"] Nov 22 03:03:19 crc kubenswrapper[4922]: I1122 03:03:19.916243 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e24wcx" event={"ID":"54673aca-5f82-42ac-91d8-036b789061dc","Type":"ContainerStarted","Data":"bdb22e39ed0ae255a19117a981136de27c56399f7eee1b912b178476f9bbb53e"} Nov 22 03:03:19 crc kubenswrapper[4922]: I1122 03:03:19.916355 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e24wcx" event={"ID":"54673aca-5f82-42ac-91d8-036b789061dc","Type":"ContainerStarted","Data":"505d87296bbee9d4234b9631780b789391d955690664c1cf8c100ef7325b3a40"} Nov 22 03:03:20 crc kubenswrapper[4922]: I1122 03:03:20.929201 4922 generic.go:334] "Generic (PLEG): container finished" podID="54673aca-5f82-42ac-91d8-036b789061dc" containerID="bdb22e39ed0ae255a19117a981136de27c56399f7eee1b912b178476f9bbb53e" exitCode=0 Nov 22 03:03:20 crc kubenswrapper[4922]: I1122 03:03:20.929337 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e24wcx" event={"ID":"54673aca-5f82-42ac-91d8-036b789061dc","Type":"ContainerDied","Data":"bdb22e39ed0ae255a19117a981136de27c56399f7eee1b912b178476f9bbb53e"} Nov 22 03:03:22 crc kubenswrapper[4922]: I1122 03:03:22.952248 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e24wcx" 
event={"ID":"54673aca-5f82-42ac-91d8-036b789061dc","Type":"ContainerStarted","Data":"f76a254b6fc56c82a4e9cff174280f797723f863769e28d293b450d1de37c44a"} Nov 22 03:03:23 crc kubenswrapper[4922]: I1122 03:03:23.967068 4922 generic.go:334] "Generic (PLEG): container finished" podID="54673aca-5f82-42ac-91d8-036b789061dc" containerID="f76a254b6fc56c82a4e9cff174280f797723f863769e28d293b450d1de37c44a" exitCode=0 Nov 22 03:03:23 crc kubenswrapper[4922]: I1122 03:03:23.967349 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e24wcx" event={"ID":"54673aca-5f82-42ac-91d8-036b789061dc","Type":"ContainerDied","Data":"f76a254b6fc56c82a4e9cff174280f797723f863769e28d293b450d1de37c44a"} Nov 22 03:03:24 crc kubenswrapper[4922]: I1122 03:03:24.978116 4922 generic.go:334] "Generic (PLEG): container finished" podID="54673aca-5f82-42ac-91d8-036b789061dc" containerID="6fcae6fade8b1f397cc74039e0e88e922386fd1c6855b5b959d601ca73bee609" exitCode=0 Nov 22 03:03:24 crc kubenswrapper[4922]: I1122 03:03:24.978256 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e24wcx" event={"ID":"54673aca-5f82-42ac-91d8-036b789061dc","Type":"ContainerDied","Data":"6fcae6fade8b1f397cc74039e0e88e922386fd1c6855b5b959d601ca73bee609"} Nov 22 03:03:26 crc kubenswrapper[4922]: I1122 03:03:26.349648 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e24wcx" Nov 22 03:03:26 crc kubenswrapper[4922]: I1122 03:03:26.468071 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68xzx\" (UniqueName: \"kubernetes.io/projected/54673aca-5f82-42ac-91d8-036b789061dc-kube-api-access-68xzx\") pod \"54673aca-5f82-42ac-91d8-036b789061dc\" (UID: \"54673aca-5f82-42ac-91d8-036b789061dc\") " Nov 22 03:03:26 crc kubenswrapper[4922]: I1122 03:03:26.468210 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/54673aca-5f82-42ac-91d8-036b789061dc-bundle\") pod \"54673aca-5f82-42ac-91d8-036b789061dc\" (UID: \"54673aca-5f82-42ac-91d8-036b789061dc\") " Nov 22 03:03:26 crc kubenswrapper[4922]: I1122 03:03:26.468353 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/54673aca-5f82-42ac-91d8-036b789061dc-util\") pod \"54673aca-5f82-42ac-91d8-036b789061dc\" (UID: \"54673aca-5f82-42ac-91d8-036b789061dc\") " Nov 22 03:03:26 crc kubenswrapper[4922]: I1122 03:03:26.470099 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54673aca-5f82-42ac-91d8-036b789061dc-bundle" (OuterVolumeSpecName: "bundle") pod "54673aca-5f82-42ac-91d8-036b789061dc" (UID: "54673aca-5f82-42ac-91d8-036b789061dc"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:03:26 crc kubenswrapper[4922]: I1122 03:03:26.478013 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54673aca-5f82-42ac-91d8-036b789061dc-kube-api-access-68xzx" (OuterVolumeSpecName: "kube-api-access-68xzx") pod "54673aca-5f82-42ac-91d8-036b789061dc" (UID: "54673aca-5f82-42ac-91d8-036b789061dc"). InnerVolumeSpecName "kube-api-access-68xzx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:03:26 crc kubenswrapper[4922]: I1122 03:03:26.490581 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54673aca-5f82-42ac-91d8-036b789061dc-util" (OuterVolumeSpecName: "util") pod "54673aca-5f82-42ac-91d8-036b789061dc" (UID: "54673aca-5f82-42ac-91d8-036b789061dc"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:03:26 crc kubenswrapper[4922]: I1122 03:03:26.569965 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68xzx\" (UniqueName: \"kubernetes.io/projected/54673aca-5f82-42ac-91d8-036b789061dc-kube-api-access-68xzx\") on node \"crc\" DevicePath \"\"" Nov 22 03:03:26 crc kubenswrapper[4922]: I1122 03:03:26.570033 4922 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/54673aca-5f82-42ac-91d8-036b789061dc-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:03:26 crc kubenswrapper[4922]: I1122 03:03:26.570052 4922 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/54673aca-5f82-42ac-91d8-036b789061dc-util\") on node \"crc\" DevicePath \"\"" Nov 22 03:03:27 crc kubenswrapper[4922]: I1122 03:03:27.034683 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e24wcx" event={"ID":"54673aca-5f82-42ac-91d8-036b789061dc","Type":"ContainerDied","Data":"505d87296bbee9d4234b9631780b789391d955690664c1cf8c100ef7325b3a40"} Nov 22 03:03:27 crc kubenswrapper[4922]: I1122 03:03:27.034759 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="505d87296bbee9d4234b9631780b789391d955690664c1cf8c100ef7325b3a40" Nov 22 03:03:27 crc kubenswrapper[4922]: I1122 03:03:27.034793 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e24wcx" Nov 22 03:03:30 crc kubenswrapper[4922]: I1122 03:03:30.410522 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-557fdffb88-xksq7"] Nov 22 03:03:30 crc kubenswrapper[4922]: E1122 03:03:30.411299 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54673aca-5f82-42ac-91d8-036b789061dc" containerName="util" Nov 22 03:03:30 crc kubenswrapper[4922]: I1122 03:03:30.411323 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="54673aca-5f82-42ac-91d8-036b789061dc" containerName="util" Nov 22 03:03:30 crc kubenswrapper[4922]: E1122 03:03:30.411347 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54673aca-5f82-42ac-91d8-036b789061dc" containerName="extract" Nov 22 03:03:30 crc kubenswrapper[4922]: I1122 03:03:30.411361 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="54673aca-5f82-42ac-91d8-036b789061dc" containerName="extract" Nov 22 03:03:30 crc kubenswrapper[4922]: E1122 03:03:30.411388 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54673aca-5f82-42ac-91d8-036b789061dc" containerName="pull" Nov 22 03:03:30 crc kubenswrapper[4922]: I1122 03:03:30.411403 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="54673aca-5f82-42ac-91d8-036b789061dc" containerName="pull" Nov 22 03:03:30 crc kubenswrapper[4922]: I1122 03:03:30.411581 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="54673aca-5f82-42ac-91d8-036b789061dc" containerName="extract" Nov 22 03:03:30 crc kubenswrapper[4922]: I1122 03:03:30.412254 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-557fdffb88-xksq7" Nov 22 03:03:30 crc kubenswrapper[4922]: I1122 03:03:30.414683 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Nov 22 03:03:30 crc kubenswrapper[4922]: I1122 03:03:30.414922 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-v57lf" Nov 22 03:03:30 crc kubenswrapper[4922]: I1122 03:03:30.415741 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Nov 22 03:03:30 crc kubenswrapper[4922]: I1122 03:03:30.421257 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-557fdffb88-xksq7"] Nov 22 03:03:30 crc kubenswrapper[4922]: I1122 03:03:30.529818 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9z7k7\" (UniqueName: \"kubernetes.io/projected/21bed758-674a-4d6f-9909-62147fd6d1b9-kube-api-access-9z7k7\") pod \"nmstate-operator-557fdffb88-xksq7\" (UID: \"21bed758-674a-4d6f-9909-62147fd6d1b9\") " pod="openshift-nmstate/nmstate-operator-557fdffb88-xksq7" Nov 22 03:03:30 crc kubenswrapper[4922]: I1122 03:03:30.631293 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9z7k7\" (UniqueName: \"kubernetes.io/projected/21bed758-674a-4d6f-9909-62147fd6d1b9-kube-api-access-9z7k7\") pod \"nmstate-operator-557fdffb88-xksq7\" (UID: \"21bed758-674a-4d6f-9909-62147fd6d1b9\") " pod="openshift-nmstate/nmstate-operator-557fdffb88-xksq7" Nov 22 03:03:30 crc kubenswrapper[4922]: I1122 03:03:30.653627 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9z7k7\" 
(UniqueName: \"kubernetes.io/projected/21bed758-674a-4d6f-9909-62147fd6d1b9-kube-api-access-9z7k7\") pod \"nmstate-operator-557fdffb88-xksq7\" (UID: \"21bed758-674a-4d6f-9909-62147fd6d1b9\") " pod="openshift-nmstate/nmstate-operator-557fdffb88-xksq7" Nov 22 03:03:30 crc kubenswrapper[4922]: I1122 03:03:30.734279 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-557fdffb88-xksq7" Nov 22 03:03:30 crc kubenswrapper[4922]: I1122 03:03:30.974468 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-557fdffb88-xksq7"] Nov 22 03:03:31 crc kubenswrapper[4922]: I1122 03:03:31.063574 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-557fdffb88-xksq7" event={"ID":"21bed758-674a-4d6f-9909-62147fd6d1b9","Type":"ContainerStarted","Data":"59b7b39281a5c9be74d0beda0d2992686e5a18fdb2ef208d837c0029b9199c33"} Nov 22 03:03:34 crc kubenswrapper[4922]: I1122 03:03:34.083518 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-557fdffb88-xksq7" event={"ID":"21bed758-674a-4d6f-9909-62147fd6d1b9","Type":"ContainerStarted","Data":"0696133f4d3dfa5e53a939dcff3424b6a5495845abcbd52c5f937a83ecf92e96"} Nov 22 03:03:34 crc kubenswrapper[4922]: I1122 03:03:34.104791 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-557fdffb88-xksq7" podStartSLOduration=1.899479778 podStartE2EDuration="4.104729174s" podCreationTimestamp="2025-11-22 03:03:30 +0000 UTC" firstStartedPulling="2025-11-22 03:03:30.983919093 +0000 UTC m=+647.022440985" lastFinishedPulling="2025-11-22 03:03:33.189168489 +0000 UTC m=+649.227690381" observedRunningTime="2025-11-22 03:03:34.104199112 +0000 UTC m=+650.142721044" watchObservedRunningTime="2025-11-22 03:03:34.104729174 +0000 UTC m=+650.143251096" Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.141770 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-5dcf9c57c5-rb9pl"] Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.143633 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-rb9pl" Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.145963 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-mxzvr" Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.155345 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6b89b748d8-9d8bx"] Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.156752 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-9d8bx" Nov 22 03:03:39 crc kubenswrapper[4922]: W1122 03:03:39.158401 4922 reflector.go:561] object-"openshift-nmstate"/"openshift-nmstate-webhook": failed to list *v1.Secret: secrets "openshift-nmstate-webhook" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-nmstate": no relationship found between node 'crc' and this object Nov 22 03:03:39 crc kubenswrapper[4922]: E1122 03:03:39.158530 4922 reflector.go:158] "Unhandled Error" err="object-\"openshift-nmstate\"/\"openshift-nmstate-webhook\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openshift-nmstate-webhook\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-nmstate\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.163475 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-5dcf9c57c5-rb9pl"] Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.186941 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6b89b748d8-9d8bx"] Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.203368 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-f2mdj"] Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.204454 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-f2mdj" Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.261679 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q8g2\" (UniqueName: \"kubernetes.io/projected/6d67e2c3-dc43-4809-bde2-1252e775b32d-kube-api-access-2q8g2\") pod \"nmstate-metrics-5dcf9c57c5-rb9pl\" (UID: \"6d67e2c3-dc43-4809-bde2-1252e775b32d\") " pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-rb9pl" Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.261743 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlfvs\" (UniqueName: \"kubernetes.io/projected/9cdd2d56-0dc2-4e74-81ed-d22f94a88db9-kube-api-access-dlfvs\") pod \"nmstate-webhook-6b89b748d8-9d8bx\" (UID: \"9cdd2d56-0dc2-4e74-81ed-d22f94a88db9\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-9d8bx" Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.261781 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/9cdd2d56-0dc2-4e74-81ed-d22f94a88db9-tls-key-pair\") pod \"nmstate-webhook-6b89b748d8-9d8bx\" (UID: \"9cdd2d56-0dc2-4e74-81ed-d22f94a88db9\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-9d8bx" Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.322436 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5874bd7bc5-bb9qb"] Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.323210 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-bb9qb" Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.325109 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.328524 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-6dhk7" Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.333989 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.340211 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5874bd7bc5-bb9qb"] Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.363387 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/9cdd2d56-0dc2-4e74-81ed-d22f94a88db9-tls-key-pair\") pod \"nmstate-webhook-6b89b748d8-9d8bx\" (UID: \"9cdd2d56-0dc2-4e74-81ed-d22f94a88db9\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-9d8bx" Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.363476 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wwff\" (UniqueName: \"kubernetes.io/projected/352d437e-9254-4db9-a771-fae8060c3c84-kube-api-access-8wwff\") pod \"nmstate-handler-f2mdj\" (UID: \"352d437e-9254-4db9-a771-fae8060c3c84\") " pod="openshift-nmstate/nmstate-handler-f2mdj" Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.363563 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/352d437e-9254-4db9-a771-fae8060c3c84-nmstate-lock\") pod \"nmstate-handler-f2mdj\" (UID: \"352d437e-9254-4db9-a771-fae8060c3c84\") " pod="openshift-nmstate/nmstate-handler-f2mdj" Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.363596 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/352d437e-9254-4db9-a771-fae8060c3c84-ovs-socket\") pod \"nmstate-handler-f2mdj\" (UID: \"352d437e-9254-4db9-a771-fae8060c3c84\") " pod="openshift-nmstate/nmstate-handler-f2mdj" Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.363694 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/352d437e-9254-4db9-a771-fae8060c3c84-dbus-socket\") pod \"nmstate-handler-f2mdj\" (UID: \"352d437e-9254-4db9-a771-fae8060c3c84\") " pod="openshift-nmstate/nmstate-handler-f2mdj" Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.363749 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2q8g2\" (UniqueName: \"kubernetes.io/projected/6d67e2c3-dc43-4809-bde2-1252e775b32d-kube-api-access-2q8g2\") pod \"nmstate-metrics-5dcf9c57c5-rb9pl\" (UID: \"6d67e2c3-dc43-4809-bde2-1252e775b32d\") " pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-rb9pl" Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.363787 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlfvs\" (UniqueName: \"kubernetes.io/projected/9cdd2d56-0dc2-4e74-81ed-d22f94a88db9-kube-api-access-dlfvs\") pod \"nmstate-webhook-6b89b748d8-9d8bx\" (UID: 
\"9cdd2d56-0dc2-4e74-81ed-d22f94a88db9\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-9d8bx" Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.388254 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlfvs\" (UniqueName: \"kubernetes.io/projected/9cdd2d56-0dc2-4e74-81ed-d22f94a88db9-kube-api-access-dlfvs\") pod \"nmstate-webhook-6b89b748d8-9d8bx\" (UID: \"9cdd2d56-0dc2-4e74-81ed-d22f94a88db9\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-9d8bx" Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.388309 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q8g2\" (UniqueName: \"kubernetes.io/projected/6d67e2c3-dc43-4809-bde2-1252e775b32d-kube-api-access-2q8g2\") pod \"nmstate-metrics-5dcf9c57c5-rb9pl\" (UID: \"6d67e2c3-dc43-4809-bde2-1252e775b32d\") " pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-rb9pl" Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.464801 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl28z\" (UniqueName: \"kubernetes.io/projected/24a6482e-6e30-42ca-9c56-ca6bb2772d41-kube-api-access-hl28z\") pod \"nmstate-console-plugin-5874bd7bc5-bb9qb\" (UID: \"24a6482e-6e30-42ca-9c56-ca6bb2772d41\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-bb9qb" Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.464997 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/352d437e-9254-4db9-a771-fae8060c3c84-dbus-socket\") pod \"nmstate-handler-f2mdj\" (UID: \"352d437e-9254-4db9-a771-fae8060c3c84\") " pod="openshift-nmstate/nmstate-handler-f2mdj" Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.465080 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wwff\" (UniqueName: \"kubernetes.io/projected/352d437e-9254-4db9-a771-fae8060c3c84-kube-api-access-8wwff\") pod \"nmstate-handler-f2mdj\" (UID: \"352d437e-9254-4db9-a771-fae8060c3c84\") " pod="openshift-nmstate/nmstate-handler-f2mdj" Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.465147 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/24a6482e-6e30-42ca-9c56-ca6bb2772d41-plugin-serving-cert\") pod \"nmstate-console-plugin-5874bd7bc5-bb9qb\" (UID: \"24a6482e-6e30-42ca-9c56-ca6bb2772d41\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-bb9qb" Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.465184 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/352d437e-9254-4db9-a771-fae8060c3c84-nmstate-lock\") pod \"nmstate-handler-f2mdj\" (UID: \"352d437e-9254-4db9-a771-fae8060c3c84\") " pod="openshift-nmstate/nmstate-handler-f2mdj" Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.465221 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/352d437e-9254-4db9-a771-fae8060c3c84-ovs-socket\") pod \"nmstate-handler-f2mdj\" (UID: \"352d437e-9254-4db9-a771-fae8060c3c84\") " pod="openshift-nmstate/nmstate-handler-f2mdj" Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.465260 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/24a6482e-6e30-42ca-9c56-ca6bb2772d41-nginx-conf\") pod \"nmstate-console-plugin-5874bd7bc5-bb9qb\" (UID: \"24a6482e-6e30-42ca-9c56-ca6bb2772d41\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-bb9qb" Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.465394 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/352d437e-9254-4db9-a771-fae8060c3c84-nmstate-lock\") pod \"nmstate-handler-f2mdj\" (UID: \"352d437e-9254-4db9-a771-fae8060c3c84\") " pod="openshift-nmstate/nmstate-handler-f2mdj" Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.465451 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/352d437e-9254-4db9-a771-fae8060c3c84-ovs-socket\") pod \"nmstate-handler-f2mdj\" (UID: \"352d437e-9254-4db9-a771-fae8060c3c84\") " pod="openshift-nmstate/nmstate-handler-f2mdj" Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.465564 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/352d437e-9254-4db9-a771-fae8060c3c84-dbus-socket\") pod \"nmstate-handler-f2mdj\" (UID: \"352d437e-9254-4db9-a771-fae8060c3c84\") " pod="openshift-nmstate/nmstate-handler-f2mdj" Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.488333 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wwff\" (UniqueName: \"kubernetes.io/projected/352d437e-9254-4db9-a771-fae8060c3c84-kube-api-access-8wwff\") pod \"nmstate-handler-f2mdj\" (UID: \"352d437e-9254-4db9-a771-fae8060c3c84\") " pod="openshift-nmstate/nmstate-handler-f2mdj" Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.496873 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-rb9pl" Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.522962 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-644c4585f7-gkjm5"] Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.528823 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-644c4585f7-gkjm5" Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.534746 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-f2mdj" Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.567466 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/24a6482e-6e30-42ca-9c56-ca6bb2772d41-nginx-conf\") pod \"nmstate-console-plugin-5874bd7bc5-bb9qb\" (UID: \"24a6482e-6e30-42ca-9c56-ca6bb2772d41\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-bb9qb" Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.567543 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hl28z\" (UniqueName: \"kubernetes.io/projected/24a6482e-6e30-42ca-9c56-ca6bb2772d41-kube-api-access-hl28z\") pod \"nmstate-console-plugin-5874bd7bc5-bb9qb\" (UID: \"24a6482e-6e30-42ca-9c56-ca6bb2772d41\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-bb9qb" Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.567647 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/24a6482e-6e30-42ca-9c56-ca6bb2772d41-plugin-serving-cert\") pod \"nmstate-console-plugin-5874bd7bc5-bb9qb\" (UID: \"24a6482e-6e30-42ca-9c56-ca6bb2772d41\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-bb9qb" Nov 22 03:03:39 crc kubenswrapper[4922]: E1122 03:03:39.567891 4922 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Nov 22 03:03:39 crc kubenswrapper[4922]: E1122 03:03:39.567976 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/24a6482e-6e30-42ca-9c56-ca6bb2772d41-plugin-serving-cert podName:24a6482e-6e30-42ca-9c56-ca6bb2772d41 nodeName:}" failed. No retries permitted until 2025-11-22 03:03:40.067950905 +0000 UTC m=+656.106472797 (durationBeforeRetry 500ms). 
Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.570161 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/24a6482e-6e30-42ca-9c56-ca6bb2772d41-nginx-conf\") pod \"nmstate-console-plugin-5874bd7bc5-bb9qb\" (UID: \"24a6482e-6e30-42ca-9c56-ca6bb2772d41\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-bb9qb"
Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.594297 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-644c4585f7-gkjm5"]
Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.599651 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl28z\" (UniqueName: \"kubernetes.io/projected/24a6482e-6e30-42ca-9c56-ca6bb2772d41-kube-api-access-hl28z\") pod \"nmstate-console-plugin-5874bd7bc5-bb9qb\" (UID: \"24a6482e-6e30-42ca-9c56-ca6bb2772d41\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-bb9qb"
Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.668593 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4nxw\" (UniqueName: \"kubernetes.io/projected/7b193d63-fb9f-456c-a0b7-8b2e404040a2-kube-api-access-b4nxw\") pod \"console-644c4585f7-gkjm5\" (UID: \"7b193d63-fb9f-456c-a0b7-8b2e404040a2\") " pod="openshift-console/console-644c4585f7-gkjm5"
Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.668647 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7b193d63-fb9f-456c-a0b7-8b2e404040a2-oauth-serving-cert\") pod \"console-644c4585f7-gkjm5\" (UID: \"7b193d63-fb9f-456c-a0b7-8b2e404040a2\") " pod="openshift-console/console-644c4585f7-gkjm5"
Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.668836 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7b193d63-fb9f-456c-a0b7-8b2e404040a2-console-oauth-config\") pod \"console-644c4585f7-gkjm5\" (UID: \"7b193d63-fb9f-456c-a0b7-8b2e404040a2\") " pod="openshift-console/console-644c4585f7-gkjm5"
Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.668951 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b193d63-fb9f-456c-a0b7-8b2e404040a2-trusted-ca-bundle\") pod \"console-644c4585f7-gkjm5\" (UID: \"7b193d63-fb9f-456c-a0b7-8b2e404040a2\") " pod="openshift-console/console-644c4585f7-gkjm5"
Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.669010 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7b193d63-fb9f-456c-a0b7-8b2e404040a2-console-serving-cert\") pod \"console-644c4585f7-gkjm5\" (UID: \"7b193d63-fb9f-456c-a0b7-8b2e404040a2\") " pod="openshift-console/console-644c4585f7-gkjm5"
Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.669139 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\"
(UniqueName: \"kubernetes.io/configmap/7b193d63-fb9f-456c-a0b7-8b2e404040a2-service-ca\") pod \"console-644c4585f7-gkjm5\" (UID: \"7b193d63-fb9f-456c-a0b7-8b2e404040a2\") " pod="openshift-console/console-644c4585f7-gkjm5" Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.669283 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7b193d63-fb9f-456c-a0b7-8b2e404040a2-console-config\") pod \"console-644c4585f7-gkjm5\" (UID: \"7b193d63-fb9f-456c-a0b7-8b2e404040a2\") " pod="openshift-console/console-644c4585f7-gkjm5" Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.743403 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-5dcf9c57c5-rb9pl"] Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.771437 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4nxw\" (UniqueName: \"kubernetes.io/projected/7b193d63-fb9f-456c-a0b7-8b2e404040a2-kube-api-access-b4nxw\") pod \"console-644c4585f7-gkjm5\" (UID: \"7b193d63-fb9f-456c-a0b7-8b2e404040a2\") " pod="openshift-console/console-644c4585f7-gkjm5" Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.771496 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7b193d63-fb9f-456c-a0b7-8b2e404040a2-oauth-serving-cert\") pod \"console-644c4585f7-gkjm5\" (UID: \"7b193d63-fb9f-456c-a0b7-8b2e404040a2\") " pod="openshift-console/console-644c4585f7-gkjm5" Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.771539 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7b193d63-fb9f-456c-a0b7-8b2e404040a2-console-oauth-config\") pod \"console-644c4585f7-gkjm5\" (UID: \"7b193d63-fb9f-456c-a0b7-8b2e404040a2\") " pod="openshift-console/console-644c4585f7-gkjm5" Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.771788 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b193d63-fb9f-456c-a0b7-8b2e404040a2-trusted-ca-bundle\") pod \"console-644c4585f7-gkjm5\" (UID: \"7b193d63-fb9f-456c-a0b7-8b2e404040a2\") " pod="openshift-console/console-644c4585f7-gkjm5" Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.771812 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7b193d63-fb9f-456c-a0b7-8b2e404040a2-console-serving-cert\") pod \"console-644c4585f7-gkjm5\" (UID: \"7b193d63-fb9f-456c-a0b7-8b2e404040a2\") " pod="openshift-console/console-644c4585f7-gkjm5" Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.771874 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7b193d63-fb9f-456c-a0b7-8b2e404040a2-service-ca\") pod \"console-644c4585f7-gkjm5\" (UID: \"7b193d63-fb9f-456c-a0b7-8b2e404040a2\") " pod="openshift-console/console-644c4585f7-gkjm5" Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.771911 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7b193d63-fb9f-456c-a0b7-8b2e404040a2-console-config\") pod \"console-644c4585f7-gkjm5\" (UID: \"7b193d63-fb9f-456c-a0b7-8b2e404040a2\") " 
pod="openshift-console/console-644c4585f7-gkjm5" Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.773189 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7b193d63-fb9f-456c-a0b7-8b2e404040a2-console-config\") pod \"console-644c4585f7-gkjm5\" (UID: \"7b193d63-fb9f-456c-a0b7-8b2e404040a2\") " pod="openshift-console/console-644c4585f7-gkjm5" Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.773340 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b193d63-fb9f-456c-a0b7-8b2e404040a2-trusted-ca-bundle\") pod \"console-644c4585f7-gkjm5\" (UID: \"7b193d63-fb9f-456c-a0b7-8b2e404040a2\") " pod="openshift-console/console-644c4585f7-gkjm5" Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.774009 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7b193d63-fb9f-456c-a0b7-8b2e404040a2-service-ca\") pod \"console-644c4585f7-gkjm5\" (UID: \"7b193d63-fb9f-456c-a0b7-8b2e404040a2\") " pod="openshift-console/console-644c4585f7-gkjm5" Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.774315 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7b193d63-fb9f-456c-a0b7-8b2e404040a2-oauth-serving-cert\") pod \"console-644c4585f7-gkjm5\" (UID: \"7b193d63-fb9f-456c-a0b7-8b2e404040a2\") " pod="openshift-console/console-644c4585f7-gkjm5" Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.776809 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7b193d63-fb9f-456c-a0b7-8b2e404040a2-console-serving-cert\") pod \"console-644c4585f7-gkjm5\" (UID: \"7b193d63-fb9f-456c-a0b7-8b2e404040a2\") " pod="openshift-console/console-644c4585f7-gkjm5" Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.777264 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7b193d63-fb9f-456c-a0b7-8b2e404040a2-console-oauth-config\") pod \"console-644c4585f7-gkjm5\" (UID: \"7b193d63-fb9f-456c-a0b7-8b2e404040a2\") " pod="openshift-console/console-644c4585f7-gkjm5" Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.792990 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4nxw\" (UniqueName: \"kubernetes.io/projected/7b193d63-fb9f-456c-a0b7-8b2e404040a2-kube-api-access-b4nxw\") pod \"console-644c4585f7-gkjm5\" (UID: \"7b193d63-fb9f-456c-a0b7-8b2e404040a2\") " pod="openshift-console/console-644c4585f7-gkjm5" Nov 22 03:03:39 crc kubenswrapper[4922]: I1122 03:03:39.899587 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-644c4585f7-gkjm5" Nov 22 03:03:40 crc kubenswrapper[4922]: I1122 03:03:40.076175 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/24a6482e-6e30-42ca-9c56-ca6bb2772d41-plugin-serving-cert\") pod \"nmstate-console-plugin-5874bd7bc5-bb9qb\" (UID: \"24a6482e-6e30-42ca-9c56-ca6bb2772d41\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-bb9qb" Nov 22 03:03:40 crc kubenswrapper[4922]: I1122 03:03:40.084027 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/24a6482e-6e30-42ca-9c56-ca6bb2772d41-plugin-serving-cert\") pod \"nmstate-console-plugin-5874bd7bc5-bb9qb\" (UID: \"24a6482e-6e30-42ca-9c56-ca6bb2772d41\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-bb9qb" Nov 22 03:03:40 crc kubenswrapper[4922]: I1122 03:03:40.132792 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-f2mdj" event={"ID":"352d437e-9254-4db9-a771-fae8060c3c84","Type":"ContainerStarted","Data":"7ef9b2e69d4d30046bb28c6cd0937ec47e243e3eef6146390cddc8d9890cb062"} Nov 22 03:03:40 crc kubenswrapper[4922]: I1122 03:03:40.135798 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-rb9pl" event={"ID":"6d67e2c3-dc43-4809-bde2-1252e775b32d","Type":"ContainerStarted","Data":"174528b98ed985efd3a9b235986aefeb5eaf47d746a4468254ef8c60b174b23b"} Nov 22 03:03:40 crc kubenswrapper[4922]: I1122 03:03:40.143148 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Nov 22 03:03:40 crc kubenswrapper[4922]: I1122 03:03:40.149733 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/9cdd2d56-0dc2-4e74-81ed-d22f94a88db9-tls-key-pair\") pod \"nmstate-webhook-6b89b748d8-9d8bx\" (UID: \"9cdd2d56-0dc2-4e74-81ed-d22f94a88db9\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-9d8bx" Nov 22 03:03:40 crc kubenswrapper[4922]: I1122 03:03:40.178866 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-644c4585f7-gkjm5"] Nov 22 03:03:40 crc kubenswrapper[4922]: W1122 03:03:40.186469 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b193d63_fb9f_456c_a0b7_8b2e404040a2.slice/crio-2fcaf85be827271f48c881c239f633b2bd01aed3d7f44afb8417cd0915e9605a WatchSource:0}: Error finding container 2fcaf85be827271f48c881c239f633b2bd01aed3d7f44afb8417cd0915e9605a: Status 404 returned error can't find the container with id 2fcaf85be827271f48c881c239f633b2bd01aed3d7f44afb8417cd0915e9605a Nov 22 03:03:40 crc kubenswrapper[4922]: I1122 03:03:40.241404 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-bb9qb" Nov 22 03:03:40 crc kubenswrapper[4922]: I1122 03:03:40.404094 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-9d8bx" Nov 22 03:03:40 crc kubenswrapper[4922]: I1122 03:03:40.512867 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5874bd7bc5-bb9qb"] Nov 22 03:03:40 crc kubenswrapper[4922]: W1122 03:03:40.529056 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24a6482e_6e30_42ca_9c56_ca6bb2772d41.slice/crio-cb7804b7ae859174a6a978a05a09d7d463c3442ff26c9f4f825b9caa1831c702 WatchSource:0}: Error finding container cb7804b7ae859174a6a978a05a09d7d463c3442ff26c9f4f825b9caa1831c702: Status 404 returned error can't find the container with id cb7804b7ae859174a6a978a05a09d7d463c3442ff26c9f4f825b9caa1831c702 Nov 22 03:03:40 crc kubenswrapper[4922]: I1122 03:03:40.664295 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6b89b748d8-9d8bx"] Nov 22 03:03:40 crc kubenswrapper[4922]: W1122 03:03:40.669817 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cdd2d56_0dc2_4e74_81ed_d22f94a88db9.slice/crio-ac57f891c358dbf2f2b862389c74e8bfe56ad3631020bb78061824ed3b8abd63 WatchSource:0}: Error finding container ac57f891c358dbf2f2b862389c74e8bfe56ad3631020bb78061824ed3b8abd63: Status 404 returned error can't find the container with id ac57f891c358dbf2f2b862389c74e8bfe56ad3631020bb78061824ed3b8abd63 Nov 22 03:03:41 crc kubenswrapper[4922]: I1122 03:03:41.142685 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-9d8bx" event={"ID":"9cdd2d56-0dc2-4e74-81ed-d22f94a88db9","Type":"ContainerStarted","Data":"ac57f891c358dbf2f2b862389c74e8bfe56ad3631020bb78061824ed3b8abd63"} Nov 22 03:03:41 crc kubenswrapper[4922]: I1122 03:03:41.145136 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-bb9qb" event={"ID":"24a6482e-6e30-42ca-9c56-ca6bb2772d41","Type":"ContainerStarted","Data":"cb7804b7ae859174a6a978a05a09d7d463c3442ff26c9f4f825b9caa1831c702"} Nov 22 03:03:41 crc kubenswrapper[4922]: I1122 03:03:41.148189 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-644c4585f7-gkjm5" event={"ID":"7b193d63-fb9f-456c-a0b7-8b2e404040a2","Type":"ContainerStarted","Data":"5fdee3726a38dba00bb166bb7dcf8e955a76d4418698dc203f32a62580c4519c"} Nov 22 03:03:41 crc kubenswrapper[4922]: I1122 03:03:41.148220 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-644c4585f7-gkjm5" event={"ID":"7b193d63-fb9f-456c-a0b7-8b2e404040a2","Type":"ContainerStarted","Data":"2fcaf85be827271f48c881c239f633b2bd01aed3d7f44afb8417cd0915e9605a"} Nov 22 03:03:41 crc kubenswrapper[4922]: I1122 03:03:41.169420 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-644c4585f7-gkjm5" podStartSLOduration=2.169403548 podStartE2EDuration="2.169403548s" podCreationTimestamp="2025-11-22 03:03:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:03:41.16742571 +0000 UTC m=+657.205947622" watchObservedRunningTime="2025-11-22 03:03:41.169403548 +0000 UTC m=+657.207925440" Nov 22 03:03:43 crc kubenswrapper[4922]: I1122 03:03:43.167099 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-webhook-6b89b748d8-9d8bx" event={"ID":"9cdd2d56-0dc2-4e74-81ed-d22f94a88db9","Type":"ContainerStarted","Data":"fcc2172997de813be688184efa68f1a94b07415bdd3321af9dc4eb0fde648a1e"} Nov 22 03:03:43 crc kubenswrapper[4922]: I1122 03:03:43.167906 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-9d8bx" Nov 22 03:03:43 crc kubenswrapper[4922]: I1122 03:03:43.170158 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-rb9pl" event={"ID":"6d67e2c3-dc43-4809-bde2-1252e775b32d","Type":"ContainerStarted","Data":"0cca5e8537833c62ace4beaa8e91e05982b61d4529f07885860e79872ff67e25"} Nov 22 03:03:43 crc kubenswrapper[4922]: I1122 03:03:43.175990 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-f2mdj" event={"ID":"352d437e-9254-4db9-a771-fae8060c3c84","Type":"ContainerStarted","Data":"86bb96afc82c252b247f0e08f3203e6adbd19036f43d81767a6e1bb5442fba44"} Nov 22 03:03:43 crc kubenswrapper[4922]: I1122 03:03:43.176171 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-f2mdj" Nov 22 03:03:43 crc kubenswrapper[4922]: I1122 03:03:43.185601 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-9d8bx" podStartSLOduration=2.761076691 podStartE2EDuration="4.185574328s" podCreationTimestamp="2025-11-22 03:03:39 +0000 UTC" firstStartedPulling="2025-11-22 03:03:40.672435742 +0000 UTC m=+656.710957624" lastFinishedPulling="2025-11-22 03:03:42.096933369 +0000 UTC m=+658.135455261" observedRunningTime="2025-11-22 03:03:43.184236776 +0000 UTC m=+659.222758668" watchObservedRunningTime="2025-11-22 03:03:43.185574328 +0000 UTC m=+659.224096220" Nov 22 03:03:43 crc kubenswrapper[4922]: I1122 03:03:43.206443 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-f2mdj" podStartSLOduration=1.7152258759999999 podStartE2EDuration="4.206421297s" podCreationTimestamp="2025-11-22 03:03:39 +0000 UTC" firstStartedPulling="2025-11-22 03:03:39.600609686 +0000 UTC m=+655.639131578" lastFinishedPulling="2025-11-22 03:03:42.091805097 +0000 UTC m=+658.130326999" observedRunningTime="2025-11-22 03:03:43.204835969 +0000 UTC m=+659.243357881" watchObservedRunningTime="2025-11-22 03:03:43.206421297 +0000 UTC m=+659.244943199" Nov 22 03:03:44 crc kubenswrapper[4922]: I1122 03:03:44.186230 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-bb9qb" event={"ID":"24a6482e-6e30-42ca-9c56-ca6bb2772d41","Type":"ContainerStarted","Data":"04d96094efb25858fe087a616fe63e1820c784407319b268e4e26c4d55a3d635"} Nov 22 03:03:44 crc kubenswrapper[4922]: I1122 03:03:44.212658 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-bb9qb" podStartSLOduration=2.663439373 podStartE2EDuration="5.212639262s" podCreationTimestamp="2025-11-22 03:03:39 +0000 UTC" firstStartedPulling="2025-11-22 03:03:40.53572958 +0000 UTC m=+656.574251482" lastFinishedPulling="2025-11-22 03:03:43.084929459 +0000 UTC m=+659.123451371" observedRunningTime="2025-11-22 03:03:44.210139103 +0000 UTC m=+660.248661075" watchObservedRunningTime="2025-11-22 03:03:44.212639262 +0000 UTC m=+660.251161154" Nov 22 03:03:45 crc kubenswrapper[4922]: I1122 03:03:45.198779 4922 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-rb9pl" event={"ID":"6d67e2c3-dc43-4809-bde2-1252e775b32d","Type":"ContainerStarted","Data":"c22706f8a94f50288f7c44449536eda01d6dc99448b62da9568b9de1084d9993"}
Nov 22 03:03:45 crc kubenswrapper[4922]: I1122 03:03:45.230240 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-rb9pl" podStartSLOduration=1.332907345 podStartE2EDuration="6.230207739s" podCreationTimestamp="2025-11-22 03:03:39 +0000 UTC" firstStartedPulling="2025-11-22 03:03:39.754194163 +0000 UTC m=+655.792716065" lastFinishedPulling="2025-11-22 03:03:44.651494557 +0000 UTC m=+660.690016459" observedRunningTime="2025-11-22 03:03:45.228438957 +0000 UTC m=+661.266960849" watchObservedRunningTime="2025-11-22 03:03:45.230207739 +0000 UTC m=+661.268729671"
Nov 22 03:03:49 crc kubenswrapper[4922]: I1122 03:03:49.572596 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-f2mdj"
Nov 22 03:03:49 crc kubenswrapper[4922]: I1122 03:03:49.900537 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-644c4585f7-gkjm5"
Nov 22 03:03:49 crc kubenswrapper[4922]: I1122 03:03:49.900641 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-644c4585f7-gkjm5"
Nov 22 03:03:49 crc kubenswrapper[4922]: I1122 03:03:49.907915 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-644c4585f7-gkjm5"
Nov 22 03:03:50 crc kubenswrapper[4922]: I1122 03:03:50.243104 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-644c4585f7-gkjm5"
Nov 22 03:03:50 crc kubenswrapper[4922]: I1122 03:03:50.322025 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-dj4jp"]
Nov 22 03:04:00 crc kubenswrapper[4922]: I1122 03:04:00.416253 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-9d8bx"
Nov 22 03:04:11 crc kubenswrapper[4922]: I1122 03:04:11.110096 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 03:04:11 crc kubenswrapper[4922]: I1122 03:04:11.110921 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 03:04:15 crc kubenswrapper[4922]: I1122 03:04:15.395675 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-dj4jp" podUID="136fdcc5-9b23-442a-85e0-96129d4aed8a" containerName="console" containerID="cri-o://42a8e111b5238b1a069678986c8df0f21c1376b9fbebf98ac927c5df71f28cb7" gracePeriod=15
Nov 22 03:04:15 crc kubenswrapper[4922]: I1122 03:04:15.825129 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-dj4jp_136fdcc5-9b23-442a-85e0-96129d4aed8a/console/0.log"
Nov 22 03:04:15 crc kubenswrapper[4922]: I1122 03:04:15.825234 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-dj4jp"
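The "Observed pod startup duration" entries above report two numbers: podStartE2EDuration is observedRunningTime minus podCreationTimestamp, while podStartSLOduration additionally subtracts the image-pull window (lastFinishedPulling minus firstStartedPulling). For nmstate-webhook-6b89b748d8-9d8bx earlier in the log this checks out: 4.185574328s minus the ~1.4245s pull window leaves ~2.7611s, matching podStartSLOduration=2.761076691 up to rounding. A minimal Go sketch (illustrative only, not kubelet code; the regexes and time layout are assumptions about this exact line format) that re-derives the relationship from one such line:

```go
// sloparse.go — illustrative sketch, not kubelet source: parse one
// "Observed pod startup duration" line and check that
// podStartSLOduration ≈ podStartE2EDuration − (lastFinishedPulling − firstStartedPulling).
package main

import (
	"fmt"
	"regexp"
	"time"
)

// A trimmed copy of the nmstate-webhook line from the log above.
const line = `podStartSLOduration=2.761076691 podStartE2EDuration="4.185574328s" firstStartedPulling="2025-11-22 03:03:40.672435742 +0000 UTC m=+656.710957624" lastFinishedPulling="2025-11-22 03:03:42.096933369 +0000 UTC m=+658.135455261"`

func ts(key string) time.Time {
	// Capture the wall-clock part, stopping before the monotonic " m=+..." suffix.
	m := regexp.MustCompile(key + `="([^"]+) m=`).FindStringSubmatch(line)
	t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", m[1])
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	e2e, _ := time.ParseDuration(regexp.MustCompile(`podStartE2EDuration="([^"]+)"`).FindStringSubmatch(line)[1])
	pull := ts("lastFinishedPulling").Sub(ts("firstStartedPulling"))
	fmt.Printf("e2e=%v pull=%v slo(e2e-pull)=%v\n", e2e, pull, e2e-pull)
	// Prints slo(e2e-pull)=2.761076701s, agreeing with the logged
	// podStartSLOduration=2.761076691 up to formatting rounding.
}
```

The last digits differ because kubelet computes the two figures in separate floating-point steps; the identity is approximate by construction.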
Nov 22 03:04:15 crc kubenswrapper[4922]: I1122 03:04:15.989542 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/136fdcc5-9b23-442a-85e0-96129d4aed8a-oauth-serving-cert\") pod \"136fdcc5-9b23-442a-85e0-96129d4aed8a\" (UID: \"136fdcc5-9b23-442a-85e0-96129d4aed8a\") "
Nov 22 03:04:15 crc kubenswrapper[4922]: I1122 03:04:15.990253 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/136fdcc5-9b23-442a-85e0-96129d4aed8a-service-ca\") pod \"136fdcc5-9b23-442a-85e0-96129d4aed8a\" (UID: \"136fdcc5-9b23-442a-85e0-96129d4aed8a\") "
Nov 22 03:04:15 crc kubenswrapper[4922]: I1122 03:04:15.990319 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/136fdcc5-9b23-442a-85e0-96129d4aed8a-console-oauth-config\") pod \"136fdcc5-9b23-442a-85e0-96129d4aed8a\" (UID: \"136fdcc5-9b23-442a-85e0-96129d4aed8a\") "
Nov 22 03:04:15 crc kubenswrapper[4922]: I1122 03:04:15.990390 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkvk9\" (UniqueName: \"kubernetes.io/projected/136fdcc5-9b23-442a-85e0-96129d4aed8a-kube-api-access-jkvk9\") pod \"136fdcc5-9b23-442a-85e0-96129d4aed8a\" (UID: \"136fdcc5-9b23-442a-85e0-96129d4aed8a\") "
Nov 22 03:04:15 crc kubenswrapper[4922]: I1122 03:04:15.990450 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/136fdcc5-9b23-442a-85e0-96129d4aed8a-console-config\") pod \"136fdcc5-9b23-442a-85e0-96129d4aed8a\" (UID: \"136fdcc5-9b23-442a-85e0-96129d4aed8a\") "
Nov 22 03:04:15 crc kubenswrapper[4922]: I1122 03:04:15.990502 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/136fdcc5-9b23-442a-85e0-96129d4aed8a-console-serving-cert\") pod \"136fdcc5-9b23-442a-85e0-96129d4aed8a\" (UID: \"136fdcc5-9b23-442a-85e0-96129d4aed8a\") "
Nov 22 03:04:15 crc kubenswrapper[4922]: I1122 03:04:15.990552 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/136fdcc5-9b23-442a-85e0-96129d4aed8a-trusted-ca-bundle\") pod \"136fdcc5-9b23-442a-85e0-96129d4aed8a\" (UID: \"136fdcc5-9b23-442a-85e0-96129d4aed8a\") "
Nov 22 03:04:15 crc kubenswrapper[4922]: I1122 03:04:15.991486 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/136fdcc5-9b23-442a-85e0-96129d4aed8a-service-ca" (OuterVolumeSpecName: "service-ca") pod "136fdcc5-9b23-442a-85e0-96129d4aed8a" (UID: "136fdcc5-9b23-442a-85e0-96129d4aed8a"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 03:04:15 crc kubenswrapper[4922]: I1122 03:04:15.991624 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/136fdcc5-9b23-442a-85e0-96129d4aed8a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "136fdcc5-9b23-442a-85e0-96129d4aed8a" (UID: "136fdcc5-9b23-442a-85e0-96129d4aed8a"). InnerVolumeSpecName "trusted-ca-bundle".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:04:15 crc kubenswrapper[4922]: I1122 03:04:15.991690 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/136fdcc5-9b23-442a-85e0-96129d4aed8a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "136fdcc5-9b23-442a-85e0-96129d4aed8a" (UID: "136fdcc5-9b23-442a-85e0-96129d4aed8a"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:04:15 crc kubenswrapper[4922]: I1122 03:04:15.991970 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/136fdcc5-9b23-442a-85e0-96129d4aed8a-console-config" (OuterVolumeSpecName: "console-config") pod "136fdcc5-9b23-442a-85e0-96129d4aed8a" (UID: "136fdcc5-9b23-442a-85e0-96129d4aed8a"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:04:15 crc kubenswrapper[4922]: I1122 03:04:15.997149 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/136fdcc5-9b23-442a-85e0-96129d4aed8a-kube-api-access-jkvk9" (OuterVolumeSpecName: "kube-api-access-jkvk9") pod "136fdcc5-9b23-442a-85e0-96129d4aed8a" (UID: "136fdcc5-9b23-442a-85e0-96129d4aed8a"). InnerVolumeSpecName "kube-api-access-jkvk9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:04:15 crc kubenswrapper[4922]: I1122 03:04:15.997819 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/136fdcc5-9b23-442a-85e0-96129d4aed8a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "136fdcc5-9b23-442a-85e0-96129d4aed8a" (UID: "136fdcc5-9b23-442a-85e0-96129d4aed8a"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:04:15 crc kubenswrapper[4922]: I1122 03:04:15.998158 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/136fdcc5-9b23-442a-85e0-96129d4aed8a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "136fdcc5-9b23-442a-85e0-96129d4aed8a" (UID: "136fdcc5-9b23-442a-85e0-96129d4aed8a"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:04:16 crc kubenswrapper[4922]: I1122 03:04:16.092710 4922 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/136fdcc5-9b23-442a-85e0-96129d4aed8a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 03:04:16 crc kubenswrapper[4922]: I1122 03:04:16.092778 4922 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/136fdcc5-9b23-442a-85e0-96129d4aed8a-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 22 03:04:16 crc kubenswrapper[4922]: I1122 03:04:16.092792 4922 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/136fdcc5-9b23-442a-85e0-96129d4aed8a-service-ca\") on node \"crc\" DevicePath \"\""
Nov 22 03:04:16 crc kubenswrapper[4922]: I1122 03:04:16.092803 4922 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/136fdcc5-9b23-442a-85e0-96129d4aed8a-console-oauth-config\") on node \"crc\" DevicePath \"\""
Nov 22 03:04:16 crc kubenswrapper[4922]: I1122 03:04:16.092815 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkvk9\" (UniqueName: \"kubernetes.io/projected/136fdcc5-9b23-442a-85e0-96129d4aed8a-kube-api-access-jkvk9\") on node \"crc\" DevicePath \"\""
Nov 22 03:04:16 crc kubenswrapper[4922]: I1122 03:04:16.092888 4922 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/136fdcc5-9b23-442a-85e0-96129d4aed8a-console-config\") on node \"crc\" DevicePath \"\""
Nov 22 03:04:16 crc kubenswrapper[4922]: I1122 03:04:16.092900 4922 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/136fdcc5-9b23-442a-85e0-96129d4aed8a-console-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 22 03:04:16 crc kubenswrapper[4922]: I1122 03:04:16.465257 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-dj4jp_136fdcc5-9b23-442a-85e0-96129d4aed8a/console/0.log"
Nov 22 03:04:16 crc kubenswrapper[4922]: I1122 03:04:16.467277 4922 generic.go:334] "Generic (PLEG): container finished" podID="136fdcc5-9b23-442a-85e0-96129d4aed8a" containerID="42a8e111b5238b1a069678986c8df0f21c1376b9fbebf98ac927c5df71f28cb7" exitCode=2
Nov 22 03:04:16 crc kubenswrapper[4922]: I1122 03:04:16.467326 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-dj4jp"
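Each volume of the deleted console pod above passes through the same three stages: "UnmountVolume started" (reconciler_common.go:159), "UnmountVolume.TearDown succeeded" (operation_generator.go:803), and "Volume detached ... DevicePath \"\"" (reconciler_common.go:293). A volume that logs the first stage but never the third points at a stuck teardown. A hypothetical Go filter over a journal stream, keyed by volume name only for brevity (a real tool would also key by pod UID):

```go
// volwatch.go — illustrative only: track the teardown sequence visible above
// (UnmountVolume started → TearDown succeeded → Volume detached) and print
// any volume that never reaches the detached state.
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

var (
	// The journal text carries escaped quotes, e.g. \"oauth-serving-cert\".
	started  = regexp.MustCompile(`UnmountVolume started for volume \\"([^"\\]+)\\"`)
	detached = regexp.MustCompile(`Volume detached for volume \\"([^"\\]+)\\"`)
)

func main() {
	pending := map[string]bool{}
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines can be long
	for sc.Scan() {
		if m := started.FindStringSubmatch(sc.Text()); m != nil {
			pending[m[1]] = true
		}
		if m := detached.FindStringSubmatch(sc.Text()); m != nil {
			delete(pending, m[1])
		}
	}
	for v := range pending {
		fmt.Println("unmount started but never detached:", v)
	}
}
```

Fed the excerpt above (e.g. `journalctl -u kubelet | go run volwatch.go`), it prints nothing: all seven console volumes reach "Volume detached".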
Nov 22 03:04:16 crc kubenswrapper[4922]: I1122 03:04:16.467346 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-dj4jp" event={"ID":"136fdcc5-9b23-442a-85e0-96129d4aed8a","Type":"ContainerDied","Data":"42a8e111b5238b1a069678986c8df0f21c1376b9fbebf98ac927c5df71f28cb7"}
Nov 22 03:04:16 crc kubenswrapper[4922]: I1122 03:04:16.467749 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-dj4jp" event={"ID":"136fdcc5-9b23-442a-85e0-96129d4aed8a","Type":"ContainerDied","Data":"8e2d5811f57236b93e366312e677d44ccabdad91a99fa9faff20af0552392f53"}
Nov 22 03:04:16 crc kubenswrapper[4922]: I1122 03:04:16.467774 4922 scope.go:117] "RemoveContainer" containerID="42a8e111b5238b1a069678986c8df0f21c1376b9fbebf98ac927c5df71f28cb7"
Nov 22 03:04:16 crc kubenswrapper[4922]: I1122 03:04:16.507141 4922 scope.go:117] "RemoveContainer" containerID="42a8e111b5238b1a069678986c8df0f21c1376b9fbebf98ac927c5df71f28cb7"
Nov 22 03:04:16 crc kubenswrapper[4922]: E1122 03:04:16.507914 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42a8e111b5238b1a069678986c8df0f21c1376b9fbebf98ac927c5df71f28cb7\": container with ID starting with 42a8e111b5238b1a069678986c8df0f21c1376b9fbebf98ac927c5df71f28cb7 not found: ID does not exist" containerID="42a8e111b5238b1a069678986c8df0f21c1376b9fbebf98ac927c5df71f28cb7"
Nov 22 03:04:16 crc kubenswrapper[4922]: I1122 03:04:16.507955 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42a8e111b5238b1a069678986c8df0f21c1376b9fbebf98ac927c5df71f28cb7"} err="failed to get container status \"42a8e111b5238b1a069678986c8df0f21c1376b9fbebf98ac927c5df71f28cb7\": rpc error: code = NotFound desc = could not find container \"42a8e111b5238b1a069678986c8df0f21c1376b9fbebf98ac927c5df71f28cb7\": container with ID starting with 42a8e111b5238b1a069678986c8df0f21c1376b9fbebf98ac927c5df71f28cb7 not found: ID does not exist"
Nov 22 03:04:16 crc kubenswrapper[4922]: I1122 03:04:16.530889 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-dj4jp"]
Nov 22 03:04:16 crc kubenswrapper[4922]: I1122 03:04:16.538529 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-dj4jp"]
Nov 22 03:04:17 crc kubenswrapper[4922]: I1122 03:04:17.307724 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="136fdcc5-9b23-442a-85e0-96129d4aed8a" path="/var/lib/kubelet/pods/136fdcc5-9b23-442a-85e0-96129d4aed8a/volumes"
Nov 22 03:04:18 crc kubenswrapper[4922]: I1122 03:04:18.433218 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6m9cm4"]
Nov 22 03:04:18 crc kubenswrapper[4922]: E1122 03:04:18.433510 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="136fdcc5-9b23-442a-85e0-96129d4aed8a" containerName="console"
Nov 22 03:04:18 crc kubenswrapper[4922]: I1122 03:04:18.433526 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="136fdcc5-9b23-442a-85e0-96129d4aed8a" containerName="console"
Nov 22 03:04:18 crc kubenswrapper[4922]: I1122 03:04:18.433663 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="136fdcc5-9b23-442a-85e0-96129d4aed8a" containerName="console"
Nov 22 03:04:18 crc kubenswrapper[4922]: I1122 03:04:18.434677 4922 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6m9cm4" Nov 22 03:04:18 crc kubenswrapper[4922]: I1122 03:04:18.437054 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Nov 22 03:04:18 crc kubenswrapper[4922]: I1122 03:04:18.457041 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6m9cm4"] Nov 22 03:04:18 crc kubenswrapper[4922]: I1122 03:04:18.533130 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7dw2\" (UniqueName: \"kubernetes.io/projected/7405653e-5aef-4e96-9140-a011670ace50-kube-api-access-f7dw2\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6m9cm4\" (UID: \"7405653e-5aef-4e96-9140-a011670ace50\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6m9cm4" Nov 22 03:04:18 crc kubenswrapper[4922]: I1122 03:04:18.533220 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7405653e-5aef-4e96-9140-a011670ace50-util\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6m9cm4\" (UID: \"7405653e-5aef-4e96-9140-a011670ace50\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6m9cm4" Nov 22 03:04:18 crc kubenswrapper[4922]: I1122 03:04:18.533251 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7405653e-5aef-4e96-9140-a011670ace50-bundle\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6m9cm4\" (UID: \"7405653e-5aef-4e96-9140-a011670ace50\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6m9cm4" Nov 22 03:04:18 crc kubenswrapper[4922]: I1122 03:04:18.635560 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7405653e-5aef-4e96-9140-a011670ace50-bundle\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6m9cm4\" (UID: \"7405653e-5aef-4e96-9140-a011670ace50\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6m9cm4" Nov 22 03:04:18 crc kubenswrapper[4922]: I1122 03:04:18.635723 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7dw2\" (UniqueName: \"kubernetes.io/projected/7405653e-5aef-4e96-9140-a011670ace50-kube-api-access-f7dw2\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6m9cm4\" (UID: \"7405653e-5aef-4e96-9140-a011670ace50\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6m9cm4" Nov 22 03:04:18 crc kubenswrapper[4922]: I1122 03:04:18.635815 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7405653e-5aef-4e96-9140-a011670ace50-util\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6m9cm4\" (UID: \"7405653e-5aef-4e96-9140-a011670ace50\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6m9cm4" Nov 22 03:04:18 crc kubenswrapper[4922]: I1122 03:04:18.636174 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/7405653e-5aef-4e96-9140-a011670ace50-bundle\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6m9cm4\" (UID: \"7405653e-5aef-4e96-9140-a011670ace50\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6m9cm4" Nov 22 03:04:18 crc kubenswrapper[4922]: I1122 03:04:18.636361 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7405653e-5aef-4e96-9140-a011670ace50-util\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6m9cm4\" (UID: \"7405653e-5aef-4e96-9140-a011670ace50\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6m9cm4" Nov 22 03:04:18 crc kubenswrapper[4922]: I1122 03:04:18.669329 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7dw2\" (UniqueName: \"kubernetes.io/projected/7405653e-5aef-4e96-9140-a011670ace50-kube-api-access-f7dw2\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6m9cm4\" (UID: \"7405653e-5aef-4e96-9140-a011670ace50\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6m9cm4" Nov 22 03:04:18 crc kubenswrapper[4922]: I1122 03:04:18.751607 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6m9cm4" Nov 22 03:04:19 crc kubenswrapper[4922]: I1122 03:04:19.032229 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6m9cm4"] Nov 22 03:04:19 crc kubenswrapper[4922]: I1122 03:04:19.495035 4922 generic.go:334] "Generic (PLEG): container finished" podID="7405653e-5aef-4e96-9140-a011670ace50" containerID="9967dfe81cce90be62f397b9752904f5f794a84fa7376246b5a6fc347be5149b" exitCode=0 Nov 22 03:04:19 crc kubenswrapper[4922]: I1122 03:04:19.495151 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6m9cm4" event={"ID":"7405653e-5aef-4e96-9140-a011670ace50","Type":"ContainerDied","Data":"9967dfe81cce90be62f397b9752904f5f794a84fa7376246b5a6fc347be5149b"} Nov 22 03:04:19 crc kubenswrapper[4922]: I1122 03:04:19.497408 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6m9cm4" event={"ID":"7405653e-5aef-4e96-9140-a011670ace50","Type":"ContainerStarted","Data":"7b6be87405df72436d3ecd802772f77ece33935d6c961516c2675c8a5911a6fa"} Nov 22 03:04:21 crc kubenswrapper[4922]: I1122 03:04:21.521195 4922 generic.go:334] "Generic (PLEG): container finished" podID="7405653e-5aef-4e96-9140-a011670ace50" containerID="82b37d46b56c9704fa8a01d7edc80be871d23ea4dac5ec93cbaac21dc997d60d" exitCode=0 Nov 22 03:04:21 crc kubenswrapper[4922]: I1122 03:04:21.521377 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6m9cm4" event={"ID":"7405653e-5aef-4e96-9140-a011670ace50","Type":"ContainerDied","Data":"82b37d46b56c9704fa8a01d7edc80be871d23ea4dac5ec93cbaac21dc997d60d"} Nov 22 03:04:22 crc kubenswrapper[4922]: I1122 03:04:22.534870 4922 generic.go:334] "Generic (PLEG): container finished" podID="7405653e-5aef-4e96-9140-a011670ace50" containerID="0744d97b52871d2a6149ab232cf3ca4b6fcee5e6a2eb0a48c3ff7d9b12f1de45" exitCode=0 Nov 22 03:04:22 crc kubenswrapper[4922]: I1122 
03:04:22.534947 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6m9cm4" event={"ID":"7405653e-5aef-4e96-9140-a011670ace50","Type":"ContainerDied","Data":"0744d97b52871d2a6149ab232cf3ca4b6fcee5e6a2eb0a48c3ff7d9b12f1de45"} Nov 22 03:04:23 crc kubenswrapper[4922]: I1122 03:04:23.900481 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6m9cm4" Nov 22 03:04:24 crc kubenswrapper[4922]: I1122 03:04:24.035905 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7405653e-5aef-4e96-9140-a011670ace50-util\") pod \"7405653e-5aef-4e96-9140-a011670ace50\" (UID: \"7405653e-5aef-4e96-9140-a011670ace50\") " Nov 22 03:04:24 crc kubenswrapper[4922]: I1122 03:04:24.036250 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7405653e-5aef-4e96-9140-a011670ace50-bundle\") pod \"7405653e-5aef-4e96-9140-a011670ace50\" (UID: \"7405653e-5aef-4e96-9140-a011670ace50\") " Nov 22 03:04:24 crc kubenswrapper[4922]: I1122 03:04:24.036374 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7dw2\" (UniqueName: \"kubernetes.io/projected/7405653e-5aef-4e96-9140-a011670ace50-kube-api-access-f7dw2\") pod \"7405653e-5aef-4e96-9140-a011670ace50\" (UID: \"7405653e-5aef-4e96-9140-a011670ace50\") " Nov 22 03:04:24 crc kubenswrapper[4922]: I1122 03:04:24.037444 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7405653e-5aef-4e96-9140-a011670ace50-bundle" (OuterVolumeSpecName: "bundle") pod "7405653e-5aef-4e96-9140-a011670ace50" (UID: "7405653e-5aef-4e96-9140-a011670ace50"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:04:24 crc kubenswrapper[4922]: I1122 03:04:24.047548 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7405653e-5aef-4e96-9140-a011670ace50-kube-api-access-f7dw2" (OuterVolumeSpecName: "kube-api-access-f7dw2") pod "7405653e-5aef-4e96-9140-a011670ace50" (UID: "7405653e-5aef-4e96-9140-a011670ace50"). InnerVolumeSpecName "kube-api-access-f7dw2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:04:24 crc kubenswrapper[4922]: I1122 03:04:24.051217 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7405653e-5aef-4e96-9140-a011670ace50-util" (OuterVolumeSpecName: "util") pod "7405653e-5aef-4e96-9140-a011670ace50" (UID: "7405653e-5aef-4e96-9140-a011670ace50"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 03:04:24 crc kubenswrapper[4922]: I1122 03:04:24.139326 4922 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7405653e-5aef-4e96-9140-a011670ace50-util\") on node \"crc\" DevicePath \"\""
Nov 22 03:04:24 crc kubenswrapper[4922]: I1122 03:04:24.139370 4922 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7405653e-5aef-4e96-9140-a011670ace50-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 03:04:24 crc kubenswrapper[4922]: I1122 03:04:24.139381 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7dw2\" (UniqueName: \"kubernetes.io/projected/7405653e-5aef-4e96-9140-a011670ace50-kube-api-access-f7dw2\") on node \"crc\" DevicePath \"\""
Nov 22 03:04:24 crc kubenswrapper[4922]: I1122 03:04:24.590469 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6m9cm4" event={"ID":"7405653e-5aef-4e96-9140-a011670ace50","Type":"ContainerDied","Data":"7b6be87405df72436d3ecd802772f77ece33935d6c961516c2675c8a5911a6fa"}
Nov 22 03:04:24 crc kubenswrapper[4922]: I1122 03:04:24.591078 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b6be87405df72436d3ecd802772f77ece33935d6c961516c2675c8a5911a6fa"
Nov 22 03:04:24 crc kubenswrapper[4922]: I1122 03:04:24.590700 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6m9cm4"
Nov 22 03:04:33 crc kubenswrapper[4922]: I1122 03:04:33.809543 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7d967d77f6-659pj"]
Nov 22 03:04:33 crc kubenswrapper[4922]: E1122 03:04:33.810489 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7405653e-5aef-4e96-9140-a011670ace50" containerName="util"
Nov 22 03:04:33 crc kubenswrapper[4922]: I1122 03:04:33.810510 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="7405653e-5aef-4e96-9140-a011670ace50" containerName="util"
Nov 22 03:04:33 crc kubenswrapper[4922]: E1122 03:04:33.810531 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7405653e-5aef-4e96-9140-a011670ace50" containerName="extract"
Nov 22 03:04:33 crc kubenswrapper[4922]: I1122 03:04:33.810540 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="7405653e-5aef-4e96-9140-a011670ace50" containerName="extract"
Nov 22 03:04:33 crc kubenswrapper[4922]: E1122 03:04:33.810553 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7405653e-5aef-4e96-9140-a011670ace50" containerName="pull"
Nov 22 03:04:33 crc kubenswrapper[4922]: I1122 03:04:33.810562 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="7405653e-5aef-4e96-9140-a011670ace50" containerName="pull"
Nov 22 03:04:33 crc kubenswrapper[4922]: I1122 03:04:33.810696 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="7405653e-5aef-4e96-9140-a011670ace50" containerName="extract"
Nov 22 03:04:33 crc kubenswrapper[4922]: I1122 03:04:33.811183 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7d967d77f6-659pj"
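The openshift-marketplace pod wound down above is a short-lived bundle-unpack job: its pull, extract, and util containers each finish with exitCode=0, the volumes are unmounted and detached, and the CPU and memory managers drop their per-container state. The E-level "RemoveStaleState" lines are routine cleanup of an already-finished pod despite their error severity. The PLEG "container finished" records carry the exit codes (the console container earlier exited with exitCode=2), so a quick way to scan a log like this for unclean exits is a filter such as this hypothetical Go sketch:

```go
// exitcodes.go — sketch: surface PLEG "container finished" events whose
// exitCode is nonzero (the bundle pod above exits 0 for pull/extract/util).
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

var finished = regexp.MustCompile(`container finished" podID="([^"]+)" containerID="([^"]+)" exitCode=(-?\d+)`)

func main() {
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024)
	for sc.Scan() {
		if m := finished.FindStringSubmatch(sc.Text()); m != nil && m[3] != "0" {
			// %.12s truncates the 64-hex container ID for readability.
			fmt.Printf("pod %s container %.12s exited with code %s\n", m[1], m[2], m[3])
		}
	}
}
```

Against this excerpt it would flag only the console container (exitCode=2, killed during the rollout), not the bundle job's three clean exits.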
Nov 22 03:04:33 crc kubenswrapper[4922]: I1122 03:04:33.813904 4922 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert"
Nov 22 03:04:33 crc kubenswrapper[4922]: I1122 03:04:33.814118 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt"
Nov 22 03:04:33 crc kubenswrapper[4922]: I1122 03:04:33.817117 4922 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert"
Nov 22 03:04:33 crc kubenswrapper[4922]: I1122 03:04:33.817367 4922 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-9md2t"
Nov 22 03:04:33 crc kubenswrapper[4922]: I1122 03:04:33.817524 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt"
Nov 22 03:04:33 crc kubenswrapper[4922]: I1122 03:04:33.826266 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7d967d77f6-659pj"]
Nov 22 03:04:33 crc kubenswrapper[4922]: I1122 03:04:33.889779 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/621d00a4-49ba-4725-b00b-72e6ed7521ad-webhook-cert\") pod \"metallb-operator-controller-manager-7d967d77f6-659pj\" (UID: \"621d00a4-49ba-4725-b00b-72e6ed7521ad\") " pod="metallb-system/metallb-operator-controller-manager-7d967d77f6-659pj"
Nov 22 03:04:33 crc kubenswrapper[4922]: I1122 03:04:33.889862 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79pjl\" (UniqueName: \"kubernetes.io/projected/621d00a4-49ba-4725-b00b-72e6ed7521ad-kube-api-access-79pjl\") pod \"metallb-operator-controller-manager-7d967d77f6-659pj\" (UID: \"621d00a4-49ba-4725-b00b-72e6ed7521ad\") " pod="metallb-system/metallb-operator-controller-manager-7d967d77f6-659pj"
Nov 22 03:04:33 crc kubenswrapper[4922]: I1122 03:04:33.889944 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/621d00a4-49ba-4725-b00b-72e6ed7521ad-apiservice-cert\") pod \"metallb-operator-controller-manager-7d967d77f6-659pj\" (UID: \"621d00a4-49ba-4725-b00b-72e6ed7521ad\") " pod="metallb-system/metallb-operator-controller-manager-7d967d77f6-659pj"
Nov 22 03:04:33 crc kubenswrapper[4922]: I1122 03:04:33.990619 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/621d00a4-49ba-4725-b00b-72e6ed7521ad-apiservice-cert\") pod \"metallb-operator-controller-manager-7d967d77f6-659pj\" (UID: \"621d00a4-49ba-4725-b00b-72e6ed7521ad\") " pod="metallb-system/metallb-operator-controller-manager-7d967d77f6-659pj"
Nov 22 03:04:33 crc kubenswrapper[4922]: I1122 03:04:33.990678 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/621d00a4-49ba-4725-b00b-72e6ed7521ad-webhook-cert\") pod \"metallb-operator-controller-manager-7d967d77f6-659pj\" (UID: \"621d00a4-49ba-4725-b00b-72e6ed7521ad\") " pod="metallb-system/metallb-operator-controller-manager-7d967d77f6-659pj"
Nov 22 03:04:33 crc kubenswrapper[4922]: I1122 03:04:33.990704 4922 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79pjl\" (UniqueName: \"kubernetes.io/projected/621d00a4-49ba-4725-b00b-72e6ed7521ad-kube-api-access-79pjl\") pod \"metallb-operator-controller-manager-7d967d77f6-659pj\" (UID: \"621d00a4-49ba-4725-b00b-72e6ed7521ad\") " pod="metallb-system/metallb-operator-controller-manager-7d967d77f6-659pj" Nov 22 03:04:33 crc kubenswrapper[4922]: I1122 03:04:33.997105 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/621d00a4-49ba-4725-b00b-72e6ed7521ad-apiservice-cert\") pod \"metallb-operator-controller-manager-7d967d77f6-659pj\" (UID: \"621d00a4-49ba-4725-b00b-72e6ed7521ad\") " pod="metallb-system/metallb-operator-controller-manager-7d967d77f6-659pj" Nov 22 03:04:33 crc kubenswrapper[4922]: I1122 03:04:33.997563 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/621d00a4-49ba-4725-b00b-72e6ed7521ad-webhook-cert\") pod \"metallb-operator-controller-manager-7d967d77f6-659pj\" (UID: \"621d00a4-49ba-4725-b00b-72e6ed7521ad\") " pod="metallb-system/metallb-operator-controller-manager-7d967d77f6-659pj" Nov 22 03:04:34 crc kubenswrapper[4922]: I1122 03:04:34.016593 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79pjl\" (UniqueName: \"kubernetes.io/projected/621d00a4-49ba-4725-b00b-72e6ed7521ad-kube-api-access-79pjl\") pod \"metallb-operator-controller-manager-7d967d77f6-659pj\" (UID: \"621d00a4-49ba-4725-b00b-72e6ed7521ad\") " pod="metallb-system/metallb-operator-controller-manager-7d967d77f6-659pj" Nov 22 03:04:34 crc kubenswrapper[4922]: I1122 03:04:34.137615 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7d967d77f6-659pj" Nov 22 03:04:34 crc kubenswrapper[4922]: I1122 03:04:34.257633 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-748dcb78f6-gcbtx"] Nov 22 03:04:34 crc kubenswrapper[4922]: I1122 03:04:34.259317 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-748dcb78f6-gcbtx" Nov 22 03:04:34 crc kubenswrapper[4922]: I1122 03:04:34.262987 4922 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Nov 22 03:04:34 crc kubenswrapper[4922]: I1122 03:04:34.263164 4922 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Nov 22 03:04:34 crc kubenswrapper[4922]: I1122 03:04:34.263302 4922 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-4lftb" Nov 22 03:04:34 crc kubenswrapper[4922]: I1122 03:04:34.296996 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a7a3b01f-9dd6-43b2-8ef3-8a1443c1bfc9-webhook-cert\") pod \"metallb-operator-webhook-server-748dcb78f6-gcbtx\" (UID: \"a7a3b01f-9dd6-43b2-8ef3-8a1443c1bfc9\") " pod="metallb-system/metallb-operator-webhook-server-748dcb78f6-gcbtx" Nov 22 03:04:34 crc kubenswrapper[4922]: I1122 03:04:34.297047 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77pkq\" (UniqueName: \"kubernetes.io/projected/a7a3b01f-9dd6-43b2-8ef3-8a1443c1bfc9-kube-api-access-77pkq\") pod \"metallb-operator-webhook-server-748dcb78f6-gcbtx\" (UID: \"a7a3b01f-9dd6-43b2-8ef3-8a1443c1bfc9\") " pod="metallb-system/metallb-operator-webhook-server-748dcb78f6-gcbtx" Nov 22 03:04:34 crc kubenswrapper[4922]: I1122 03:04:34.297114 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a7a3b01f-9dd6-43b2-8ef3-8a1443c1bfc9-apiservice-cert\") pod \"metallb-operator-webhook-server-748dcb78f6-gcbtx\" (UID: \"a7a3b01f-9dd6-43b2-8ef3-8a1443c1bfc9\") " pod="metallb-system/metallb-operator-webhook-server-748dcb78f6-gcbtx" Nov 22 03:04:34 crc kubenswrapper[4922]: I1122 03:04:34.320970 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-748dcb78f6-gcbtx"] Nov 22 03:04:34 crc kubenswrapper[4922]: I1122 03:04:34.398495 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a7a3b01f-9dd6-43b2-8ef3-8a1443c1bfc9-apiservice-cert\") pod \"metallb-operator-webhook-server-748dcb78f6-gcbtx\" (UID: \"a7a3b01f-9dd6-43b2-8ef3-8a1443c1bfc9\") " pod="metallb-system/metallb-operator-webhook-server-748dcb78f6-gcbtx" Nov 22 03:04:34 crc kubenswrapper[4922]: I1122 03:04:34.398616 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a7a3b01f-9dd6-43b2-8ef3-8a1443c1bfc9-webhook-cert\") pod \"metallb-operator-webhook-server-748dcb78f6-gcbtx\" (UID: \"a7a3b01f-9dd6-43b2-8ef3-8a1443c1bfc9\") " pod="metallb-system/metallb-operator-webhook-server-748dcb78f6-gcbtx" Nov 22 03:04:34 crc kubenswrapper[4922]: I1122 03:04:34.398642 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77pkq\" (UniqueName: \"kubernetes.io/projected/a7a3b01f-9dd6-43b2-8ef3-8a1443c1bfc9-kube-api-access-77pkq\") pod \"metallb-operator-webhook-server-748dcb78f6-gcbtx\" (UID: \"a7a3b01f-9dd6-43b2-8ef3-8a1443c1bfc9\") " pod="metallb-system/metallb-operator-webhook-server-748dcb78f6-gcbtx" Nov 22 03:04:34 crc kubenswrapper[4922]: I1122 
03:04:34.410649 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a7a3b01f-9dd6-43b2-8ef3-8a1443c1bfc9-webhook-cert\") pod \"metallb-operator-webhook-server-748dcb78f6-gcbtx\" (UID: \"a7a3b01f-9dd6-43b2-8ef3-8a1443c1bfc9\") " pod="metallb-system/metallb-operator-webhook-server-748dcb78f6-gcbtx"
Nov 22 03:04:34 crc kubenswrapper[4922]: I1122 03:04:34.415258 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a7a3b01f-9dd6-43b2-8ef3-8a1443c1bfc9-apiservice-cert\") pod \"metallb-operator-webhook-server-748dcb78f6-gcbtx\" (UID: \"a7a3b01f-9dd6-43b2-8ef3-8a1443c1bfc9\") " pod="metallb-system/metallb-operator-webhook-server-748dcb78f6-gcbtx"
Nov 22 03:04:34 crc kubenswrapper[4922]: I1122 03:04:34.418209 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77pkq\" (UniqueName: \"kubernetes.io/projected/a7a3b01f-9dd6-43b2-8ef3-8a1443c1bfc9-kube-api-access-77pkq\") pod \"metallb-operator-webhook-server-748dcb78f6-gcbtx\" (UID: \"a7a3b01f-9dd6-43b2-8ef3-8a1443c1bfc9\") " pod="metallb-system/metallb-operator-webhook-server-748dcb78f6-gcbtx"
Nov 22 03:04:34 crc kubenswrapper[4922]: I1122 03:04:34.597336 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-748dcb78f6-gcbtx"
Nov 22 03:04:34 crc kubenswrapper[4922]: I1122 03:04:34.644808 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7d967d77f6-659pj"]
Nov 22 03:04:34 crc kubenswrapper[4922]: W1122 03:04:34.662989 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod621d00a4_49ba_4725_b00b_72e6ed7521ad.slice/crio-699f919d3deaad49f62cad416fd96fae6936fe397e73730a535ca4eacd46343a WatchSource:0}: Error finding container 699f919d3deaad49f62cad416fd96fae6936fe397e73730a535ca4eacd46343a: Status 404 returned error can't find the container with id 699f919d3deaad49f62cad416fd96fae6936fe397e73730a535ca4eacd46343a
Nov 22 03:04:34 crc kubenswrapper[4922]: I1122 03:04:34.866200 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-748dcb78f6-gcbtx"]
Nov 22 03:04:34 crc kubenswrapper[4922]: W1122 03:04:34.876040 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7a3b01f_9dd6_43b2_8ef3_8a1443c1bfc9.slice/crio-5677db73c057f03dab51ca9fbad55c28825ce23b29c3f35e25645170b789e089 WatchSource:0}: Error finding container 5677db73c057f03dab51ca9fbad55c28825ce23b29c3f35e25645170b789e089: Status 404 returned error can't find the container with id 5677db73c057f03dab51ca9fbad55c28825ce23b29c3f35e25645170b789e089
Nov 22 03:04:35 crc kubenswrapper[4922]: I1122 03:04:35.663318 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7d967d77f6-659pj" event={"ID":"621d00a4-49ba-4725-b00b-72e6ed7521ad","Type":"ContainerStarted","Data":"699f919d3deaad49f62cad416fd96fae6936fe397e73730a535ca4eacd46343a"}
Nov 22 03:04:35 crc kubenswrapper[4922]: I1122 03:04:35.664776 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-748dcb78f6-gcbtx" event={"ID":"a7a3b01f-9dd6-43b2-8ef3-8a1443c1bfc9","Type":"ContainerStarted","Data":"5677db73c057f03dab51ca9fbad55c28825ce23b29c3f35e25645170b789e089"}
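The two "Failed to process watch event ... Status 404" warnings above come from cAdvisor's cgroup watcher noticing a new crio-<id> cgroup before the runtime can answer a status query for that container; the "ContainerStarted" PLEG events that follow for the same IDs (699f919d... and 5677db73...) show both containers came up normally, so these warnings are startup races, not failures. A hypothetical Go check (my own approach, not kubelet tooling) that pairs each 404-warned ID with a later ContainerStarted for the same ID:

```go
// watch404.go — confirm that each "can't find the container with id <ID>"
// warning is followed by a PLEG ContainerStarted event for the same ID,
// i.e. the 404 was only a startup race.
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

var (
	warned  = regexp.MustCompile(`can't find the container with id ([0-9a-f]{64})`)
	started = regexp.MustCompile(`"ContainerStarted","Data":"([0-9a-f]{64})"`)
)

func main() {
	pending := map[string]bool{}
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024)
	for sc.Scan() {
		if m := warned.FindStringSubmatch(sc.Text()); m != nil {
			pending[m[1]] = true
		}
		if m := started.FindStringSubmatch(sc.Text()); m != nil && pending[m[1]] {
			delete(pending, m[1]) // the warned container did come up
		}
	}
	for id := range pending {
		fmt.Println("404-warned container never reported started:", id)
	}
}
```

On this excerpt the pending map empties out: every 404-warned ID later appears in a ContainerStarted event.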
Nov 22 03:04:40 crc kubenswrapper[4922]: I1122 03:04:40.708813 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-748dcb78f6-gcbtx" event={"ID":"a7a3b01f-9dd6-43b2-8ef3-8a1443c1bfc9","Type":"ContainerStarted","Data":"0efc16d19aec65f2b3f2125e572b8535d215964574d23098ab121817c419d8ca"}
Nov 22 03:04:40 crc kubenswrapper[4922]: I1122 03:04:40.709398 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-748dcb78f6-gcbtx"
Nov 22 03:04:40 crc kubenswrapper[4922]: I1122 03:04:40.711417 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7d967d77f6-659pj" event={"ID":"621d00a4-49ba-4725-b00b-72e6ed7521ad","Type":"ContainerStarted","Data":"052a6572fcfbb4418edad4d90d271264d4c14d588e10335ec9acda0abdfc9df9"}
Nov 22 03:04:40 crc kubenswrapper[4922]: I1122 03:04:40.711672 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7d967d77f6-659pj"
Nov 22 03:04:40 crc kubenswrapper[4922]: I1122 03:04:40.766097 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7d967d77f6-659pj" podStartSLOduration=2.840877567 podStartE2EDuration="7.766075387s" podCreationTimestamp="2025-11-22 03:04:33 +0000 UTC" firstStartedPulling="2025-11-22 03:04:34.666462307 +0000 UTC m=+710.704984199" lastFinishedPulling="2025-11-22 03:04:39.591660117 +0000 UTC m=+715.630182019" observedRunningTime="2025-11-22 03:04:40.7629246 +0000 UTC m=+716.801446492" watchObservedRunningTime="2025-11-22 03:04:40.766075387 +0000 UTC m=+716.804597269"
Nov 22 03:04:40 crc kubenswrapper[4922]: I1122 03:04:40.767352 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-748dcb78f6-gcbtx" podStartSLOduration=2.019543203 podStartE2EDuration="6.767347256s" podCreationTimestamp="2025-11-22 03:04:34 +0000 UTC" firstStartedPulling="2025-11-22 03:04:34.881462568 +0000 UTC m=+710.919984470" lastFinishedPulling="2025-11-22 03:04:39.629266631 +0000 UTC m=+715.667788523" observedRunningTime="2025-11-22 03:04:40.744276682 +0000 UTC m=+716.782798574" watchObservedRunningTime="2025-11-22 03:04:40.767347256 +0000 UTC m=+716.805869148"
Nov 22 03:04:41 crc kubenswrapper[4922]: I1122 03:04:41.109585 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 03:04:41 crc kubenswrapper[4922]: I1122 03:04:41.109657 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 03:04:54 crc kubenswrapper[4922]: I1122 03:04:54.602732 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-748dcb78f6-gcbtx"
Nov 22 03:05:11 crc kubenswrapper[4922]: I1122 
03:05:11.110363 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 03:05:11 crc kubenswrapper[4922]: I1122 03:05:11.111314 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 03:05:11 crc kubenswrapper[4922]: I1122 03:05:11.111405 4922 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n"
Nov 22 03:05:11 crc kubenswrapper[4922]: I1122 03:05:11.112381 4922 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3adf25f358b8b1181f3ac3e402fdb1299c491a8f833369cdc996bbafe94e841c"} pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 22 03:05:11 crc kubenswrapper[4922]: I1122 03:05:11.112481 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" containerID="cri-o://3adf25f358b8b1181f3ac3e402fdb1299c491a8f833369cdc996bbafe94e841c" gracePeriod=600
Nov 22 03:05:11 crc kubenswrapper[4922]: I1122 03:05:11.928810 4922 generic.go:334] "Generic (PLEG): container finished" podID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerID="3adf25f358b8b1181f3ac3e402fdb1299c491a8f833369cdc996bbafe94e841c" exitCode=0
Nov 22 03:05:11 crc kubenswrapper[4922]: I1122 03:05:11.928912 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" event={"ID":"402683b1-a29f-4a79-a36c-daf6e8068d0d","Type":"ContainerDied","Data":"3adf25f358b8b1181f3ac3e402fdb1299c491a8f833369cdc996bbafe94e841c"}
Nov 22 03:05:11 crc kubenswrapper[4922]: I1122 03:05:11.929609 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" event={"ID":"402683b1-a29f-4a79-a36c-daf6e8068d0d","Type":"ContainerStarted","Data":"e544a691b867a653faf01710d7aaa2e43b953be45a75dda707c262af6da21812"}
Nov 22 03:05:11 crc kubenswrapper[4922]: I1122 03:05:11.929840 4922 scope.go:117] "RemoveContainer" containerID="04bb324b7ce09f87535d27cd3f61d191662bc844fa3e27e67f963dd00d9c92fb"
Nov 22 03:05:14 crc kubenswrapper[4922]: I1122 03:05:14.143569 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7d967d77f6-659pj"
Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.133038 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-6998585d5-8zlpf"]
Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.134299 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-6998585d5-8zlpf"
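The machine-config-daemon sequence above is a complete liveness-restart cycle: the probe GET to http://127.0.0.1:8798/health is refused, the probe status flips to unhealthy, the kubelet records "failed liveness probe, will be restarted", kills the container with its 600-second grace period, and the PLEG then reports ContainerDied (exitCode=0) followed by ContainerStarted for the replacement. Mechanically the probe is just an HTTP GET that treats transport errors and any status outside 200-399 as failure; a self-contained Go approximation (the one-second timeout is an assumption, not taken from the log):

```go
// probe.go — a rough stand-in for the HTTP liveness check failing above:
// GET http://127.0.0.1:8798/health, treating any transport error or a
// status outside 200-399 as unhealthy, mirroring kubelet's success range.
package main

import (
	"fmt"
	"net/http"
	"time"
)

func main() {
	client := &http.Client{Timeout: time.Second}
	resp, err := client.Get("http://127.0.0.1:8798/health")
	if err != nil {
		fmt.Println("probe failure:", err) // e.g. "connect: connection refused"
		return
	}
	defer resp.Body.Close()
	if resp.StatusCode < 200 || resp.StatusCode >= 400 {
		fmt.Println("probe failure: status", resp.Status)
		return
	}
	fmt.Println("probe success:", resp.Status)
}
```

Run on the node while the daemon's health endpoint is down, this prints the same "connection refused" error the prober logs; a restart only happens after the probe's failure threshold is crossed, which is why the repeated failures at 03:04:11 and 03:04:41 precede the kill at 03:05:11.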
Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.135899 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-bjqlk"]
Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.137882 4922 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert"
Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.138283 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-bjqlk"
Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.138414 4922 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-55gv9"
Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.141362 4922 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret"
Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.141393 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup"
Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.158343 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-6998585d5-8zlpf"]
Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.177132 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/85602553-aa6e-40ba-b92b-96b851a002ca-metrics-certs\") pod \"frr-k8s-bjqlk\" (UID: \"85602553-aa6e-40ba-b92b-96b851a002ca\") " pod="metallb-system/frr-k8s-bjqlk"
Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.177198 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/85602553-aa6e-40ba-b92b-96b851a002ca-frr-sockets\") pod \"frr-k8s-bjqlk\" (UID: \"85602553-aa6e-40ba-b92b-96b851a002ca\") " pod="metallb-system/frr-k8s-bjqlk"
Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.177265 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/85602553-aa6e-40ba-b92b-96b851a002ca-metrics\") pod \"frr-k8s-bjqlk\" (UID: \"85602553-aa6e-40ba-b92b-96b851a002ca\") " pod="metallb-system/frr-k8s-bjqlk"
Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.177362 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f06b9437-16f5-486b-88a2-1c475e99e21a-cert\") pod \"frr-k8s-webhook-server-6998585d5-8zlpf\" (UID: \"f06b9437-16f5-486b-88a2-1c475e99e21a\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-8zlpf"
Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.177428 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvgp7\" (UniqueName: \"kubernetes.io/projected/85602553-aa6e-40ba-b92b-96b851a002ca-kube-api-access-hvgp7\") pod \"frr-k8s-bjqlk\" (UID: \"85602553-aa6e-40ba-b92b-96b851a002ca\") " pod="metallb-system/frr-k8s-bjqlk"
Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.177465 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnq9c\" (UniqueName: \"kubernetes.io/projected/f06b9437-16f5-486b-88a2-1c475e99e21a-kube-api-access-bnq9c\") pod \"frr-k8s-webhook-server-6998585d5-8zlpf\" (UID: 
\"f06b9437-16f5-486b-88a2-1c475e99e21a\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-8zlpf" Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.177535 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/85602553-aa6e-40ba-b92b-96b851a002ca-reloader\") pod \"frr-k8s-bjqlk\" (UID: \"85602553-aa6e-40ba-b92b-96b851a002ca\") " pod="metallb-system/frr-k8s-bjqlk" Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.177582 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/85602553-aa6e-40ba-b92b-96b851a002ca-frr-startup\") pod \"frr-k8s-bjqlk\" (UID: \"85602553-aa6e-40ba-b92b-96b851a002ca\") " pod="metallb-system/frr-k8s-bjqlk" Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.177634 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/85602553-aa6e-40ba-b92b-96b851a002ca-frr-conf\") pod \"frr-k8s-bjqlk\" (UID: \"85602553-aa6e-40ba-b92b-96b851a002ca\") " pod="metallb-system/frr-k8s-bjqlk" Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.278511 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/85602553-aa6e-40ba-b92b-96b851a002ca-metrics\") pod \"frr-k8s-bjqlk\" (UID: \"85602553-aa6e-40ba-b92b-96b851a002ca\") " pod="metallb-system/frr-k8s-bjqlk" Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.278822 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f06b9437-16f5-486b-88a2-1c475e99e21a-cert\") pod \"frr-k8s-webhook-server-6998585d5-8zlpf\" (UID: \"f06b9437-16f5-486b-88a2-1c475e99e21a\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-8zlpf" Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.278863 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvgp7\" (UniqueName: \"kubernetes.io/projected/85602553-aa6e-40ba-b92b-96b851a002ca-kube-api-access-hvgp7\") pod \"frr-k8s-bjqlk\" (UID: \"85602553-aa6e-40ba-b92b-96b851a002ca\") " pod="metallb-system/frr-k8s-bjqlk" Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.278887 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnq9c\" (UniqueName: \"kubernetes.io/projected/f06b9437-16f5-486b-88a2-1c475e99e21a-kube-api-access-bnq9c\") pod \"frr-k8s-webhook-server-6998585d5-8zlpf\" (UID: \"f06b9437-16f5-486b-88a2-1c475e99e21a\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-8zlpf" Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.278912 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/85602553-aa6e-40ba-b92b-96b851a002ca-reloader\") pod \"frr-k8s-bjqlk\" (UID: \"85602553-aa6e-40ba-b92b-96b851a002ca\") " pod="metallb-system/frr-k8s-bjqlk" Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.278937 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/85602553-aa6e-40ba-b92b-96b851a002ca-frr-startup\") pod \"frr-k8s-bjqlk\" (UID: \"85602553-aa6e-40ba-b92b-96b851a002ca\") " pod="metallb-system/frr-k8s-bjqlk" Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.278954 4922 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/85602553-aa6e-40ba-b92b-96b851a002ca-frr-conf\") pod \"frr-k8s-bjqlk\" (UID: \"85602553-aa6e-40ba-b92b-96b851a002ca\") " pod="metallb-system/frr-k8s-bjqlk" Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.279006 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/85602553-aa6e-40ba-b92b-96b851a002ca-metrics-certs\") pod \"frr-k8s-bjqlk\" (UID: \"85602553-aa6e-40ba-b92b-96b851a002ca\") " pod="metallb-system/frr-k8s-bjqlk" Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.279022 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/85602553-aa6e-40ba-b92b-96b851a002ca-frr-sockets\") pod \"frr-k8s-bjqlk\" (UID: \"85602553-aa6e-40ba-b92b-96b851a002ca\") " pod="metallb-system/frr-k8s-bjqlk" Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.279412 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/85602553-aa6e-40ba-b92b-96b851a002ca-metrics\") pod \"frr-k8s-bjqlk\" (UID: \"85602553-aa6e-40ba-b92b-96b851a002ca\") " pod="metallb-system/frr-k8s-bjqlk" Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.279568 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/85602553-aa6e-40ba-b92b-96b851a002ca-reloader\") pod \"frr-k8s-bjqlk\" (UID: \"85602553-aa6e-40ba-b92b-96b851a002ca\") " pod="metallb-system/frr-k8s-bjqlk" Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.279624 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/85602553-aa6e-40ba-b92b-96b851a002ca-frr-conf\") pod \"frr-k8s-bjqlk\" (UID: \"85602553-aa6e-40ba-b92b-96b851a002ca\") " pod="metallb-system/frr-k8s-bjqlk" Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.280109 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/85602553-aa6e-40ba-b92b-96b851a002ca-frr-sockets\") pod \"frr-k8s-bjqlk\" (UID: \"85602553-aa6e-40ba-b92b-96b851a002ca\") " pod="metallb-system/frr-k8s-bjqlk" Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.280238 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/85602553-aa6e-40ba-b92b-96b851a002ca-frr-startup\") pod \"frr-k8s-bjqlk\" (UID: \"85602553-aa6e-40ba-b92b-96b851a002ca\") " pod="metallb-system/frr-k8s-bjqlk" Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.291707 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/85602553-aa6e-40ba-b92b-96b851a002ca-metrics-certs\") pod \"frr-k8s-bjqlk\" (UID: \"85602553-aa6e-40ba-b92b-96b851a002ca\") " pod="metallb-system/frr-k8s-bjqlk" Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.293427 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f06b9437-16f5-486b-88a2-1c475e99e21a-cert\") pod \"frr-k8s-webhook-server-6998585d5-8zlpf\" (UID: \"f06b9437-16f5-486b-88a2-1c475e99e21a\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-8zlpf" Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.301801 4922 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnq9c\" (UniqueName: \"kubernetes.io/projected/f06b9437-16f5-486b-88a2-1c475e99e21a-kube-api-access-bnq9c\") pod \"frr-k8s-webhook-server-6998585d5-8zlpf\" (UID: \"f06b9437-16f5-486b-88a2-1c475e99e21a\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-8zlpf" Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.307385 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvgp7\" (UniqueName: \"kubernetes.io/projected/85602553-aa6e-40ba-b92b-96b851a002ca-kube-api-access-hvgp7\") pod \"frr-k8s-bjqlk\" (UID: \"85602553-aa6e-40ba-b92b-96b851a002ca\") " pod="metallb-system/frr-k8s-bjqlk" Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.310010 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-tl5mq"] Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.311359 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-tl5mq" Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.313267 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.313689 4922 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.313858 4922 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.314007 4922 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-zwwxh" Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.346968 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-6c7b4b5f48-h2kgl"] Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.347877 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6c7b4b5f48-h2kgl" Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.349644 4922 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.377865 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6c7b4b5f48-h2kgl"] Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.380212 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/513892d3-9d82-48cf-911c-857d8c2a8a95-cert\") pod \"controller-6c7b4b5f48-h2kgl\" (UID: \"513892d3-9d82-48cf-911c-857d8c2a8a95\") " pod="metallb-system/controller-6c7b4b5f48-h2kgl" Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.380246 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f607a020-54f7-4888-8a09-6caede7a160c-memberlist\") pod \"speaker-tl5mq\" (UID: \"f607a020-54f7-4888-8a09-6caede7a160c\") " pod="metallb-system/speaker-tl5mq" Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.380282 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq4p7\" (UniqueName: \"kubernetes.io/projected/513892d3-9d82-48cf-911c-857d8c2a8a95-kube-api-access-tq4p7\") pod \"controller-6c7b4b5f48-h2kgl\" (UID: \"513892d3-9d82-48cf-911c-857d8c2a8a95\") " pod="metallb-system/controller-6c7b4b5f48-h2kgl" Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.380301 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f607a020-54f7-4888-8a09-6caede7a160c-metrics-certs\") pod \"speaker-tl5mq\" (UID: \"f607a020-54f7-4888-8a09-6caede7a160c\") " pod="metallb-system/speaker-tl5mq" Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.380344 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/513892d3-9d82-48cf-911c-857d8c2a8a95-metrics-certs\") pod \"controller-6c7b4b5f48-h2kgl\" (UID: \"513892d3-9d82-48cf-911c-857d8c2a8a95\") " pod="metallb-system/controller-6c7b4b5f48-h2kgl" Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.380396 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/f607a020-54f7-4888-8a09-6caede7a160c-metallb-excludel2\") pod \"speaker-tl5mq\" (UID: \"f607a020-54f7-4888-8a09-6caede7a160c\") " pod="metallb-system/speaker-tl5mq" Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.380416 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2w7t\" (UniqueName: \"kubernetes.io/projected/f607a020-54f7-4888-8a09-6caede7a160c-kube-api-access-h2w7t\") pod \"speaker-tl5mq\" (UID: \"f607a020-54f7-4888-8a09-6caede7a160c\") " pod="metallb-system/speaker-tl5mq" Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.453618 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-6998585d5-8zlpf" Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.463240 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-bjqlk" Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.481807 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/f607a020-54f7-4888-8a09-6caede7a160c-metallb-excludel2\") pod \"speaker-tl5mq\" (UID: \"f607a020-54f7-4888-8a09-6caede7a160c\") " pod="metallb-system/speaker-tl5mq" Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.481876 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2w7t\" (UniqueName: \"kubernetes.io/projected/f607a020-54f7-4888-8a09-6caede7a160c-kube-api-access-h2w7t\") pod \"speaker-tl5mq\" (UID: \"f607a020-54f7-4888-8a09-6caede7a160c\") " pod="metallb-system/speaker-tl5mq" Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.481928 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/513892d3-9d82-48cf-911c-857d8c2a8a95-cert\") pod \"controller-6c7b4b5f48-h2kgl\" (UID: \"513892d3-9d82-48cf-911c-857d8c2a8a95\") " pod="metallb-system/controller-6c7b4b5f48-h2kgl" Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.481961 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f607a020-54f7-4888-8a09-6caede7a160c-memberlist\") pod \"speaker-tl5mq\" (UID: \"f607a020-54f7-4888-8a09-6caede7a160c\") " pod="metallb-system/speaker-tl5mq" Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.481996 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tq4p7\" (UniqueName: \"kubernetes.io/projected/513892d3-9d82-48cf-911c-857d8c2a8a95-kube-api-access-tq4p7\") pod \"controller-6c7b4b5f48-h2kgl\" (UID: \"513892d3-9d82-48cf-911c-857d8c2a8a95\") " pod="metallb-system/controller-6c7b4b5f48-h2kgl" Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.482019 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f607a020-54f7-4888-8a09-6caede7a160c-metrics-certs\") pod \"speaker-tl5mq\" (UID: \"f607a020-54f7-4888-8a09-6caede7a160c\") " pod="metallb-system/speaker-tl5mq" Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.482064 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/513892d3-9d82-48cf-911c-857d8c2a8a95-metrics-certs\") pod \"controller-6c7b4b5f48-h2kgl\" (UID: \"513892d3-9d82-48cf-911c-857d8c2a8a95\") " pod="metallb-system/controller-6c7b4b5f48-h2kgl" Nov 22 03:05:15 crc kubenswrapper[4922]: E1122 03:05:15.482256 4922 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Nov 22 03:05:15 crc kubenswrapper[4922]: E1122 03:05:15.482340 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/513892d3-9d82-48cf-911c-857d8c2a8a95-metrics-certs podName:513892d3-9d82-48cf-911c-857d8c2a8a95 nodeName:}" failed. No retries permitted until 2025-11-22 03:05:15.982311847 +0000 UTC m=+752.020833739 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/513892d3-9d82-48cf-911c-857d8c2a8a95-metrics-certs") pod "controller-6c7b4b5f48-h2kgl" (UID: "513892d3-9d82-48cf-911c-857d8c2a8a95") : secret "controller-certs-secret" not found Nov 22 03:05:15 crc kubenswrapper[4922]: E1122 03:05:15.482987 4922 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Nov 22 03:05:15 crc kubenswrapper[4922]: E1122 03:05:15.483115 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f607a020-54f7-4888-8a09-6caede7a160c-memberlist podName:f607a020-54f7-4888-8a09-6caede7a160c nodeName:}" failed. No retries permitted until 2025-11-22 03:05:15.983087136 +0000 UTC m=+752.021609028 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/f607a020-54f7-4888-8a09-6caede7a160c-memberlist") pod "speaker-tl5mq" (UID: "f607a020-54f7-4888-8a09-6caede7a160c") : secret "metallb-memberlist" not found Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.483577 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/f607a020-54f7-4888-8a09-6caede7a160c-metallb-excludel2\") pod \"speaker-tl5mq\" (UID: \"f607a020-54f7-4888-8a09-6caede7a160c\") " pod="metallb-system/speaker-tl5mq" Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.486093 4922 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.491542 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f607a020-54f7-4888-8a09-6caede7a160c-metrics-certs\") pod \"speaker-tl5mq\" (UID: \"f607a020-54f7-4888-8a09-6caede7a160c\") " pod="metallb-system/speaker-tl5mq" Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.500486 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq4p7\" (UniqueName: \"kubernetes.io/projected/513892d3-9d82-48cf-911c-857d8c2a8a95-kube-api-access-tq4p7\") pod \"controller-6c7b4b5f48-h2kgl\" (UID: \"513892d3-9d82-48cf-911c-857d8c2a8a95\") " pod="metallb-system/controller-6c7b4b5f48-h2kgl" Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.501252 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/513892d3-9d82-48cf-911c-857d8c2a8a95-cert\") pod \"controller-6c7b4b5f48-h2kgl\" (UID: \"513892d3-9d82-48cf-911c-857d8c2a8a95\") " pod="metallb-system/controller-6c7b4b5f48-h2kgl" Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.503952 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2w7t\" (UniqueName: \"kubernetes.io/projected/f607a020-54f7-4888-8a09-6caede7a160c-kube-api-access-h2w7t\") pod \"speaker-tl5mq\" (UID: \"f607a020-54f7-4888-8a09-6caede7a160c\") " pod="metallb-system/speaker-tl5mq" Nov 22 03:05:15 crc kubenswrapper[4922]: W1122 03:05:15.777519 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf06b9437_16f5_486b_88a2_1c475e99e21a.slice/crio-65c6a3d00c089b5403f018cef007d5c8a52836afd98be4cf3bdbfe5af198c370 WatchSource:0}: Error finding container 65c6a3d00c089b5403f018cef007d5c8a52836afd98be4cf3bdbfe5af198c370: Status 404 returned error can't find the container with 
id 65c6a3d00c089b5403f018cef007d5c8a52836afd98be4cf3bdbfe5af198c370 Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.788166 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-6998585d5-8zlpf"] Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.964371 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bjqlk" event={"ID":"85602553-aa6e-40ba-b92b-96b851a002ca","Type":"ContainerStarted","Data":"e41a5067c1701207cbb6f33c1184556db1b599effd1adb80a36914acf7447ceb"} Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.966250 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-6998585d5-8zlpf" event={"ID":"f06b9437-16f5-486b-88a2-1c475e99e21a","Type":"ContainerStarted","Data":"65c6a3d00c089b5403f018cef007d5c8a52836afd98be4cf3bdbfe5af198c370"} Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.989462 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f607a020-54f7-4888-8a09-6caede7a160c-memberlist\") pod \"speaker-tl5mq\" (UID: \"f607a020-54f7-4888-8a09-6caede7a160c\") " pod="metallb-system/speaker-tl5mq" Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.990504 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/513892d3-9d82-48cf-911c-857d8c2a8a95-metrics-certs\") pod \"controller-6c7b4b5f48-h2kgl\" (UID: \"513892d3-9d82-48cf-911c-857d8c2a8a95\") " pod="metallb-system/controller-6c7b4b5f48-h2kgl" Nov 22 03:05:15 crc kubenswrapper[4922]: E1122 03:05:15.989780 4922 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Nov 22 03:05:15 crc kubenswrapper[4922]: E1122 03:05:15.991110 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f607a020-54f7-4888-8a09-6caede7a160c-memberlist podName:f607a020-54f7-4888-8a09-6caede7a160c nodeName:}" failed. No retries permitted until 2025-11-22 03:05:16.991072481 +0000 UTC m=+753.029594373 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/f607a020-54f7-4888-8a09-6caede7a160c-memberlist") pod "speaker-tl5mq" (UID: "f607a020-54f7-4888-8a09-6caede7a160c") : secret "metallb-memberlist" not found Nov 22 03:05:15 crc kubenswrapper[4922]: I1122 03:05:15.999807 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/513892d3-9d82-48cf-911c-857d8c2a8a95-metrics-certs\") pod \"controller-6c7b4b5f48-h2kgl\" (UID: \"513892d3-9d82-48cf-911c-857d8c2a8a95\") " pod="metallb-system/controller-6c7b4b5f48-h2kgl" Nov 22 03:05:16 crc kubenswrapper[4922]: I1122 03:05:16.218997 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-gsfk7"] Nov 22 03:05:16 crc kubenswrapper[4922]: I1122 03:05:16.219673 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-gsfk7" podUID="9507632c-3232-44dc-a75f-1275a2f57145" containerName="controller-manager" containerID="cri-o://18783c8d841393fb9721bbcccf0299b510b29642c78facd37d6c516cab21b1f6" gracePeriod=30 Nov 22 03:05:16 crc kubenswrapper[4922]: I1122 03:05:16.261438 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6c7b4b5f48-h2kgl" Nov 22 03:05:16 crc kubenswrapper[4922]: I1122 03:05:16.327037 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ksllc"] Nov 22 03:05:16 crc kubenswrapper[4922]: I1122 03:05:16.327787 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ksllc" podUID="151b3d80-db1e-4af2-aa89-b38d9cfe8bea" containerName="route-controller-manager" containerID="cri-o://84459f42aba76e51fd9522465254ff5e0b701eb8278c825585d2ee3189401e89" gracePeriod=30 Nov 22 03:05:16 crc kubenswrapper[4922]: I1122 03:05:16.679311 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6c7b4b5f48-h2kgl"] Nov 22 03:05:16 crc kubenswrapper[4922]: I1122 03:05:16.757431 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-gsfk7" Nov 22 03:05:16 crc kubenswrapper[4922]: I1122 03:05:16.803141 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbj65\" (UniqueName: \"kubernetes.io/projected/9507632c-3232-44dc-a75f-1275a2f57145-kube-api-access-fbj65\") pod \"9507632c-3232-44dc-a75f-1275a2f57145\" (UID: \"9507632c-3232-44dc-a75f-1275a2f57145\") " Nov 22 03:05:16 crc kubenswrapper[4922]: I1122 03:05:16.803185 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9507632c-3232-44dc-a75f-1275a2f57145-proxy-ca-bundles\") pod \"9507632c-3232-44dc-a75f-1275a2f57145\" (UID: \"9507632c-3232-44dc-a75f-1275a2f57145\") " Nov 22 03:05:16 crc kubenswrapper[4922]: I1122 03:05:16.803225 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9507632c-3232-44dc-a75f-1275a2f57145-client-ca\") pod \"9507632c-3232-44dc-a75f-1275a2f57145\" (UID: \"9507632c-3232-44dc-a75f-1275a2f57145\") " Nov 22 03:05:16 crc kubenswrapper[4922]: I1122 03:05:16.803249 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9507632c-3232-44dc-a75f-1275a2f57145-config\") pod \"9507632c-3232-44dc-a75f-1275a2f57145\" (UID: \"9507632c-3232-44dc-a75f-1275a2f57145\") " Nov 22 03:05:16 crc kubenswrapper[4922]: I1122 03:05:16.803281 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9507632c-3232-44dc-a75f-1275a2f57145-serving-cert\") pod \"9507632c-3232-44dc-a75f-1275a2f57145\" (UID: \"9507632c-3232-44dc-a75f-1275a2f57145\") " Nov 22 03:05:16 crc kubenswrapper[4922]: I1122 03:05:16.807713 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9507632c-3232-44dc-a75f-1275a2f57145-config" (OuterVolumeSpecName: "config") pod "9507632c-3232-44dc-a75f-1275a2f57145" (UID: "9507632c-3232-44dc-a75f-1275a2f57145"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:05:16 crc kubenswrapper[4922]: I1122 03:05:16.808543 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9507632c-3232-44dc-a75f-1275a2f57145-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "9507632c-3232-44dc-a75f-1275a2f57145" (UID: "9507632c-3232-44dc-a75f-1275a2f57145"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:05:16 crc kubenswrapper[4922]: I1122 03:05:16.809275 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9507632c-3232-44dc-a75f-1275a2f57145-client-ca" (OuterVolumeSpecName: "client-ca") pod "9507632c-3232-44dc-a75f-1275a2f57145" (UID: "9507632c-3232-44dc-a75f-1275a2f57145"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:05:16 crc kubenswrapper[4922]: I1122 03:05:16.813310 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9507632c-3232-44dc-a75f-1275a2f57145-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9507632c-3232-44dc-a75f-1275a2f57145" (UID: "9507632c-3232-44dc-a75f-1275a2f57145"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:05:16 crc kubenswrapper[4922]: I1122 03:05:16.826352 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9507632c-3232-44dc-a75f-1275a2f57145-kube-api-access-fbj65" (OuterVolumeSpecName: "kube-api-access-fbj65") pod "9507632c-3232-44dc-a75f-1275a2f57145" (UID: "9507632c-3232-44dc-a75f-1275a2f57145"). InnerVolumeSpecName "kube-api-access-fbj65". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:05:16 crc kubenswrapper[4922]: I1122 03:05:16.906208 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9507632c-3232-44dc-a75f-1275a2f57145-config\") on node \"crc\" DevicePath \"\"" Nov 22 03:05:16 crc kubenswrapper[4922]: I1122 03:05:16.906238 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9507632c-3232-44dc-a75f-1275a2f57145-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 03:05:16 crc kubenswrapper[4922]: I1122 03:05:16.906248 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbj65\" (UniqueName: \"kubernetes.io/projected/9507632c-3232-44dc-a75f-1275a2f57145-kube-api-access-fbj65\") on node \"crc\" DevicePath \"\"" Nov 22 03:05:16 crc kubenswrapper[4922]: I1122 03:05:16.906260 4922 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9507632c-3232-44dc-a75f-1275a2f57145-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 22 03:05:16 crc kubenswrapper[4922]: I1122 03:05:16.906270 4922 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9507632c-3232-44dc-a75f-1275a2f57145-client-ca\") on node \"crc\" DevicePath \"\"" Nov 22 03:05:16 crc kubenswrapper[4922]: I1122 03:05:16.952276 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ksllc" Nov 22 03:05:17 crc kubenswrapper[4922]: I1122 03:05:17.008264 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/151b3d80-db1e-4af2-aa89-b38d9cfe8bea-serving-cert\") pod \"151b3d80-db1e-4af2-aa89-b38d9cfe8bea\" (UID: \"151b3d80-db1e-4af2-aa89-b38d9cfe8bea\") " Nov 22 03:05:17 crc kubenswrapper[4922]: I1122 03:05:17.008338 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/151b3d80-db1e-4af2-aa89-b38d9cfe8bea-client-ca\") pod \"151b3d80-db1e-4af2-aa89-b38d9cfe8bea\" (UID: \"151b3d80-db1e-4af2-aa89-b38d9cfe8bea\") " Nov 22 03:05:17 crc kubenswrapper[4922]: I1122 03:05:17.008449 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mqkh\" (UniqueName: \"kubernetes.io/projected/151b3d80-db1e-4af2-aa89-b38d9cfe8bea-kube-api-access-2mqkh\") pod \"151b3d80-db1e-4af2-aa89-b38d9cfe8bea\" (UID: \"151b3d80-db1e-4af2-aa89-b38d9cfe8bea\") " Nov 22 03:05:17 crc kubenswrapper[4922]: I1122 03:05:17.008469 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/151b3d80-db1e-4af2-aa89-b38d9cfe8bea-config\") pod \"151b3d80-db1e-4af2-aa89-b38d9cfe8bea\" (UID: \"151b3d80-db1e-4af2-aa89-b38d9cfe8bea\") " Nov 22 03:05:17 crc kubenswrapper[4922]: I1122 03:05:17.008655 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f607a020-54f7-4888-8a09-6caede7a160c-memberlist\") pod \"speaker-tl5mq\" (UID: \"f607a020-54f7-4888-8a09-6caede7a160c\") " pod="metallb-system/speaker-tl5mq" Nov 22 03:05:17 crc kubenswrapper[4922]: E1122 03:05:17.008803 4922 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Nov 22 03:05:17 crc kubenswrapper[4922]: E1122 03:05:17.008876 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f607a020-54f7-4888-8a09-6caede7a160c-memberlist podName:f607a020-54f7-4888-8a09-6caede7a160c nodeName:}" failed. No retries permitted until 2025-11-22 03:05:19.008861323 +0000 UTC m=+755.047383215 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/f607a020-54f7-4888-8a09-6caede7a160c-memberlist") pod "speaker-tl5mq" (UID: "f607a020-54f7-4888-8a09-6caede7a160c") : secret "metallb-memberlist" not found Nov 22 03:05:17 crc kubenswrapper[4922]: I1122 03:05:17.011343 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/151b3d80-db1e-4af2-aa89-b38d9cfe8bea-client-ca" (OuterVolumeSpecName: "client-ca") pod "151b3d80-db1e-4af2-aa89-b38d9cfe8bea" (UID: "151b3d80-db1e-4af2-aa89-b38d9cfe8bea"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:05:17 crc kubenswrapper[4922]: I1122 03:05:17.011713 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/151b3d80-db1e-4af2-aa89-b38d9cfe8bea-config" (OuterVolumeSpecName: "config") pod "151b3d80-db1e-4af2-aa89-b38d9cfe8bea" (UID: "151b3d80-db1e-4af2-aa89-b38d9cfe8bea"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:05:17 crc kubenswrapper[4922]: I1122 03:05:17.013453 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6c7b4b5f48-h2kgl" event={"ID":"513892d3-9d82-48cf-911c-857d8c2a8a95","Type":"ContainerStarted","Data":"b6424a13f18c36801b68aaa4b45074a19e21031768b97da5757d0bf99b280779"} Nov 22 03:05:17 crc kubenswrapper[4922]: I1122 03:05:17.030240 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/151b3d80-db1e-4af2-aa89-b38d9cfe8bea-kube-api-access-2mqkh" (OuterVolumeSpecName: "kube-api-access-2mqkh") pod "151b3d80-db1e-4af2-aa89-b38d9cfe8bea" (UID: "151b3d80-db1e-4af2-aa89-b38d9cfe8bea"). InnerVolumeSpecName "kube-api-access-2mqkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:05:17 crc kubenswrapper[4922]: I1122 03:05:17.030464 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/151b3d80-db1e-4af2-aa89-b38d9cfe8bea-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "151b3d80-db1e-4af2-aa89-b38d9cfe8bea" (UID: "151b3d80-db1e-4af2-aa89-b38d9cfe8bea"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:05:17 crc kubenswrapper[4922]: I1122 03:05:17.037133 4922 generic.go:334] "Generic (PLEG): container finished" podID="151b3d80-db1e-4af2-aa89-b38d9cfe8bea" containerID="84459f42aba76e51fd9522465254ff5e0b701eb8278c825585d2ee3189401e89" exitCode=0 Nov 22 03:05:17 crc kubenswrapper[4922]: I1122 03:05:17.037290 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ksllc" Nov 22 03:05:17 crc kubenswrapper[4922]: I1122 03:05:17.037689 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ksllc" event={"ID":"151b3d80-db1e-4af2-aa89-b38d9cfe8bea","Type":"ContainerDied","Data":"84459f42aba76e51fd9522465254ff5e0b701eb8278c825585d2ee3189401e89"} Nov 22 03:05:17 crc kubenswrapper[4922]: I1122 03:05:17.037759 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ksllc" event={"ID":"151b3d80-db1e-4af2-aa89-b38d9cfe8bea","Type":"ContainerDied","Data":"d77a2a86629edb810209eef0d767251a7b24ea6e2e2d2ee86327bd96b13e921d"} Nov 22 03:05:17 crc kubenswrapper[4922]: I1122 03:05:17.037782 4922 scope.go:117] "RemoveContainer" containerID="84459f42aba76e51fd9522465254ff5e0b701eb8278c825585d2ee3189401e89" Nov 22 03:05:17 crc kubenswrapper[4922]: I1122 03:05:17.071027 4922 generic.go:334] "Generic (PLEG): container finished" podID="9507632c-3232-44dc-a75f-1275a2f57145" containerID="18783c8d841393fb9721bbcccf0299b510b29642c78facd37d6c516cab21b1f6" exitCode=0 Nov 22 03:05:17 crc kubenswrapper[4922]: I1122 03:05:17.071105 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-gsfk7" event={"ID":"9507632c-3232-44dc-a75f-1275a2f57145","Type":"ContainerDied","Data":"18783c8d841393fb9721bbcccf0299b510b29642c78facd37d6c516cab21b1f6"} Nov 22 03:05:17 crc kubenswrapper[4922]: I1122 03:05:17.071145 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-gsfk7" 
event={"ID":"9507632c-3232-44dc-a75f-1275a2f57145","Type":"ContainerDied","Data":"3a9d03f00ccfa0c8348c2c7ae6521c789c76fba3a3b7f1e5e9d14673f9cb31ff"} Nov 22 03:05:17 crc kubenswrapper[4922]: I1122 03:05:17.071256 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-gsfk7" Nov 22 03:05:17 crc kubenswrapper[4922]: I1122 03:05:17.078715 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ksllc"] Nov 22 03:05:17 crc kubenswrapper[4922]: I1122 03:05:17.087730 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ksllc"] Nov 22 03:05:17 crc kubenswrapper[4922]: I1122 03:05:17.118126 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/151b3d80-db1e-4af2-aa89-b38d9cfe8bea-config\") on node \"crc\" DevicePath \"\"" Nov 22 03:05:17 crc kubenswrapper[4922]: I1122 03:05:17.118410 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mqkh\" (UniqueName: \"kubernetes.io/projected/151b3d80-db1e-4af2-aa89-b38d9cfe8bea-kube-api-access-2mqkh\") on node \"crc\" DevicePath \"\"" Nov 22 03:05:17 crc kubenswrapper[4922]: I1122 03:05:17.118489 4922 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/151b3d80-db1e-4af2-aa89-b38d9cfe8bea-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 03:05:17 crc kubenswrapper[4922]: I1122 03:05:17.118682 4922 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/151b3d80-db1e-4af2-aa89-b38d9cfe8bea-client-ca\") on node \"crc\" DevicePath \"\"" Nov 22 03:05:17 crc kubenswrapper[4922]: I1122 03:05:17.120663 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-gsfk7"] Nov 22 03:05:17 crc kubenswrapper[4922]: I1122 03:05:17.127311 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-gsfk7"] Nov 22 03:05:17 crc kubenswrapper[4922]: I1122 03:05:17.139628 4922 scope.go:117] "RemoveContainer" containerID="84459f42aba76e51fd9522465254ff5e0b701eb8278c825585d2ee3189401e89" Nov 22 03:05:17 crc kubenswrapper[4922]: E1122 03:05:17.140220 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84459f42aba76e51fd9522465254ff5e0b701eb8278c825585d2ee3189401e89\": container with ID starting with 84459f42aba76e51fd9522465254ff5e0b701eb8278c825585d2ee3189401e89 not found: ID does not exist" containerID="84459f42aba76e51fd9522465254ff5e0b701eb8278c825585d2ee3189401e89" Nov 22 03:05:17 crc kubenswrapper[4922]: I1122 03:05:17.140250 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84459f42aba76e51fd9522465254ff5e0b701eb8278c825585d2ee3189401e89"} err="failed to get container status \"84459f42aba76e51fd9522465254ff5e0b701eb8278c825585d2ee3189401e89\": rpc error: code = NotFound desc = could not find container \"84459f42aba76e51fd9522465254ff5e0b701eb8278c825585d2ee3189401e89\": container with ID starting with 84459f42aba76e51fd9522465254ff5e0b701eb8278c825585d2ee3189401e89 not found: ID does not exist" Nov 22 03:05:17 crc kubenswrapper[4922]: I1122 03:05:17.140269 4922 scope.go:117] "RemoveContainer" 
containerID="18783c8d841393fb9721bbcccf0299b510b29642c78facd37d6c516cab21b1f6" Nov 22 03:05:17 crc kubenswrapper[4922]: I1122 03:05:17.159498 4922 scope.go:117] "RemoveContainer" containerID="18783c8d841393fb9721bbcccf0299b510b29642c78facd37d6c516cab21b1f6" Nov 22 03:05:17 crc kubenswrapper[4922]: E1122 03:05:17.159998 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18783c8d841393fb9721bbcccf0299b510b29642c78facd37d6c516cab21b1f6\": container with ID starting with 18783c8d841393fb9721bbcccf0299b510b29642c78facd37d6c516cab21b1f6 not found: ID does not exist" containerID="18783c8d841393fb9721bbcccf0299b510b29642c78facd37d6c516cab21b1f6" Nov 22 03:05:17 crc kubenswrapper[4922]: I1122 03:05:17.160066 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18783c8d841393fb9721bbcccf0299b510b29642c78facd37d6c516cab21b1f6"} err="failed to get container status \"18783c8d841393fb9721bbcccf0299b510b29642c78facd37d6c516cab21b1f6\": rpc error: code = NotFound desc = could not find container \"18783c8d841393fb9721bbcccf0299b510b29642c78facd37d6c516cab21b1f6\": container with ID starting with 18783c8d841393fb9721bbcccf0299b510b29642c78facd37d6c516cab21b1f6 not found: ID does not exist" Nov 22 03:05:17 crc kubenswrapper[4922]: I1122 03:05:17.313523 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="151b3d80-db1e-4af2-aa89-b38d9cfe8bea" path="/var/lib/kubelet/pods/151b3d80-db1e-4af2-aa89-b38d9cfe8bea/volumes" Nov 22 03:05:17 crc kubenswrapper[4922]: I1122 03:05:17.314128 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9507632c-3232-44dc-a75f-1275a2f57145" path="/var/lib/kubelet/pods/9507632c-3232-44dc-a75f-1275a2f57145/volumes" Nov 22 03:05:17 crc kubenswrapper[4922]: I1122 03:05:17.997747 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-659dcd8f67-7kqpk"] Nov 22 03:05:17 crc kubenswrapper[4922]: E1122 03:05:17.998321 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9507632c-3232-44dc-a75f-1275a2f57145" containerName="controller-manager" Nov 22 03:05:17 crc kubenswrapper[4922]: I1122 03:05:17.998334 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="9507632c-3232-44dc-a75f-1275a2f57145" containerName="controller-manager" Nov 22 03:05:17 crc kubenswrapper[4922]: E1122 03:05:17.998343 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="151b3d80-db1e-4af2-aa89-b38d9cfe8bea" containerName="route-controller-manager" Nov 22 03:05:17 crc kubenswrapper[4922]: I1122 03:05:17.998350 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="151b3d80-db1e-4af2-aa89-b38d9cfe8bea" containerName="route-controller-manager" Nov 22 03:05:17 crc kubenswrapper[4922]: I1122 03:05:17.998443 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="9507632c-3232-44dc-a75f-1275a2f57145" containerName="controller-manager" Nov 22 03:05:17 crc kubenswrapper[4922]: I1122 03:05:17.998459 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="151b3d80-db1e-4af2-aa89-b38d9cfe8bea" containerName="route-controller-manager" Nov 22 03:05:17 crc kubenswrapper[4922]: I1122 03:05:17.998871 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-659dcd8f67-7kqpk" Nov 22 03:05:18 crc kubenswrapper[4922]: I1122 03:05:18.000594 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 22 03:05:18 crc kubenswrapper[4922]: I1122 03:05:18.000694 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 22 03:05:18 crc kubenswrapper[4922]: I1122 03:05:18.000947 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 22 03:05:18 crc kubenswrapper[4922]: I1122 03:05:18.001698 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 22 03:05:18 crc kubenswrapper[4922]: I1122 03:05:18.001915 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 22 03:05:18 crc kubenswrapper[4922]: I1122 03:05:18.002033 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 22 03:05:18 crc kubenswrapper[4922]: I1122 03:05:18.011921 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-659dcd8f67-7kqpk"] Nov 22 03:05:18 crc kubenswrapper[4922]: I1122 03:05:18.031552 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca74b073-fdaf-41f9-9659-97c38e486fb8-config\") pod \"route-controller-manager-659dcd8f67-7kqpk\" (UID: \"ca74b073-fdaf-41f9-9659-97c38e486fb8\") " pod="openshift-route-controller-manager/route-controller-manager-659dcd8f67-7kqpk" Nov 22 03:05:18 crc kubenswrapper[4922]: I1122 03:05:18.031624 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ca74b073-fdaf-41f9-9659-97c38e486fb8-client-ca\") pod \"route-controller-manager-659dcd8f67-7kqpk\" (UID: \"ca74b073-fdaf-41f9-9659-97c38e486fb8\") " pod="openshift-route-controller-manager/route-controller-manager-659dcd8f67-7kqpk" Nov 22 03:05:18 crc kubenswrapper[4922]: I1122 03:05:18.031644 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca74b073-fdaf-41f9-9659-97c38e486fb8-serving-cert\") pod \"route-controller-manager-659dcd8f67-7kqpk\" (UID: \"ca74b073-fdaf-41f9-9659-97c38e486fb8\") " pod="openshift-route-controller-manager/route-controller-manager-659dcd8f67-7kqpk" Nov 22 03:05:18 crc kubenswrapper[4922]: I1122 03:05:18.031698 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8bx7\" (UniqueName: \"kubernetes.io/projected/ca74b073-fdaf-41f9-9659-97c38e486fb8-kube-api-access-t8bx7\") pod \"route-controller-manager-659dcd8f67-7kqpk\" (UID: \"ca74b073-fdaf-41f9-9659-97c38e486fb8\") " pod="openshift-route-controller-manager/route-controller-manager-659dcd8f67-7kqpk" Nov 22 03:05:18 crc kubenswrapper[4922]: I1122 03:05:18.086449 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6c7b4b5f48-h2kgl" 
event={"ID":"513892d3-9d82-48cf-911c-857d8c2a8a95","Type":"ContainerStarted","Data":"9543489a507e03f58f10ad251a2e1b4449126a7be7789796808ad1e29bc5bab3"} Nov 22 03:05:18 crc kubenswrapper[4922]: I1122 03:05:18.086499 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6c7b4b5f48-h2kgl" event={"ID":"513892d3-9d82-48cf-911c-857d8c2a8a95","Type":"ContainerStarted","Data":"40404b25a21a2637b62b3d10a15394d0e684dcb82fd3e78c37f6f9ceb05c545b"} Nov 22 03:05:18 crc kubenswrapper[4922]: I1122 03:05:18.087354 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-6c7b4b5f48-h2kgl" Nov 22 03:05:18 crc kubenswrapper[4922]: I1122 03:05:18.114159 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-6c7b4b5f48-h2kgl" podStartSLOduration=3.114113799 podStartE2EDuration="3.114113799s" podCreationTimestamp="2025-11-22 03:05:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:05:18.10998406 +0000 UTC m=+754.148505952" watchObservedRunningTime="2025-11-22 03:05:18.114113799 +0000 UTC m=+754.152635691" Nov 22 03:05:18 crc kubenswrapper[4922]: I1122 03:05:18.132786 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ca74b073-fdaf-41f9-9659-97c38e486fb8-client-ca\") pod \"route-controller-manager-659dcd8f67-7kqpk\" (UID: \"ca74b073-fdaf-41f9-9659-97c38e486fb8\") " pod="openshift-route-controller-manager/route-controller-manager-659dcd8f67-7kqpk" Nov 22 03:05:18 crc kubenswrapper[4922]: I1122 03:05:18.132836 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca74b073-fdaf-41f9-9659-97c38e486fb8-serving-cert\") pod \"route-controller-manager-659dcd8f67-7kqpk\" (UID: \"ca74b073-fdaf-41f9-9659-97c38e486fb8\") " pod="openshift-route-controller-manager/route-controller-manager-659dcd8f67-7kqpk" Nov 22 03:05:18 crc kubenswrapper[4922]: I1122 03:05:18.132915 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8bx7\" (UniqueName: \"kubernetes.io/projected/ca74b073-fdaf-41f9-9659-97c38e486fb8-kube-api-access-t8bx7\") pod \"route-controller-manager-659dcd8f67-7kqpk\" (UID: \"ca74b073-fdaf-41f9-9659-97c38e486fb8\") " pod="openshift-route-controller-manager/route-controller-manager-659dcd8f67-7kqpk" Nov 22 03:05:18 crc kubenswrapper[4922]: I1122 03:05:18.132966 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca74b073-fdaf-41f9-9659-97c38e486fb8-config\") pod \"route-controller-manager-659dcd8f67-7kqpk\" (UID: \"ca74b073-fdaf-41f9-9659-97c38e486fb8\") " pod="openshift-route-controller-manager/route-controller-manager-659dcd8f67-7kqpk" Nov 22 03:05:18 crc kubenswrapper[4922]: I1122 03:05:18.133645 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ca74b073-fdaf-41f9-9659-97c38e486fb8-client-ca\") pod \"route-controller-manager-659dcd8f67-7kqpk\" (UID: \"ca74b073-fdaf-41f9-9659-97c38e486fb8\") " pod="openshift-route-controller-manager/route-controller-manager-659dcd8f67-7kqpk" Nov 22 03:05:18 crc kubenswrapper[4922]: I1122 03:05:18.133993 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ca74b073-fdaf-41f9-9659-97c38e486fb8-config\") pod \"route-controller-manager-659dcd8f67-7kqpk\" (UID: \"ca74b073-fdaf-41f9-9659-97c38e486fb8\") " pod="openshift-route-controller-manager/route-controller-manager-659dcd8f67-7kqpk" Nov 22 03:05:18 crc kubenswrapper[4922]: I1122 03:05:18.143426 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca74b073-fdaf-41f9-9659-97c38e486fb8-serving-cert\") pod \"route-controller-manager-659dcd8f67-7kqpk\" (UID: \"ca74b073-fdaf-41f9-9659-97c38e486fb8\") " pod="openshift-route-controller-manager/route-controller-manager-659dcd8f67-7kqpk" Nov 22 03:05:18 crc kubenswrapper[4922]: I1122 03:05:18.156129 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8bx7\" (UniqueName: \"kubernetes.io/projected/ca74b073-fdaf-41f9-9659-97c38e486fb8-kube-api-access-t8bx7\") pod \"route-controller-manager-659dcd8f67-7kqpk\" (UID: \"ca74b073-fdaf-41f9-9659-97c38e486fb8\") " pod="openshift-route-controller-manager/route-controller-manager-659dcd8f67-7kqpk" Nov 22 03:05:18 crc kubenswrapper[4922]: I1122 03:05:18.257920 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7fdd8f6c6c-c5zxc"] Nov 22 03:05:18 crc kubenswrapper[4922]: I1122 03:05:18.258603 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7fdd8f6c6c-c5zxc" Nov 22 03:05:18 crc kubenswrapper[4922]: I1122 03:05:18.284319 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 22 03:05:18 crc kubenswrapper[4922]: I1122 03:05:18.284482 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 22 03:05:18 crc kubenswrapper[4922]: I1122 03:05:18.284521 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 22 03:05:18 crc kubenswrapper[4922]: I1122 03:05:18.284623 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 22 03:05:18 crc kubenswrapper[4922]: I1122 03:05:18.284691 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 22 03:05:18 crc kubenswrapper[4922]: I1122 03:05:18.287184 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 22 03:05:18 crc kubenswrapper[4922]: I1122 03:05:18.289007 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7fdd8f6c6c-c5zxc"] Nov 22 03:05:18 crc kubenswrapper[4922]: I1122 03:05:18.293940 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 22 03:05:18 crc kubenswrapper[4922]: I1122 03:05:18.314234 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-659dcd8f67-7kqpk" Nov 22 03:05:18 crc kubenswrapper[4922]: I1122 03:05:18.340579 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqdzg\" (UniqueName: \"kubernetes.io/projected/51cdc918-9c5c-4ac8-9a26-00f220273177-kube-api-access-zqdzg\") pod \"controller-manager-7fdd8f6c6c-c5zxc\" (UID: \"51cdc918-9c5c-4ac8-9a26-00f220273177\") " pod="openshift-controller-manager/controller-manager-7fdd8f6c6c-c5zxc" Nov 22 03:05:18 crc kubenswrapper[4922]: I1122 03:05:18.340630 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51cdc918-9c5c-4ac8-9a26-00f220273177-config\") pod \"controller-manager-7fdd8f6c6c-c5zxc\" (UID: \"51cdc918-9c5c-4ac8-9a26-00f220273177\") " pod="openshift-controller-manager/controller-manager-7fdd8f6c6c-c5zxc" Nov 22 03:05:18 crc kubenswrapper[4922]: I1122 03:05:18.340657 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/51cdc918-9c5c-4ac8-9a26-00f220273177-client-ca\") pod \"controller-manager-7fdd8f6c6c-c5zxc\" (UID: \"51cdc918-9c5c-4ac8-9a26-00f220273177\") " pod="openshift-controller-manager/controller-manager-7fdd8f6c6c-c5zxc" Nov 22 03:05:18 crc kubenswrapper[4922]: I1122 03:05:18.340700 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51cdc918-9c5c-4ac8-9a26-00f220273177-serving-cert\") pod \"controller-manager-7fdd8f6c6c-c5zxc\" (UID: \"51cdc918-9c5c-4ac8-9a26-00f220273177\") " pod="openshift-controller-manager/controller-manager-7fdd8f6c6c-c5zxc" Nov 22 03:05:18 crc kubenswrapper[4922]: I1122 03:05:18.340756 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/51cdc918-9c5c-4ac8-9a26-00f220273177-proxy-ca-bundles\") pod \"controller-manager-7fdd8f6c6c-c5zxc\" (UID: \"51cdc918-9c5c-4ac8-9a26-00f220273177\") " pod="openshift-controller-manager/controller-manager-7fdd8f6c6c-c5zxc" Nov 22 03:05:18 crc kubenswrapper[4922]: I1122 03:05:18.441144 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqdzg\" (UniqueName: \"kubernetes.io/projected/51cdc918-9c5c-4ac8-9a26-00f220273177-kube-api-access-zqdzg\") pod \"controller-manager-7fdd8f6c6c-c5zxc\" (UID: \"51cdc918-9c5c-4ac8-9a26-00f220273177\") " pod="openshift-controller-manager/controller-manager-7fdd8f6c6c-c5zxc" Nov 22 03:05:18 crc kubenswrapper[4922]: I1122 03:05:18.441454 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51cdc918-9c5c-4ac8-9a26-00f220273177-config\") pod \"controller-manager-7fdd8f6c6c-c5zxc\" (UID: \"51cdc918-9c5c-4ac8-9a26-00f220273177\") " pod="openshift-controller-manager/controller-manager-7fdd8f6c6c-c5zxc" Nov 22 03:05:18 crc kubenswrapper[4922]: I1122 03:05:18.441550 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/51cdc918-9c5c-4ac8-9a26-00f220273177-client-ca\") pod \"controller-manager-7fdd8f6c6c-c5zxc\" (UID: \"51cdc918-9c5c-4ac8-9a26-00f220273177\") " pod="openshift-controller-manager/controller-manager-7fdd8f6c6c-c5zxc" 
Nov 22 03:05:18 crc kubenswrapper[4922]: I1122 03:05:18.441605 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51cdc918-9c5c-4ac8-9a26-00f220273177-serving-cert\") pod \"controller-manager-7fdd8f6c6c-c5zxc\" (UID: \"51cdc918-9c5c-4ac8-9a26-00f220273177\") " pod="openshift-controller-manager/controller-manager-7fdd8f6c6c-c5zxc"
Nov 22 03:05:18 crc kubenswrapper[4922]: I1122 03:05:18.441656 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/51cdc918-9c5c-4ac8-9a26-00f220273177-proxy-ca-bundles\") pod \"controller-manager-7fdd8f6c6c-c5zxc\" (UID: \"51cdc918-9c5c-4ac8-9a26-00f220273177\") " pod="openshift-controller-manager/controller-manager-7fdd8f6c6c-c5zxc"
Nov 22 03:05:18 crc kubenswrapper[4922]: I1122 03:05:18.443302 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51cdc918-9c5c-4ac8-9a26-00f220273177-config\") pod \"controller-manager-7fdd8f6c6c-c5zxc\" (UID: \"51cdc918-9c5c-4ac8-9a26-00f220273177\") " pod="openshift-controller-manager/controller-manager-7fdd8f6c6c-c5zxc"
Nov 22 03:05:18 crc kubenswrapper[4922]: I1122 03:05:18.451669 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/51cdc918-9c5c-4ac8-9a26-00f220273177-client-ca\") pod \"controller-manager-7fdd8f6c6c-c5zxc\" (UID: \"51cdc918-9c5c-4ac8-9a26-00f220273177\") " pod="openshift-controller-manager/controller-manager-7fdd8f6c6c-c5zxc"
Nov 22 03:05:18 crc kubenswrapper[4922]: I1122 03:05:18.463539 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/51cdc918-9c5c-4ac8-9a26-00f220273177-proxy-ca-bundles\") pod \"controller-manager-7fdd8f6c6c-c5zxc\" (UID: \"51cdc918-9c5c-4ac8-9a26-00f220273177\") " pod="openshift-controller-manager/controller-manager-7fdd8f6c6c-c5zxc"
Nov 22 03:05:18 crc kubenswrapper[4922]: I1122 03:05:18.481904 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqdzg\" (UniqueName: \"kubernetes.io/projected/51cdc918-9c5c-4ac8-9a26-00f220273177-kube-api-access-zqdzg\") pod \"controller-manager-7fdd8f6c6c-c5zxc\" (UID: \"51cdc918-9c5c-4ac8-9a26-00f220273177\") " pod="openshift-controller-manager/controller-manager-7fdd8f6c6c-c5zxc"
Nov 22 03:05:18 crc kubenswrapper[4922]: I1122 03:05:18.482134 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51cdc918-9c5c-4ac8-9a26-00f220273177-serving-cert\") pod \"controller-manager-7fdd8f6c6c-c5zxc\" (UID: \"51cdc918-9c5c-4ac8-9a26-00f220273177\") " pod="openshift-controller-manager/controller-manager-7fdd8f6c6c-c5zxc"
Nov 22 03:05:18 crc kubenswrapper[4922]: I1122 03:05:18.611099 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7fdd8f6c6c-c5zxc"
Nov 22 03:05:18 crc kubenswrapper[4922]: I1122 03:05:18.865571 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7fdd8f6c6c-c5zxc"]
Nov 22 03:05:18 crc kubenswrapper[4922]: W1122 03:05:18.882141 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51cdc918_9c5c_4ac8_9a26_00f220273177.slice/crio-7cd73b09e9477e1f234567fe66c0c894e6262b166c82695e5a3c5a0608d755a5 WatchSource:0}: Error finding container 7cd73b09e9477e1f234567fe66c0c894e6262b166c82695e5a3c5a0608d755a5: Status 404 returned error can't find the container with id 7cd73b09e9477e1f234567fe66c0c894e6262b166c82695e5a3c5a0608d755a5
Nov 22 03:05:18 crc kubenswrapper[4922]: I1122 03:05:18.910115 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-659dcd8f67-7kqpk"]
Nov 22 03:05:18 crc kubenswrapper[4922]: W1122 03:05:18.935564 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca74b073_fdaf_41f9_9659_97c38e486fb8.slice/crio-b8b0672b097e9b87bb6ca80ca924a8aaa72255b02d12bdad65c76982ef157292 WatchSource:0}: Error finding container b8b0672b097e9b87bb6ca80ca924a8aaa72255b02d12bdad65c76982ef157292: Status 404 returned error can't find the container with id b8b0672b097e9b87bb6ca80ca924a8aaa72255b02d12bdad65c76982ef157292
Nov 22 03:05:19 crc kubenswrapper[4922]: I1122 03:05:19.054267 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f607a020-54f7-4888-8a09-6caede7a160c-memberlist\") pod \"speaker-tl5mq\" (UID: \"f607a020-54f7-4888-8a09-6caede7a160c\") " pod="metallb-system/speaker-tl5mq"
Nov 22 03:05:19 crc kubenswrapper[4922]: I1122 03:05:19.068753 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f607a020-54f7-4888-8a09-6caede7a160c-memberlist\") pod \"speaker-tl5mq\" (UID: \"f607a020-54f7-4888-8a09-6caede7a160c\") " pod="metallb-system/speaker-tl5mq"
Nov 22 03:05:19 crc kubenswrapper[4922]: I1122 03:05:19.107263 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7fdd8f6c6c-c5zxc" event={"ID":"51cdc918-9c5c-4ac8-9a26-00f220273177","Type":"ContainerStarted","Data":"12a3170b308ef0f85e75e4a5b85ac8a96197e9274f77d4f5535627ed316f371a"}
Nov 22 03:05:19 crc kubenswrapper[4922]: I1122 03:05:19.107314 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7fdd8f6c6c-c5zxc" event={"ID":"51cdc918-9c5c-4ac8-9a26-00f220273177","Type":"ContainerStarted","Data":"7cd73b09e9477e1f234567fe66c0c894e6262b166c82695e5a3c5a0608d755a5"}
Nov 22 03:05:19 crc kubenswrapper[4922]: I1122 03:05:19.107484 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7fdd8f6c6c-c5zxc"
Nov 22 03:05:19 crc kubenswrapper[4922]: I1122 03:05:19.110631 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-659dcd8f67-7kqpk" event={"ID":"ca74b073-fdaf-41f9-9659-97c38e486fb8","Type":"ContainerStarted","Data":"dfac23ec80837738a0ac73084b1966e62dee0fa5a4a9093514fc8ccdff505b84"}
Nov 22 03:05:19 crc kubenswrapper[4922]: I1122 03:05:19.110670 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-659dcd8f67-7kqpk" event={"ID":"ca74b073-fdaf-41f9-9659-97c38e486fb8","Type":"ContainerStarted","Data":"b8b0672b097e9b87bb6ca80ca924a8aaa72255b02d12bdad65c76982ef157292"}
Nov 22 03:05:19 crc kubenswrapper[4922]: I1122 03:05:19.110773 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-659dcd8f67-7kqpk"
Nov 22 03:05:19 crc kubenswrapper[4922]: I1122 03:05:19.113104 4922 patch_prober.go:28] interesting pod/controller-manager-7fdd8f6c6c-c5zxc container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused" start-of-body=
Nov 22 03:05:19 crc kubenswrapper[4922]: I1122 03:05:19.113176 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7fdd8f6c6c-c5zxc" podUID="51cdc918-9c5c-4ac8-9a26-00f220273177" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused"
Nov 22 03:05:19 crc kubenswrapper[4922]: I1122 03:05:19.113429 4922 patch_prober.go:28] interesting pod/route-controller-manager-659dcd8f67-7kqpk container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" start-of-body=
Nov 22 03:05:19 crc kubenswrapper[4922]: I1122 03:05:19.113452 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-659dcd8f67-7kqpk" podUID="ca74b073-fdaf-41f9-9659-97c38e486fb8" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused"
Nov 22 03:05:19 crc kubenswrapper[4922]: I1122 03:05:19.124348 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7fdd8f6c6c-c5zxc" podStartSLOduration=3.124339671 podStartE2EDuration="3.124339671s" podCreationTimestamp="2025-11-22 03:05:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:05:19.122045696 +0000 UTC m=+755.160567588" watchObservedRunningTime="2025-11-22 03:05:19.124339671 +0000 UTC m=+755.162861563"
Nov 22 03:05:19 crc kubenswrapper[4922]: I1122 03:05:19.148259 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-659dcd8f67-7kqpk" podStartSLOduration=2.148243426 podStartE2EDuration="2.148243426s" podCreationTimestamp="2025-11-22 03:05:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:05:19.143128383 +0000 UTC m=+755.181650265" watchObservedRunningTime="2025-11-22 03:05:19.148243426 +0000 UTC m=+755.186765318"
Nov 22 03:05:19 crc kubenswrapper[4922]: I1122 03:05:19.253924 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-tl5mq"
Nov 22 03:05:19 crc kubenswrapper[4922]: W1122 03:05:19.289605 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf607a020_54f7_4888_8a09_6caede7a160c.slice/crio-c9b52713375aeeaf1d20103f438514d6106eb6379d296f4b1e509f747ed34ca5 WatchSource:0}: Error finding container c9b52713375aeeaf1d20103f438514d6106eb6379d296f4b1e509f747ed34ca5: Status 404 returned error can't find the container with id c9b52713375aeeaf1d20103f438514d6106eb6379d296f4b1e509f747ed34ca5
Nov 22 03:05:20 crc kubenswrapper[4922]: I1122 03:05:20.121459 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-tl5mq" event={"ID":"f607a020-54f7-4888-8a09-6caede7a160c","Type":"ContainerStarted","Data":"650e7c46098ad6cd19abff2ddf2cdb3efb2c7012dee46f00408d94f04d6991f5"}
Nov 22 03:05:20 crc kubenswrapper[4922]: I1122 03:05:20.122462 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-tl5mq" event={"ID":"f607a020-54f7-4888-8a09-6caede7a160c","Type":"ContainerStarted","Data":"c74b05e539800ba32d248d42f2b86da8b25df837c494487eeb95009ac0c3a262"}
Nov 22 03:05:20 crc kubenswrapper[4922]: I1122 03:05:20.122477 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-tl5mq" event={"ID":"f607a020-54f7-4888-8a09-6caede7a160c","Type":"ContainerStarted","Data":"c9b52713375aeeaf1d20103f438514d6106eb6379d296f4b1e509f747ed34ca5"}
Nov 22 03:05:20 crc kubenswrapper[4922]: I1122 03:05:20.122640 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-tl5mq"
Nov 22 03:05:20 crc kubenswrapper[4922]: I1122 03:05:20.127696 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7fdd8f6c6c-c5zxc"
Nov 22 03:05:20 crc kubenswrapper[4922]: I1122 03:05:20.132867 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-659dcd8f67-7kqpk"
Nov 22 03:05:20 crc kubenswrapper[4922]: I1122 03:05:20.158875 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-tl5mq" podStartSLOduration=5.158860906 podStartE2EDuration="5.158860906s" podCreationTimestamp="2025-11-22 03:05:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:05:20.157907784 +0000 UTC m=+756.196429676" watchObservedRunningTime="2025-11-22 03:05:20.158860906 +0000 UTC m=+756.197382798"
Nov 22 03:05:25 crc kubenswrapper[4922]: I1122 03:05:25.165569 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-6998585d5-8zlpf" event={"ID":"f06b9437-16f5-486b-88a2-1c475e99e21a","Type":"ContainerStarted","Data":"27b88b7f54dcf3f3598584d4371121829f93ec0ca9602068e5fc48b391b3cd55"}
Nov 22 03:05:25 crc kubenswrapper[4922]: I1122 03:05:25.165937 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-6998585d5-8zlpf"
Nov 22 03:05:25 crc kubenswrapper[4922]: I1122 03:05:25.167475 4922 generic.go:334] "Generic (PLEG): container finished" podID="85602553-aa6e-40ba-b92b-96b851a002ca" containerID="3fa86d9cfe8581628c440a5160474d9c36efeab3804548f14bd287ee6b4b971c" exitCode=0
Nov 22 03:05:25 crc kubenswrapper[4922]: I1122 03:05:25.167513 4922 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="metallb-system/frr-k8s-bjqlk" event={"ID":"85602553-aa6e-40ba-b92b-96b851a002ca","Type":"ContainerDied","Data":"3fa86d9cfe8581628c440a5160474d9c36efeab3804548f14bd287ee6b4b971c"} Nov 22 03:05:25 crc kubenswrapper[4922]: I1122 03:05:25.193553 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-6998585d5-8zlpf" podStartSLOduration=1.822293597 podStartE2EDuration="10.193532348s" podCreationTimestamp="2025-11-22 03:05:15 +0000 UTC" firstStartedPulling="2025-11-22 03:05:15.784107384 +0000 UTC m=+751.822629266" lastFinishedPulling="2025-11-22 03:05:24.155346115 +0000 UTC m=+760.193868017" observedRunningTime="2025-11-22 03:05:25.190151297 +0000 UTC m=+761.228673229" watchObservedRunningTime="2025-11-22 03:05:25.193532348 +0000 UTC m=+761.232054250" Nov 22 03:05:25 crc kubenswrapper[4922]: I1122 03:05:25.370225 4922 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Nov 22 03:05:26 crc kubenswrapper[4922]: I1122 03:05:26.177771 4922 generic.go:334] "Generic (PLEG): container finished" podID="85602553-aa6e-40ba-b92b-96b851a002ca" containerID="0b76405288cc03c95e6b4d17413242db5aa0acd182b82b07784ddd47074c35b4" exitCode=0 Nov 22 03:05:26 crc kubenswrapper[4922]: I1122 03:05:26.179579 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bjqlk" event={"ID":"85602553-aa6e-40ba-b92b-96b851a002ca","Type":"ContainerDied","Data":"0b76405288cc03c95e6b4d17413242db5aa0acd182b82b07784ddd47074c35b4"} Nov 22 03:05:27 crc kubenswrapper[4922]: I1122 03:05:27.189449 4922 generic.go:334] "Generic (PLEG): container finished" podID="85602553-aa6e-40ba-b92b-96b851a002ca" containerID="eac156b565ec33fb534b75b35e50ac460847dd23444a6db8a92853cdec2d1267" exitCode=0 Nov 22 03:05:27 crc kubenswrapper[4922]: I1122 03:05:27.189680 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bjqlk" event={"ID":"85602553-aa6e-40ba-b92b-96b851a002ca","Type":"ContainerDied","Data":"eac156b565ec33fb534b75b35e50ac460847dd23444a6db8a92853cdec2d1267"} Nov 22 03:05:28 crc kubenswrapper[4922]: I1122 03:05:28.199897 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bjqlk" event={"ID":"85602553-aa6e-40ba-b92b-96b851a002ca","Type":"ContainerStarted","Data":"545a24be60d9a812fada546f822d4a066a72851ebc5349ec51846801815efdb8"} Nov 22 03:05:28 crc kubenswrapper[4922]: I1122 03:05:28.200220 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bjqlk" event={"ID":"85602553-aa6e-40ba-b92b-96b851a002ca","Type":"ContainerStarted","Data":"f4e27e73af2ad2321b0bf41a0edef3420ebb7efa5eaec91a401ba3ff22d6f985"} Nov 22 03:05:28 crc kubenswrapper[4922]: I1122 03:05:28.200236 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bjqlk" event={"ID":"85602553-aa6e-40ba-b92b-96b851a002ca","Type":"ContainerStarted","Data":"e9832049127e87e72105de0b7f664f0ead2bc0d2fc35a55ec02d294f17d4ebf0"} Nov 22 03:05:28 crc kubenswrapper[4922]: I1122 03:05:28.200247 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bjqlk" event={"ID":"85602553-aa6e-40ba-b92b-96b851a002ca","Type":"ContainerStarted","Data":"a2b4faafae9251042e5d29624a61197f5e355293bf095f0a7c5512009775b4a4"} Nov 22 03:05:28 crc kubenswrapper[4922]: I1122 03:05:28.200258 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bjqlk" 
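
The pod_startup_latency_tracker entry above for frr-k8s-webhook-server reports two durations: podStartE2EDuration (observed running time minus the pod creation timestamp) and podStartSLOduration, which additionally subtracts the image-pull window taken from the monotonic m=+ offsets. The snippet below replays the logged numbers; the subtraction is the whole computation and reproduces the logged SLO value exactly.

```go
// Replaying the frr-k8s-webhook-server startup-latency entry above:
// SLO duration = end-to-end duration minus the image-pull window,
// measured on the kubelet's monotonic clock (the m=+ offsets).
package main

import "fmt"

func main() {
	firstStartedPulling := 751.822629266 // m=+751.822629266
	lastFinishedPulling := 760.193868017 // m=+760.193868017
	podStartE2E := 10.193532348          // observedRunningTime - podCreationTimestamp, seconds

	slo := podStartE2E - (lastFinishedPulling - firstStartedPulling)
	fmt.Printf("podStartSLOduration=%.9f\n", slo) // prints 1.822293597, matching the log
}
```
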
event={"ID":"85602553-aa6e-40ba-b92b-96b851a002ca","Type":"ContainerStarted","Data":"ca44cb04abc4e0f868e5053d23b109d87978b055fc1b47cd01540f4398c3cbaa"} Nov 22 03:05:29 crc kubenswrapper[4922]: I1122 03:05:29.210507 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bjqlk" event={"ID":"85602553-aa6e-40ba-b92b-96b851a002ca","Type":"ContainerStarted","Data":"467cf74729b9d4cbe846563eb09624767915a6e9bff194de8a1b77d3e5b1041d"} Nov 22 03:05:29 crc kubenswrapper[4922]: I1122 03:05:29.210800 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-bjqlk" Nov 22 03:05:29 crc kubenswrapper[4922]: I1122 03:05:29.234728 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-bjqlk" podStartSLOduration=5.786794196 podStartE2EDuration="14.234712551s" podCreationTimestamp="2025-11-22 03:05:15 +0000 UTC" firstStartedPulling="2025-11-22 03:05:15.665081822 +0000 UTC m=+751.703603714" lastFinishedPulling="2025-11-22 03:05:24.113000137 +0000 UTC m=+760.151522069" observedRunningTime="2025-11-22 03:05:29.231723698 +0000 UTC m=+765.270245610" watchObservedRunningTime="2025-11-22 03:05:29.234712551 +0000 UTC m=+765.273234443" Nov 22 03:05:29 crc kubenswrapper[4922]: I1122 03:05:29.258642 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-tl5mq" Nov 22 03:05:30 crc kubenswrapper[4922]: I1122 03:05:30.463930 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-bjqlk" Nov 22 03:05:30 crc kubenswrapper[4922]: I1122 03:05:30.517144 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-bjqlk" Nov 22 03:05:32 crc kubenswrapper[4922]: I1122 03:05:32.369672 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-7n2fj"] Nov 22 03:05:32 crc kubenswrapper[4922]: I1122 03:05:32.370743 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-7n2fj" Nov 22 03:05:32 crc kubenswrapper[4922]: I1122 03:05:32.392279 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw9z2\" (UniqueName: \"kubernetes.io/projected/a86b6301-b784-4174-a496-b68b96f5e35b-kube-api-access-cw9z2\") pod \"openstack-operator-index-7n2fj\" (UID: \"a86b6301-b784-4174-a496-b68b96f5e35b\") " pod="openstack-operators/openstack-operator-index-7n2fj" Nov 22 03:05:32 crc kubenswrapper[4922]: I1122 03:05:32.409936 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Nov 22 03:05:32 crc kubenswrapper[4922]: I1122 03:05:32.412176 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Nov 22 03:05:32 crc kubenswrapper[4922]: I1122 03:05:32.421492 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-7n2fj"] Nov 22 03:05:32 crc kubenswrapper[4922]: I1122 03:05:32.493427 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cw9z2\" (UniqueName: \"kubernetes.io/projected/a86b6301-b784-4174-a496-b68b96f5e35b-kube-api-access-cw9z2\") pod \"openstack-operator-index-7n2fj\" (UID: \"a86b6301-b784-4174-a496-b68b96f5e35b\") " pod="openstack-operators/openstack-operator-index-7n2fj" Nov 22 03:05:32 crc kubenswrapper[4922]: I1122 03:05:32.527496 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw9z2\" (UniqueName: \"kubernetes.io/projected/a86b6301-b784-4174-a496-b68b96f5e35b-kube-api-access-cw9z2\") pod \"openstack-operator-index-7n2fj\" (UID: \"a86b6301-b784-4174-a496-b68b96f5e35b\") " pod="openstack-operators/openstack-operator-index-7n2fj" Nov 22 03:05:32 crc kubenswrapper[4922]: I1122 03:05:32.728029 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-7n2fj" Nov 22 03:05:33 crc kubenswrapper[4922]: I1122 03:05:33.179245 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-7n2fj"] Nov 22 03:05:33 crc kubenswrapper[4922]: W1122 03:05:33.184831 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda86b6301_b784_4174_a496_b68b96f5e35b.slice/crio-6868a831cbec20fe67139696dc14334f612c135040b49b539f256700e5fa159a WatchSource:0}: Error finding container 6868a831cbec20fe67139696dc14334f612c135040b49b539f256700e5fa159a: Status 404 returned error can't find the container with id 6868a831cbec20fe67139696dc14334f612c135040b49b539f256700e5fa159a Nov 22 03:05:33 crc kubenswrapper[4922]: I1122 03:05:33.243680 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7n2fj" event={"ID":"a86b6301-b784-4174-a496-b68b96f5e35b","Type":"ContainerStarted","Data":"6868a831cbec20fe67139696dc14334f612c135040b49b539f256700e5fa159a"} Nov 22 03:05:34 crc kubenswrapper[4922]: I1122 03:05:34.717368 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-7n2fj"] Nov 22 03:05:35 crc kubenswrapper[4922]: I1122 03:05:35.134523 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-tnkdx"] Nov 22 03:05:35 crc kubenswrapper[4922]: I1122 03:05:35.136124 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-tnkdx" Nov 22 03:05:35 crc kubenswrapper[4922]: I1122 03:05:35.141468 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-rjqgg" Nov 22 03:05:35 crc kubenswrapper[4922]: I1122 03:05:35.150522 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-tnkdx"] Nov 22 03:05:35 crc kubenswrapper[4922]: I1122 03:05:35.249011 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlqwh\" (UniqueName: \"kubernetes.io/projected/b3a11ecb-39ed-423e-b79b-19694d816305-kube-api-access-zlqwh\") pod \"openstack-operator-index-tnkdx\" (UID: \"b3a11ecb-39ed-423e-b79b-19694d816305\") " pod="openstack-operators/openstack-operator-index-tnkdx" Nov 22 03:05:35 crc kubenswrapper[4922]: I1122 03:05:35.350897 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlqwh\" (UniqueName: \"kubernetes.io/projected/b3a11ecb-39ed-423e-b79b-19694d816305-kube-api-access-zlqwh\") pod \"openstack-operator-index-tnkdx\" (UID: \"b3a11ecb-39ed-423e-b79b-19694d816305\") " pod="openstack-operators/openstack-operator-index-tnkdx" Nov 22 03:05:35 crc kubenswrapper[4922]: I1122 03:05:35.380645 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlqwh\" (UniqueName: \"kubernetes.io/projected/b3a11ecb-39ed-423e-b79b-19694d816305-kube-api-access-zlqwh\") pod \"openstack-operator-index-tnkdx\" (UID: \"b3a11ecb-39ed-423e-b79b-19694d816305\") " pod="openstack-operators/openstack-operator-index-tnkdx" Nov 22 03:05:35 crc kubenswrapper[4922]: I1122 03:05:35.458240 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-6998585d5-8zlpf" Nov 22 03:05:35 crc kubenswrapper[4922]: I1122 
03:05:35.474680 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-tnkdx" Nov 22 03:05:36 crc kubenswrapper[4922]: I1122 03:05:36.269786 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-6c7b4b5f48-h2kgl" Nov 22 03:05:36 crc kubenswrapper[4922]: I1122 03:05:36.340998 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-tnkdx"] Nov 22 03:05:37 crc kubenswrapper[4922]: I1122 03:05:37.278023 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-tnkdx" event={"ID":"b3a11ecb-39ed-423e-b79b-19694d816305","Type":"ContainerStarted","Data":"afc6cbb30cfdaaf02211ea0b7926f8924b99c1c65c0a2fa594e6b22c582dda8e"} Nov 22 03:05:39 crc kubenswrapper[4922]: I1122 03:05:39.296927 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7n2fj" event={"ID":"a86b6301-b784-4174-a496-b68b96f5e35b","Type":"ContainerStarted","Data":"15e19c6c13f55e3c0404bf5d64c4734d1f9260437891ea549219dde772899970"} Nov 22 03:05:39 crc kubenswrapper[4922]: I1122 03:05:39.297046 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-7n2fj" podUID="a86b6301-b784-4174-a496-b68b96f5e35b" containerName="registry-server" containerID="cri-o://15e19c6c13f55e3c0404bf5d64c4734d1f9260437891ea549219dde772899970" gracePeriod=2 Nov 22 03:05:39 crc kubenswrapper[4922]: I1122 03:05:39.318521 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-tnkdx" event={"ID":"b3a11ecb-39ed-423e-b79b-19694d816305","Type":"ContainerStarted","Data":"9fe1512cc9e71adf2af9ffa65415e4b86240d0383fe66fba35fe1396a5438d9c"} Nov 22 03:05:39 crc kubenswrapper[4922]: I1122 03:05:39.331696 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-7n2fj" podStartSLOduration=2.008089471 podStartE2EDuration="7.331660357s" podCreationTimestamp="2025-11-22 03:05:32 +0000 UTC" firstStartedPulling="2025-11-22 03:05:33.187910418 +0000 UTC m=+769.226432340" lastFinishedPulling="2025-11-22 03:05:38.511481314 +0000 UTC m=+774.550003226" observedRunningTime="2025-11-22 03:05:39.323154703 +0000 UTC m=+775.361676625" watchObservedRunningTime="2025-11-22 03:05:39.331660357 +0000 UTC m=+775.370182289" Nov 22 03:05:39 crc kubenswrapper[4922]: I1122 03:05:39.355412 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-tnkdx" podStartSLOduration=2.361823351 podStartE2EDuration="4.355379397s" podCreationTimestamp="2025-11-22 03:05:35 +0000 UTC" firstStartedPulling="2025-11-22 03:05:36.516324429 +0000 UTC m=+772.554846331" lastFinishedPulling="2025-11-22 03:05:38.509880465 +0000 UTC m=+774.548402377" observedRunningTime="2025-11-22 03:05:39.352533989 +0000 UTC m=+775.391055921" watchObservedRunningTime="2025-11-22 03:05:39.355379397 +0000 UTC m=+775.393901329" Nov 22 03:05:39 crc kubenswrapper[4922]: I1122 03:05:39.850309 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-7n2fj" Nov 22 03:05:40 crc kubenswrapper[4922]: I1122 03:05:40.034407 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cw9z2\" (UniqueName: \"kubernetes.io/projected/a86b6301-b784-4174-a496-b68b96f5e35b-kube-api-access-cw9z2\") pod \"a86b6301-b784-4174-a496-b68b96f5e35b\" (UID: \"a86b6301-b784-4174-a496-b68b96f5e35b\") " Nov 22 03:05:40 crc kubenswrapper[4922]: I1122 03:05:40.042920 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a86b6301-b784-4174-a496-b68b96f5e35b-kube-api-access-cw9z2" (OuterVolumeSpecName: "kube-api-access-cw9z2") pod "a86b6301-b784-4174-a496-b68b96f5e35b" (UID: "a86b6301-b784-4174-a496-b68b96f5e35b"). InnerVolumeSpecName "kube-api-access-cw9z2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:05:40 crc kubenswrapper[4922]: I1122 03:05:40.136716 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cw9z2\" (UniqueName: \"kubernetes.io/projected/a86b6301-b784-4174-a496-b68b96f5e35b-kube-api-access-cw9z2\") on node \"crc\" DevicePath \"\"" Nov 22 03:05:40 crc kubenswrapper[4922]: I1122 03:05:40.311622 4922 generic.go:334] "Generic (PLEG): container finished" podID="a86b6301-b784-4174-a496-b68b96f5e35b" containerID="15e19c6c13f55e3c0404bf5d64c4734d1f9260437891ea549219dde772899970" exitCode=0 Nov 22 03:05:40 crc kubenswrapper[4922]: I1122 03:05:40.311668 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-7n2fj" Nov 22 03:05:40 crc kubenswrapper[4922]: I1122 03:05:40.311719 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7n2fj" event={"ID":"a86b6301-b784-4174-a496-b68b96f5e35b","Type":"ContainerDied","Data":"15e19c6c13f55e3c0404bf5d64c4734d1f9260437891ea549219dde772899970"} Nov 22 03:05:40 crc kubenswrapper[4922]: I1122 03:05:40.311778 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7n2fj" event={"ID":"a86b6301-b784-4174-a496-b68b96f5e35b","Type":"ContainerDied","Data":"6868a831cbec20fe67139696dc14334f612c135040b49b539f256700e5fa159a"} Nov 22 03:05:40 crc kubenswrapper[4922]: I1122 03:05:40.311808 4922 scope.go:117] "RemoveContainer" containerID="15e19c6c13f55e3c0404bf5d64c4734d1f9260437891ea549219dde772899970" Nov 22 03:05:40 crc kubenswrapper[4922]: I1122 03:05:40.340827 4922 scope.go:117] "RemoveContainer" containerID="15e19c6c13f55e3c0404bf5d64c4734d1f9260437891ea549219dde772899970" Nov 22 03:05:40 crc kubenswrapper[4922]: E1122 03:05:40.341531 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15e19c6c13f55e3c0404bf5d64c4734d1f9260437891ea549219dde772899970\": container with ID starting with 15e19c6c13f55e3c0404bf5d64c4734d1f9260437891ea549219dde772899970 not found: ID does not exist" containerID="15e19c6c13f55e3c0404bf5d64c4734d1f9260437891ea549219dde772899970" Nov 22 03:05:40 crc kubenswrapper[4922]: I1122 03:05:40.341607 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15e19c6c13f55e3c0404bf5d64c4734d1f9260437891ea549219dde772899970"} err="failed to get container status \"15e19c6c13f55e3c0404bf5d64c4734d1f9260437891ea549219dde772899970\": rpc error: code = NotFound desc = could not find container 
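
The E/I pair above ("ContainerStatus from runtime service failed" followed by "DeleteContainer returned error") is benign: the first RemoveContainer already succeeded, so the follow-up status lookup for the same ID gets gRPC NotFound. A sketch of the usual idempotent-delete pattern is below; checkErr is an invented helper for illustration, not kubelet code, and the simulated error text is taken from the log.

```go
// Treating gRPC NotFound as "already deleted" makes container removal
// idempotent, which is why the error above is logged but harmless.
package main

import (
	"errors"
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

func checkErr(err error) {
	if status.Code(err) == codes.NotFound {
		fmt.Println("container already removed; nothing to do")
		return
	}
	fmt.Println("real failure:", err)
}

func main() {
	// Simulate the runtime's reply seen in the log above:
	checkErr(status.Error(codes.NotFound, `could not find container "15e19c6c..."`))
	// Any other error would still be treated as a genuine failure:
	checkErr(errors.New("rpc error: code = Unavailable desc = runtime down"))
}
```
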
\"15e19c6c13f55e3c0404bf5d64c4734d1f9260437891ea549219dde772899970\": container with ID starting with 15e19c6c13f55e3c0404bf5d64c4734d1f9260437891ea549219dde772899970 not found: ID does not exist" Nov 22 03:05:40 crc kubenswrapper[4922]: I1122 03:05:40.349362 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-7n2fj"] Nov 22 03:05:40 crc kubenswrapper[4922]: I1122 03:05:40.358695 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-7n2fj"] Nov 22 03:05:41 crc kubenswrapper[4922]: I1122 03:05:41.314173 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a86b6301-b784-4174-a496-b68b96f5e35b" path="/var/lib/kubelet/pods/a86b6301-b784-4174-a496-b68b96f5e35b/volumes" Nov 22 03:05:45 crc kubenswrapper[4922]: I1122 03:05:45.467783 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-bjqlk" Nov 22 03:05:45 crc kubenswrapper[4922]: I1122 03:05:45.475110 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-tnkdx" Nov 22 03:05:45 crc kubenswrapper[4922]: I1122 03:05:45.475156 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-tnkdx" Nov 22 03:05:45 crc kubenswrapper[4922]: I1122 03:05:45.537457 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-tnkdx" Nov 22 03:05:46 crc kubenswrapper[4922]: I1122 03:05:46.412284 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-tnkdx" Nov 22 03:05:58 crc kubenswrapper[4922]: I1122 03:05:58.789833 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/1ef56ccb6d133047ae178fe17bc2d57b31708a59c7aeccefc8f9c21e077nwnp"] Nov 22 03:05:58 crc kubenswrapper[4922]: E1122 03:05:58.791208 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a86b6301-b784-4174-a496-b68b96f5e35b" containerName="registry-server" Nov 22 03:05:58 crc kubenswrapper[4922]: I1122 03:05:58.791229 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="a86b6301-b784-4174-a496-b68b96f5e35b" containerName="registry-server" Nov 22 03:05:58 crc kubenswrapper[4922]: I1122 03:05:58.791479 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="a86b6301-b784-4174-a496-b68b96f5e35b" containerName="registry-server" Nov 22 03:05:58 crc kubenswrapper[4922]: I1122 03:05:58.792889 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/1ef56ccb6d133047ae178fe17bc2d57b31708a59c7aeccefc8f9c21e077nwnp" Nov 22 03:05:58 crc kubenswrapper[4922]: I1122 03:05:58.796536 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-2h6qm" Nov 22 03:05:58 crc kubenswrapper[4922]: I1122 03:05:58.812156 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/1ef56ccb6d133047ae178fe17bc2d57b31708a59c7aeccefc8f9c21e077nwnp"] Nov 22 03:05:58 crc kubenswrapper[4922]: I1122 03:05:58.848988 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b-util\") pod \"1ef56ccb6d133047ae178fe17bc2d57b31708a59c7aeccefc8f9c21e077nwnp\" (UID: \"4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b\") " pod="openstack-operators/1ef56ccb6d133047ae178fe17bc2d57b31708a59c7aeccefc8f9c21e077nwnp" Nov 22 03:05:58 crc kubenswrapper[4922]: I1122 03:05:58.849061 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b-bundle\") pod \"1ef56ccb6d133047ae178fe17bc2d57b31708a59c7aeccefc8f9c21e077nwnp\" (UID: \"4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b\") " pod="openstack-operators/1ef56ccb6d133047ae178fe17bc2d57b31708a59c7aeccefc8f9c21e077nwnp" Nov 22 03:05:58 crc kubenswrapper[4922]: I1122 03:05:58.849235 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g87gs\" (UniqueName: \"kubernetes.io/projected/4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b-kube-api-access-g87gs\") pod \"1ef56ccb6d133047ae178fe17bc2d57b31708a59c7aeccefc8f9c21e077nwnp\" (UID: \"4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b\") " pod="openstack-operators/1ef56ccb6d133047ae178fe17bc2d57b31708a59c7aeccefc8f9c21e077nwnp" Nov 22 03:05:58 crc kubenswrapper[4922]: I1122 03:05:58.950967 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g87gs\" (UniqueName: \"kubernetes.io/projected/4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b-kube-api-access-g87gs\") pod \"1ef56ccb6d133047ae178fe17bc2d57b31708a59c7aeccefc8f9c21e077nwnp\" (UID: \"4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b\") " pod="openstack-operators/1ef56ccb6d133047ae178fe17bc2d57b31708a59c7aeccefc8f9c21e077nwnp" Nov 22 03:05:58 crc kubenswrapper[4922]: I1122 03:05:58.951117 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b-util\") pod \"1ef56ccb6d133047ae178fe17bc2d57b31708a59c7aeccefc8f9c21e077nwnp\" (UID: \"4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b\") " pod="openstack-operators/1ef56ccb6d133047ae178fe17bc2d57b31708a59c7aeccefc8f9c21e077nwnp" Nov 22 03:05:58 crc kubenswrapper[4922]: I1122 03:05:58.951153 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b-bundle\") pod \"1ef56ccb6d133047ae178fe17bc2d57b31708a59c7aeccefc8f9c21e077nwnp\" (UID: \"4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b\") " pod="openstack-operators/1ef56ccb6d133047ae178fe17bc2d57b31708a59c7aeccefc8f9c21e077nwnp" Nov 22 03:05:58 crc kubenswrapper[4922]: I1122 03:05:58.952279 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b-util\") pod \"1ef56ccb6d133047ae178fe17bc2d57b31708a59c7aeccefc8f9c21e077nwnp\" (UID: \"4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b\") " pod="openstack-operators/1ef56ccb6d133047ae178fe17bc2d57b31708a59c7aeccefc8f9c21e077nwnp" Nov 22 03:05:58 crc kubenswrapper[4922]: I1122 03:05:58.952719 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b-bundle\") pod \"1ef56ccb6d133047ae178fe17bc2d57b31708a59c7aeccefc8f9c21e077nwnp\" (UID: \"4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b\") " pod="openstack-operators/1ef56ccb6d133047ae178fe17bc2d57b31708a59c7aeccefc8f9c21e077nwnp" Nov 22 03:05:58 crc kubenswrapper[4922]: I1122 03:05:58.985675 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g87gs\" (UniqueName: \"kubernetes.io/projected/4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b-kube-api-access-g87gs\") pod \"1ef56ccb6d133047ae178fe17bc2d57b31708a59c7aeccefc8f9c21e077nwnp\" (UID: \"4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b\") " pod="openstack-operators/1ef56ccb6d133047ae178fe17bc2d57b31708a59c7aeccefc8f9c21e077nwnp" Nov 22 03:05:59 crc kubenswrapper[4922]: I1122 03:05:59.129619 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/1ef56ccb6d133047ae178fe17bc2d57b31708a59c7aeccefc8f9c21e077nwnp" Nov 22 03:05:59 crc kubenswrapper[4922]: I1122 03:05:59.672686 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/1ef56ccb6d133047ae178fe17bc2d57b31708a59c7aeccefc8f9c21e077nwnp"] Nov 22 03:06:00 crc kubenswrapper[4922]: I1122 03:06:00.473059 4922 generic.go:334] "Generic (PLEG): container finished" podID="4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b" containerID="ae1f5cd1c8a0e6e68e944d90645870bfd4cacce03a2fcf770d97d44f903683b5" exitCode=0 Nov 22 03:06:00 crc kubenswrapper[4922]: I1122 03:06:00.473163 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1ef56ccb6d133047ae178fe17bc2d57b31708a59c7aeccefc8f9c21e077nwnp" event={"ID":"4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b","Type":"ContainerDied","Data":"ae1f5cd1c8a0e6e68e944d90645870bfd4cacce03a2fcf770d97d44f903683b5"} Nov 22 03:06:00 crc kubenswrapper[4922]: I1122 03:06:00.473220 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1ef56ccb6d133047ae178fe17bc2d57b31708a59c7aeccefc8f9c21e077nwnp" event={"ID":"4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b","Type":"ContainerStarted","Data":"651ac6762e1b37f95871808297ea92788b15ef44278012f05bc5187987fb1af3"} Nov 22 03:06:01 crc kubenswrapper[4922]: I1122 03:06:01.487280 4922 generic.go:334] "Generic (PLEG): container finished" podID="4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b" containerID="1e79d06c5adf883736ffef4cce2ea751cd28cc37da22d7aa189ac947558beecf" exitCode=0 Nov 22 03:06:01 crc kubenswrapper[4922]: I1122 03:06:01.487390 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1ef56ccb6d133047ae178fe17bc2d57b31708a59c7aeccefc8f9c21e077nwnp" event={"ID":"4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b","Type":"ContainerDied","Data":"1e79d06c5adf883736ffef4cce2ea751cd28cc37da22d7aa189ac947558beecf"} Nov 22 03:06:02 crc kubenswrapper[4922]: I1122 03:06:02.500670 4922 generic.go:334] "Generic (PLEG): container finished" podID="4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b" containerID="c15c7b3c25e85b19641e2bd48940aa30dc7e1e891fa69366a609807693339efb" exitCode=0 Nov 22 03:06:02 crc kubenswrapper[4922]: I1122 03:06:02.500788 4922 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1ef56ccb6d133047ae178fe17bc2d57b31708a59c7aeccefc8f9c21e077nwnp" event={"ID":"4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b","Type":"ContainerDied","Data":"c15c7b3c25e85b19641e2bd48940aa30dc7e1e891fa69366a609807693339efb"} Nov 22 03:06:03 crc kubenswrapper[4922]: I1122 03:06:03.886765 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/1ef56ccb6d133047ae178fe17bc2d57b31708a59c7aeccefc8f9c21e077nwnp" Nov 22 03:06:03 crc kubenswrapper[4922]: I1122 03:06:03.932460 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b-bundle\") pod \"4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b\" (UID: \"4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b\") " Nov 22 03:06:03 crc kubenswrapper[4922]: I1122 03:06:03.933206 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g87gs\" (UniqueName: \"kubernetes.io/projected/4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b-kube-api-access-g87gs\") pod \"4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b\" (UID: \"4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b\") " Nov 22 03:06:03 crc kubenswrapper[4922]: I1122 03:06:03.933282 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b-util\") pod \"4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b\" (UID: \"4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b\") " Nov 22 03:06:03 crc kubenswrapper[4922]: I1122 03:06:03.933703 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b-bundle" (OuterVolumeSpecName: "bundle") pod "4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b" (UID: "4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:06:03 crc kubenswrapper[4922]: I1122 03:06:03.942142 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b-kube-api-access-g87gs" (OuterVolumeSpecName: "kube-api-access-g87gs") pod "4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b" (UID: "4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b"). InnerVolumeSpecName "kube-api-access-g87gs". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:06:03 crc kubenswrapper[4922]: I1122 03:06:03.963518 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b-util" (OuterVolumeSpecName: "util") pod "4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b" (UID: "4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:06:04 crc kubenswrapper[4922]: I1122 03:06:04.035823 4922 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:06:04 crc kubenswrapper[4922]: I1122 03:06:04.036296 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g87gs\" (UniqueName: \"kubernetes.io/projected/4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b-kube-api-access-g87gs\") on node \"crc\" DevicePath \"\"" Nov 22 03:06:04 crc kubenswrapper[4922]: I1122 03:06:04.036429 4922 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b-util\") on node \"crc\" DevicePath \"\"" Nov 22 03:06:04 crc kubenswrapper[4922]: I1122 03:06:04.522990 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1ef56ccb6d133047ae178fe17bc2d57b31708a59c7aeccefc8f9c21e077nwnp" event={"ID":"4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b","Type":"ContainerDied","Data":"651ac6762e1b37f95871808297ea92788b15ef44278012f05bc5187987fb1af3"} Nov 22 03:06:04 crc kubenswrapper[4922]: I1122 03:06:04.523465 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="651ac6762e1b37f95871808297ea92788b15ef44278012f05bc5187987fb1af3" Nov 22 03:06:04 crc kubenswrapper[4922]: I1122 03:06:04.523073 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/1ef56ccb6d133047ae178fe17bc2d57b31708a59c7aeccefc8f9c21e077nwnp" Nov 22 03:06:05 crc kubenswrapper[4922]: I1122 03:06:05.142419 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mnpjh"] Nov 22 03:06:05 crc kubenswrapper[4922]: E1122 03:06:05.142786 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b" containerName="util" Nov 22 03:06:05 crc kubenswrapper[4922]: I1122 03:06:05.142809 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b" containerName="util" Nov 22 03:06:05 crc kubenswrapper[4922]: E1122 03:06:05.142841 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b" containerName="pull" Nov 22 03:06:05 crc kubenswrapper[4922]: I1122 03:06:05.142888 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b" containerName="pull" Nov 22 03:06:05 crc kubenswrapper[4922]: E1122 03:06:05.142914 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b" containerName="extract" Nov 22 03:06:05 crc kubenswrapper[4922]: I1122 03:06:05.142927 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b" containerName="extract" Nov 22 03:06:05 crc kubenswrapper[4922]: I1122 03:06:05.143136 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b" containerName="extract" Nov 22 03:06:05 crc kubenswrapper[4922]: I1122 03:06:05.144599 4922 util.go:30] "No sandbox for pod can be found. 
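
The mount/unmount arc above (VerifyControllerAttachedVolume and MountVolume.SetUp when the bundle pod is admitted, then UnmountVolume.TearDown and "Volume detached" once it finishes) is the kubelet volume manager reconciling desired state against actual state. The schematic below captures only that pattern; reconcile, desired, and actual are invented names for illustration, not kubelet data structures.

```go
// Schematic desired-vs-actual reconciliation: mount what the pod spec
// wants but is not mounted, unmount what is mounted but no longer wanted.
package main

import "fmt"

func reconcile(desired, actual map[string]bool) {
	for v := range desired {
		if !actual[v] {
			fmt.Printf("operationExecutor.MountVolume started for volume %q\n", v)
			actual[v] = true // MountVolume.SetUp succeeded
		}
	}
	for v := range actual {
		if !desired[v] {
			fmt.Printf("operationExecutor.UnmountVolume started for volume %q\n", v)
			delete(actual, v) // UnmountVolume.TearDown succeeded -> "Volume detached"
		}
	}
}

func main() {
	actual := map[string]bool{}
	// Pod admitted: its three volumes appear in desired state.
	desired := map[string]bool{"util": true, "bundle": true, "kube-api-access-g87gs": true}
	reconcile(desired, actual)
	// Pod deleted: desired state empties and everything is torn down.
	reconcile(map[string]bool{}, actual)
}
```
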
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mnpjh" Nov 22 03:06:05 crc kubenswrapper[4922]: I1122 03:06:05.168886 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mnpjh"] Nov 22 03:06:05 crc kubenswrapper[4922]: I1122 03:06:05.262930 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqnq7\" (UniqueName: \"kubernetes.io/projected/3abd79d4-bbcb-4d1f-a015-8a385d7846d1-kube-api-access-kqnq7\") pod \"redhat-marketplace-mnpjh\" (UID: \"3abd79d4-bbcb-4d1f-a015-8a385d7846d1\") " pod="openshift-marketplace/redhat-marketplace-mnpjh" Nov 22 03:06:05 crc kubenswrapper[4922]: I1122 03:06:05.263021 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3abd79d4-bbcb-4d1f-a015-8a385d7846d1-catalog-content\") pod \"redhat-marketplace-mnpjh\" (UID: \"3abd79d4-bbcb-4d1f-a015-8a385d7846d1\") " pod="openshift-marketplace/redhat-marketplace-mnpjh" Nov 22 03:06:05 crc kubenswrapper[4922]: I1122 03:06:05.263222 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3abd79d4-bbcb-4d1f-a015-8a385d7846d1-utilities\") pod \"redhat-marketplace-mnpjh\" (UID: \"3abd79d4-bbcb-4d1f-a015-8a385d7846d1\") " pod="openshift-marketplace/redhat-marketplace-mnpjh" Nov 22 03:06:05 crc kubenswrapper[4922]: I1122 03:06:05.364672 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqnq7\" (UniqueName: \"kubernetes.io/projected/3abd79d4-bbcb-4d1f-a015-8a385d7846d1-kube-api-access-kqnq7\") pod \"redhat-marketplace-mnpjh\" (UID: \"3abd79d4-bbcb-4d1f-a015-8a385d7846d1\") " pod="openshift-marketplace/redhat-marketplace-mnpjh" Nov 22 03:06:05 crc kubenswrapper[4922]: I1122 03:06:05.364727 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3abd79d4-bbcb-4d1f-a015-8a385d7846d1-catalog-content\") pod \"redhat-marketplace-mnpjh\" (UID: \"3abd79d4-bbcb-4d1f-a015-8a385d7846d1\") " pod="openshift-marketplace/redhat-marketplace-mnpjh" Nov 22 03:06:05 crc kubenswrapper[4922]: I1122 03:06:05.364810 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3abd79d4-bbcb-4d1f-a015-8a385d7846d1-utilities\") pod \"redhat-marketplace-mnpjh\" (UID: \"3abd79d4-bbcb-4d1f-a015-8a385d7846d1\") " pod="openshift-marketplace/redhat-marketplace-mnpjh" Nov 22 03:06:05 crc kubenswrapper[4922]: I1122 03:06:05.365302 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3abd79d4-bbcb-4d1f-a015-8a385d7846d1-catalog-content\") pod \"redhat-marketplace-mnpjh\" (UID: \"3abd79d4-bbcb-4d1f-a015-8a385d7846d1\") " pod="openshift-marketplace/redhat-marketplace-mnpjh" Nov 22 03:06:05 crc kubenswrapper[4922]: I1122 03:06:05.365437 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3abd79d4-bbcb-4d1f-a015-8a385d7846d1-utilities\") pod \"redhat-marketplace-mnpjh\" (UID: \"3abd79d4-bbcb-4d1f-a015-8a385d7846d1\") " pod="openshift-marketplace/redhat-marketplace-mnpjh" Nov 22 03:06:05 crc kubenswrapper[4922]: I1122 03:06:05.393086 4922 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-kqnq7\" (UniqueName: \"kubernetes.io/projected/3abd79d4-bbcb-4d1f-a015-8a385d7846d1-kube-api-access-kqnq7\") pod \"redhat-marketplace-mnpjh\" (UID: \"3abd79d4-bbcb-4d1f-a015-8a385d7846d1\") " pod="openshift-marketplace/redhat-marketplace-mnpjh" Nov 22 03:06:05 crc kubenswrapper[4922]: I1122 03:06:05.472993 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mnpjh" Nov 22 03:06:05 crc kubenswrapper[4922]: I1122 03:06:05.980313 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mnpjh"] Nov 22 03:06:05 crc kubenswrapper[4922]: W1122 03:06:05.985905 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3abd79d4_bbcb_4d1f_a015_8a385d7846d1.slice/crio-736fef2c3cd8869b82d126c67820e3b696306dbc57a68465ddcefd755dc20978 WatchSource:0}: Error finding container 736fef2c3cd8869b82d126c67820e3b696306dbc57a68465ddcefd755dc20978: Status 404 returned error can't find the container with id 736fef2c3cd8869b82d126c67820e3b696306dbc57a68465ddcefd755dc20978 Nov 22 03:06:06 crc kubenswrapper[4922]: I1122 03:06:06.543325 4922 generic.go:334] "Generic (PLEG): container finished" podID="3abd79d4-bbcb-4d1f-a015-8a385d7846d1" containerID="ef4b5bb5384defe75059dbd2e1c7d3d028bf8e81424abea1f7df5cb62daf2633" exitCode=0 Nov 22 03:06:06 crc kubenswrapper[4922]: I1122 03:06:06.543380 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mnpjh" event={"ID":"3abd79d4-bbcb-4d1f-a015-8a385d7846d1","Type":"ContainerDied","Data":"ef4b5bb5384defe75059dbd2e1c7d3d028bf8e81424abea1f7df5cb62daf2633"} Nov 22 03:06:06 crc kubenswrapper[4922]: I1122 03:06:06.543442 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mnpjh" event={"ID":"3abd79d4-bbcb-4d1f-a015-8a385d7846d1","Type":"ContainerStarted","Data":"736fef2c3cd8869b82d126c67820e3b696306dbc57a68465ddcefd755dc20978"} Nov 22 03:06:07 crc kubenswrapper[4922]: I1122 03:06:07.019168 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-8495cbd6cf-sz4cl"] Nov 22 03:06:07 crc kubenswrapper[4922]: I1122 03:06:07.020761 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-8495cbd6cf-sz4cl" Nov 22 03:06:07 crc kubenswrapper[4922]: I1122 03:06:07.023041 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-9nvjr" Nov 22 03:06:07 crc kubenswrapper[4922]: I1122 03:06:07.057739 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-8495cbd6cf-sz4cl"] Nov 22 03:06:07 crc kubenswrapper[4922]: I1122 03:06:07.195409 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znn6b\" (UniqueName: \"kubernetes.io/projected/0c4effce-3b89-4b2b-bf5b-e8d6139b7d6b-kube-api-access-znn6b\") pod \"openstack-operator-controller-operator-8495cbd6cf-sz4cl\" (UID: \"0c4effce-3b89-4b2b-bf5b-e8d6139b7d6b\") " pod="openstack-operators/openstack-operator-controller-operator-8495cbd6cf-sz4cl" Nov 22 03:06:07 crc kubenswrapper[4922]: I1122 03:06:07.296945 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znn6b\" (UniqueName: \"kubernetes.io/projected/0c4effce-3b89-4b2b-bf5b-e8d6139b7d6b-kube-api-access-znn6b\") pod \"openstack-operator-controller-operator-8495cbd6cf-sz4cl\" (UID: \"0c4effce-3b89-4b2b-bf5b-e8d6139b7d6b\") " pod="openstack-operators/openstack-operator-controller-operator-8495cbd6cf-sz4cl" Nov 22 03:06:07 crc kubenswrapper[4922]: I1122 03:06:07.333441 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znn6b\" (UniqueName: \"kubernetes.io/projected/0c4effce-3b89-4b2b-bf5b-e8d6139b7d6b-kube-api-access-znn6b\") pod \"openstack-operator-controller-operator-8495cbd6cf-sz4cl\" (UID: \"0c4effce-3b89-4b2b-bf5b-e8d6139b7d6b\") " pod="openstack-operators/openstack-operator-controller-operator-8495cbd6cf-sz4cl" Nov 22 03:06:07 crc kubenswrapper[4922]: I1122 03:06:07.339528 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-8495cbd6cf-sz4cl" Nov 22 03:06:07 crc kubenswrapper[4922]: I1122 03:06:07.626148 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-8495cbd6cf-sz4cl"] Nov 22 03:06:08 crc kubenswrapper[4922]: I1122 03:06:08.559772 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-8495cbd6cf-sz4cl" event={"ID":"0c4effce-3b89-4b2b-bf5b-e8d6139b7d6b","Type":"ContainerStarted","Data":"1bff90af1938333fa97a0227db8bf47ab47edd583ab5add33c9651d1bb4c9b3e"} Nov 22 03:06:08 crc kubenswrapper[4922]: I1122 03:06:08.562640 4922 generic.go:334] "Generic (PLEG): container finished" podID="3abd79d4-bbcb-4d1f-a015-8a385d7846d1" containerID="a8340c91cd433852c130f0bf0d56fa6523bb6a9eb4661b3746c87824e5b0e417" exitCode=0 Nov 22 03:06:08 crc kubenswrapper[4922]: I1122 03:06:08.562670 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mnpjh" event={"ID":"3abd79d4-bbcb-4d1f-a015-8a385d7846d1","Type":"ContainerDied","Data":"a8340c91cd433852c130f0bf0d56fa6523bb6a9eb4661b3746c87824e5b0e417"} Nov 22 03:06:12 crc kubenswrapper[4922]: I1122 03:06:12.594005 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mnpjh" event={"ID":"3abd79d4-bbcb-4d1f-a015-8a385d7846d1","Type":"ContainerStarted","Data":"3b37698f288bb4918ca6c221cc582abecde8673e50b09b034489fa1fc9c1f5ad"} Nov 22 03:06:12 crc kubenswrapper[4922]: I1122 03:06:12.596931 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-8495cbd6cf-sz4cl" event={"ID":"0c4effce-3b89-4b2b-bf5b-e8d6139b7d6b","Type":"ContainerStarted","Data":"37ffb0ceb13c3cfff2d716270caee0929ef245181d3aab51dd4f6ca31015c8f3"} Nov 22 03:06:12 crc kubenswrapper[4922]: I1122 03:06:12.618002 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mnpjh" podStartSLOduration=2.271397175 podStartE2EDuration="7.617981085s" podCreationTimestamp="2025-11-22 03:06:05 +0000 UTC" firstStartedPulling="2025-11-22 03:06:06.547579929 +0000 UTC m=+802.586101821" lastFinishedPulling="2025-11-22 03:06:11.894163829 +0000 UTC m=+807.932685731" observedRunningTime="2025-11-22 03:06:12.616972692 +0000 UTC m=+808.655494584" watchObservedRunningTime="2025-11-22 03:06:12.617981085 +0000 UTC m=+808.656502987" Nov 22 03:06:13 crc kubenswrapper[4922]: I1122 03:06:13.537798 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-srvcg"] Nov 22 03:06:13 crc kubenswrapper[4922]: I1122 03:06:13.546149 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-srvcg" Nov 22 03:06:13 crc kubenswrapper[4922]: I1122 03:06:13.553878 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-srvcg"] Nov 22 03:06:13 crc kubenswrapper[4922]: I1122 03:06:13.698518 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5jv8\" (UniqueName: \"kubernetes.io/projected/8986cde2-d986-4da6-b75a-04fa39a5c4e8-kube-api-access-w5jv8\") pod \"community-operators-srvcg\" (UID: \"8986cde2-d986-4da6-b75a-04fa39a5c4e8\") " pod="openshift-marketplace/community-operators-srvcg" Nov 22 03:06:13 crc kubenswrapper[4922]: I1122 03:06:13.698672 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8986cde2-d986-4da6-b75a-04fa39a5c4e8-catalog-content\") pod \"community-operators-srvcg\" (UID: \"8986cde2-d986-4da6-b75a-04fa39a5c4e8\") " pod="openshift-marketplace/community-operators-srvcg" Nov 22 03:06:13 crc kubenswrapper[4922]: I1122 03:06:13.698720 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8986cde2-d986-4da6-b75a-04fa39a5c4e8-utilities\") pod \"community-operators-srvcg\" (UID: \"8986cde2-d986-4da6-b75a-04fa39a5c4e8\") " pod="openshift-marketplace/community-operators-srvcg" Nov 22 03:06:13 crc kubenswrapper[4922]: I1122 03:06:13.799731 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8986cde2-d986-4da6-b75a-04fa39a5c4e8-utilities\") pod \"community-operators-srvcg\" (UID: \"8986cde2-d986-4da6-b75a-04fa39a5c4e8\") " pod="openshift-marketplace/community-operators-srvcg" Nov 22 03:06:13 crc kubenswrapper[4922]: I1122 03:06:13.799815 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5jv8\" (UniqueName: \"kubernetes.io/projected/8986cde2-d986-4da6-b75a-04fa39a5c4e8-kube-api-access-w5jv8\") pod \"community-operators-srvcg\" (UID: \"8986cde2-d986-4da6-b75a-04fa39a5c4e8\") " pod="openshift-marketplace/community-operators-srvcg" Nov 22 03:06:13 crc kubenswrapper[4922]: I1122 03:06:13.799935 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8986cde2-d986-4da6-b75a-04fa39a5c4e8-catalog-content\") pod \"community-operators-srvcg\" (UID: \"8986cde2-d986-4da6-b75a-04fa39a5c4e8\") " pod="openshift-marketplace/community-operators-srvcg" Nov 22 03:06:13 crc kubenswrapper[4922]: I1122 03:06:13.800517 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8986cde2-d986-4da6-b75a-04fa39a5c4e8-catalog-content\") pod \"community-operators-srvcg\" (UID: \"8986cde2-d986-4da6-b75a-04fa39a5c4e8\") " pod="openshift-marketplace/community-operators-srvcg" Nov 22 03:06:13 crc kubenswrapper[4922]: I1122 03:06:13.800564 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8986cde2-d986-4da6-b75a-04fa39a5c4e8-utilities\") pod \"community-operators-srvcg\" (UID: \"8986cde2-d986-4da6-b75a-04fa39a5c4e8\") " pod="openshift-marketplace/community-operators-srvcg" Nov 22 03:06:13 crc kubenswrapper[4922]: I1122 03:06:13.825003 4922 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-w5jv8\" (UniqueName: \"kubernetes.io/projected/8986cde2-d986-4da6-b75a-04fa39a5c4e8-kube-api-access-w5jv8\") pod \"community-operators-srvcg\" (UID: \"8986cde2-d986-4da6-b75a-04fa39a5c4e8\") " pod="openshift-marketplace/community-operators-srvcg" Nov 22 03:06:13 crc kubenswrapper[4922]: I1122 03:06:13.867868 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-srvcg" Nov 22 03:06:14 crc kubenswrapper[4922]: I1122 03:06:14.581978 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-srvcg"] Nov 22 03:06:15 crc kubenswrapper[4922]: I1122 03:06:15.474125 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mnpjh" Nov 22 03:06:15 crc kubenswrapper[4922]: I1122 03:06:15.474675 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mnpjh" Nov 22 03:06:15 crc kubenswrapper[4922]: I1122 03:06:15.550974 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mnpjh" Nov 22 03:06:15 crc kubenswrapper[4922]: I1122 03:06:15.624088 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-8495cbd6cf-sz4cl" event={"ID":"0c4effce-3b89-4b2b-bf5b-e8d6139b7d6b","Type":"ContainerStarted","Data":"71b5f604a59bf1c98786423fe7bcc94fbd4da0199148770cee130c27bf73e512"} Nov 22 03:06:15 crc kubenswrapper[4922]: I1122 03:06:15.624276 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-8495cbd6cf-sz4cl" Nov 22 03:06:15 crc kubenswrapper[4922]: I1122 03:06:15.627158 4922 generic.go:334] "Generic (PLEG): container finished" podID="8986cde2-d986-4da6-b75a-04fa39a5c4e8" containerID="d1c08db8286819ef3969c393c726a9aaca12d9e0e3bc768df8fcf57ea0ffe173" exitCode=0 Nov 22 03:06:15 crc kubenswrapper[4922]: I1122 03:06:15.627565 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-srvcg" event={"ID":"8986cde2-d986-4da6-b75a-04fa39a5c4e8","Type":"ContainerDied","Data":"d1c08db8286819ef3969c393c726a9aaca12d9e0e3bc768df8fcf57ea0ffe173"} Nov 22 03:06:15 crc kubenswrapper[4922]: I1122 03:06:15.627642 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-srvcg" event={"ID":"8986cde2-d986-4da6-b75a-04fa39a5c4e8","Type":"ContainerStarted","Data":"eba42b294e0d23fbac84952f298caab1ea8f8713a22312b232c9473d99ddf59e"} Nov 22 03:06:15 crc kubenswrapper[4922]: I1122 03:06:15.677198 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-8495cbd6cf-sz4cl" podStartSLOduration=2.385318909 podStartE2EDuration="9.677165845s" podCreationTimestamp="2025-11-22 03:06:06 +0000 UTC" firstStartedPulling="2025-11-22 03:06:07.64223747 +0000 UTC m=+803.680759362" lastFinishedPulling="2025-11-22 03:06:14.934084396 +0000 UTC m=+810.972606298" observedRunningTime="2025-11-22 03:06:15.667808461 +0000 UTC m=+811.706330353" watchObservedRunningTime="2025-11-22 03:06:15.677165845 +0000 UTC m=+811.715687777" Nov 22 03:06:16 crc kubenswrapper[4922]: I1122 03:06:16.639029 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-srvcg" 
event={"ID":"8986cde2-d986-4da6-b75a-04fa39a5c4e8","Type":"ContainerStarted","Data":"fe04469afa3d52665831b25413694c449f03a3e4b9d85a30b95ed65bc63d5589"} Nov 22 03:06:17 crc kubenswrapper[4922]: I1122 03:06:17.343734 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-8495cbd6cf-sz4cl" Nov 22 03:06:17 crc kubenswrapper[4922]: I1122 03:06:17.657318 4922 generic.go:334] "Generic (PLEG): container finished" podID="8986cde2-d986-4da6-b75a-04fa39a5c4e8" containerID="fe04469afa3d52665831b25413694c449f03a3e4b9d85a30b95ed65bc63d5589" exitCode=0 Nov 22 03:06:17 crc kubenswrapper[4922]: I1122 03:06:17.657451 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-srvcg" event={"ID":"8986cde2-d986-4da6-b75a-04fa39a5c4e8","Type":"ContainerDied","Data":"fe04469afa3d52665831b25413694c449f03a3e4b9d85a30b95ed65bc63d5589"} Nov 22 03:06:18 crc kubenswrapper[4922]: I1122 03:06:18.665290 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-srvcg" event={"ID":"8986cde2-d986-4da6-b75a-04fa39a5c4e8","Type":"ContainerStarted","Data":"8ff4b17773fa063e8670eee55a7519c35530df16c7ad926a95ce370d31858f91"} Nov 22 03:06:18 crc kubenswrapper[4922]: I1122 03:06:18.690527 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-srvcg" podStartSLOduration=3.273959261 podStartE2EDuration="5.690509824s" podCreationTimestamp="2025-11-22 03:06:13 +0000 UTC" firstStartedPulling="2025-11-22 03:06:15.62951783 +0000 UTC m=+811.668039762" lastFinishedPulling="2025-11-22 03:06:18.046068403 +0000 UTC m=+814.084590325" observedRunningTime="2025-11-22 03:06:18.689006768 +0000 UTC m=+814.727528700" watchObservedRunningTime="2025-11-22 03:06:18.690509824 +0000 UTC m=+814.729031716" Nov 22 03:06:23 crc kubenswrapper[4922]: I1122 03:06:23.869773 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-srvcg" Nov 22 03:06:23 crc kubenswrapper[4922]: I1122 03:06:23.870217 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-srvcg" Nov 22 03:06:23 crc kubenswrapper[4922]: I1122 03:06:23.952670 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-srvcg" Nov 22 03:06:24 crc kubenswrapper[4922]: I1122 03:06:24.763652 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-srvcg" Nov 22 03:06:24 crc kubenswrapper[4922]: I1122 03:06:24.826770 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-srvcg"] Nov 22 03:06:25 crc kubenswrapper[4922]: I1122 03:06:25.542374 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mnpjh" Nov 22 03:06:26 crc kubenswrapper[4922]: I1122 03:06:26.594622 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mnpjh"] Nov 22 03:06:26 crc kubenswrapper[4922]: I1122 03:06:26.594906 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mnpjh" podUID="3abd79d4-bbcb-4d1f-a015-8a385d7846d1" containerName="registry-server" containerID="cri-o://3b37698f288bb4918ca6c221cc582abecde8673e50b09b034489fa1fc9c1f5ad" 
Nov 22 03:06:26 crc kubenswrapper[4922]: I1122 03:06:26.725626 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-srvcg" podUID="8986cde2-d986-4da6-b75a-04fa39a5c4e8" containerName="registry-server" containerID="cri-o://8ff4b17773fa063e8670eee55a7519c35530df16c7ad926a95ce370d31858f91" gracePeriod=2
Nov 22 03:06:27 crc kubenswrapper[4922]: I1122 03:06:27.738535 4922 generic.go:334] "Generic (PLEG): container finished" podID="3abd79d4-bbcb-4d1f-a015-8a385d7846d1" containerID="3b37698f288bb4918ca6c221cc582abecde8673e50b09b034489fa1fc9c1f5ad" exitCode=0
Nov 22 03:06:27 crc kubenswrapper[4922]: I1122 03:06:27.738650 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mnpjh" event={"ID":"3abd79d4-bbcb-4d1f-a015-8a385d7846d1","Type":"ContainerDied","Data":"3b37698f288bb4918ca6c221cc582abecde8673e50b09b034489fa1fc9c1f5ad"}
Nov 22 03:06:28 crc kubenswrapper[4922]: I1122 03:06:28.215485 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mnpjh"
Nov 22 03:06:28 crc kubenswrapper[4922]: I1122 03:06:28.330168 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3abd79d4-bbcb-4d1f-a015-8a385d7846d1-utilities\") pod \"3abd79d4-bbcb-4d1f-a015-8a385d7846d1\" (UID: \"3abd79d4-bbcb-4d1f-a015-8a385d7846d1\") "
Nov 22 03:06:28 crc kubenswrapper[4922]: I1122 03:06:28.330224 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3abd79d4-bbcb-4d1f-a015-8a385d7846d1-catalog-content\") pod \"3abd79d4-bbcb-4d1f-a015-8a385d7846d1\" (UID: \"3abd79d4-bbcb-4d1f-a015-8a385d7846d1\") "
Nov 22 03:06:28 crc kubenswrapper[4922]: I1122 03:06:28.330284 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqnq7\" (UniqueName: \"kubernetes.io/projected/3abd79d4-bbcb-4d1f-a015-8a385d7846d1-kube-api-access-kqnq7\") pod \"3abd79d4-bbcb-4d1f-a015-8a385d7846d1\" (UID: \"3abd79d4-bbcb-4d1f-a015-8a385d7846d1\") "
Nov 22 03:06:28 crc kubenswrapper[4922]: I1122 03:06:28.330994 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3abd79d4-bbcb-4d1f-a015-8a385d7846d1-utilities" (OuterVolumeSpecName: "utilities") pod "3abd79d4-bbcb-4d1f-a015-8a385d7846d1" (UID: "3abd79d4-bbcb-4d1f-a015-8a385d7846d1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 03:06:28 crc kubenswrapper[4922]: I1122 03:06:28.343188 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3abd79d4-bbcb-4d1f-a015-8a385d7846d1-kube-api-access-kqnq7" (OuterVolumeSpecName: "kube-api-access-kqnq7") pod "3abd79d4-bbcb-4d1f-a015-8a385d7846d1" (UID: "3abd79d4-bbcb-4d1f-a015-8a385d7846d1"). InnerVolumeSpecName "kube-api-access-kqnq7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 03:06:28 crc kubenswrapper[4922]: I1122 03:06:28.353914 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3abd79d4-bbcb-4d1f-a015-8a385d7846d1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3abd79d4-bbcb-4d1f-a015-8a385d7846d1" (UID: "3abd79d4-bbcb-4d1f-a015-8a385d7846d1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
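Both "Killing container with a grace period ... gracePeriod=2" entries follow "SyncLoop DELETE" source="api" events: the kubelet is honoring the effective grace period that arrived with the API deletion. For reference, a client-go sketch that issues the same kind of graceful delete (namespace and pod name taken from the log; the kubeconfig path is an assumption for running outside the cluster):

```go
package main

import (
	"context"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumes a kubeconfig at the conventional location (~/.kube/config).
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	clientset, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}

	// A 2-second grace period, matching gracePeriod=2 in the entries above.
	grace := int64(2)
	err = clientset.CoreV1().Pods("openshift-marketplace").Delete(
		context.TODO(),
		"community-operators-srvcg",
		metav1.DeleteOptions{GracePeriodSeconds: &grace},
	)
	if err != nil {
		panic(err)
	}
}
```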
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:06:28 crc kubenswrapper[4922]: I1122 03:06:28.432302 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3abd79d4-bbcb-4d1f-a015-8a385d7846d1-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 03:06:28 crc kubenswrapper[4922]: I1122 03:06:28.432351 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3abd79d4-bbcb-4d1f-a015-8a385d7846d1-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 03:06:28 crc kubenswrapper[4922]: I1122 03:06:28.432373 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqnq7\" (UniqueName: \"kubernetes.io/projected/3abd79d4-bbcb-4d1f-a015-8a385d7846d1-kube-api-access-kqnq7\") on node \"crc\" DevicePath \"\"" Nov 22 03:06:28 crc kubenswrapper[4922]: I1122 03:06:28.472807 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-srvcg" Nov 22 03:06:28 crc kubenswrapper[4922]: I1122 03:06:28.634977 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8986cde2-d986-4da6-b75a-04fa39a5c4e8-catalog-content\") pod \"8986cde2-d986-4da6-b75a-04fa39a5c4e8\" (UID: \"8986cde2-d986-4da6-b75a-04fa39a5c4e8\") " Nov 22 03:06:28 crc kubenswrapper[4922]: I1122 03:06:28.635673 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8986cde2-d986-4da6-b75a-04fa39a5c4e8-utilities\") pod \"8986cde2-d986-4da6-b75a-04fa39a5c4e8\" (UID: \"8986cde2-d986-4da6-b75a-04fa39a5c4e8\") " Nov 22 03:06:28 crc kubenswrapper[4922]: I1122 03:06:28.635725 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5jv8\" (UniqueName: \"kubernetes.io/projected/8986cde2-d986-4da6-b75a-04fa39a5c4e8-kube-api-access-w5jv8\") pod \"8986cde2-d986-4da6-b75a-04fa39a5c4e8\" (UID: \"8986cde2-d986-4da6-b75a-04fa39a5c4e8\") " Nov 22 03:06:28 crc kubenswrapper[4922]: I1122 03:06:28.637214 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8986cde2-d986-4da6-b75a-04fa39a5c4e8-utilities" (OuterVolumeSpecName: "utilities") pod "8986cde2-d986-4da6-b75a-04fa39a5c4e8" (UID: "8986cde2-d986-4da6-b75a-04fa39a5c4e8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:06:28 crc kubenswrapper[4922]: I1122 03:06:28.640792 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8986cde2-d986-4da6-b75a-04fa39a5c4e8-kube-api-access-w5jv8" (OuterVolumeSpecName: "kube-api-access-w5jv8") pod "8986cde2-d986-4da6-b75a-04fa39a5c4e8" (UID: "8986cde2-d986-4da6-b75a-04fa39a5c4e8"). InnerVolumeSpecName "kube-api-access-w5jv8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:06:28 crc kubenswrapper[4922]: I1122 03:06:28.716659 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8986cde2-d986-4da6-b75a-04fa39a5c4e8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8986cde2-d986-4da6-b75a-04fa39a5c4e8" (UID: "8986cde2-d986-4da6-b75a-04fa39a5c4e8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:06:28 crc kubenswrapper[4922]: I1122 03:06:28.737665 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8986cde2-d986-4da6-b75a-04fa39a5c4e8-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 03:06:28 crc kubenswrapper[4922]: I1122 03:06:28.737742 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5jv8\" (UniqueName: \"kubernetes.io/projected/8986cde2-d986-4da6-b75a-04fa39a5c4e8-kube-api-access-w5jv8\") on node \"crc\" DevicePath \"\"" Nov 22 03:06:28 crc kubenswrapper[4922]: I1122 03:06:28.737765 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8986cde2-d986-4da6-b75a-04fa39a5c4e8-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 03:06:28 crc kubenswrapper[4922]: I1122 03:06:28.750389 4922 generic.go:334] "Generic (PLEG): container finished" podID="8986cde2-d986-4da6-b75a-04fa39a5c4e8" containerID="8ff4b17773fa063e8670eee55a7519c35530df16c7ad926a95ce370d31858f91" exitCode=0 Nov 22 03:06:28 crc kubenswrapper[4922]: I1122 03:06:28.750562 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-srvcg" Nov 22 03:06:28 crc kubenswrapper[4922]: I1122 03:06:28.750485 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-srvcg" event={"ID":"8986cde2-d986-4da6-b75a-04fa39a5c4e8","Type":"ContainerDied","Data":"8ff4b17773fa063e8670eee55a7519c35530df16c7ad926a95ce370d31858f91"} Nov 22 03:06:28 crc kubenswrapper[4922]: I1122 03:06:28.750876 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-srvcg" event={"ID":"8986cde2-d986-4da6-b75a-04fa39a5c4e8","Type":"ContainerDied","Data":"eba42b294e0d23fbac84952f298caab1ea8f8713a22312b232c9473d99ddf59e"} Nov 22 03:06:28 crc kubenswrapper[4922]: I1122 03:06:28.750947 4922 scope.go:117] "RemoveContainer" containerID="8ff4b17773fa063e8670eee55a7519c35530df16c7ad926a95ce370d31858f91" Nov 22 03:06:28 crc kubenswrapper[4922]: I1122 03:06:28.755127 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mnpjh" event={"ID":"3abd79d4-bbcb-4d1f-a015-8a385d7846d1","Type":"ContainerDied","Data":"736fef2c3cd8869b82d126c67820e3b696306dbc57a68465ddcefd755dc20978"} Nov 22 03:06:28 crc kubenswrapper[4922]: I1122 03:06:28.755219 4922 util.go:48] "No ready sandbox for pod can be found. 
Nov 22 03:06:28 crc kubenswrapper[4922]: I1122 03:06:28.779938 4922 scope.go:117] "RemoveContainer" containerID="fe04469afa3d52665831b25413694c449f03a3e4b9d85a30b95ed65bc63d5589"
Nov 22 03:06:28 crc kubenswrapper[4922]: I1122 03:06:28.810644 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mnpjh"]
Nov 22 03:06:28 crc kubenswrapper[4922]: I1122 03:06:28.819767 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mnpjh"]
Nov 22 03:06:28 crc kubenswrapper[4922]: I1122 03:06:28.826915 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-srvcg"]
Nov 22 03:06:28 crc kubenswrapper[4922]: I1122 03:06:28.830796 4922 scope.go:117] "RemoveContainer" containerID="d1c08db8286819ef3969c393c726a9aaca12d9e0e3bc768df8fcf57ea0ffe173"
Nov 22 03:06:28 crc kubenswrapper[4922]: I1122 03:06:28.833127 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-srvcg"]
Nov 22 03:06:28 crc kubenswrapper[4922]: I1122 03:06:28.853838 4922 scope.go:117] "RemoveContainer" containerID="8ff4b17773fa063e8670eee55a7519c35530df16c7ad926a95ce370d31858f91"
Nov 22 03:06:28 crc kubenswrapper[4922]: E1122 03:06:28.854294 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ff4b17773fa063e8670eee55a7519c35530df16c7ad926a95ce370d31858f91\": container with ID starting with 8ff4b17773fa063e8670eee55a7519c35530df16c7ad926a95ce370d31858f91 not found: ID does not exist" containerID="8ff4b17773fa063e8670eee55a7519c35530df16c7ad926a95ce370d31858f91"
Nov 22 03:06:28 crc kubenswrapper[4922]: I1122 03:06:28.854331 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ff4b17773fa063e8670eee55a7519c35530df16c7ad926a95ce370d31858f91"} err="failed to get container status \"8ff4b17773fa063e8670eee55a7519c35530df16c7ad926a95ce370d31858f91\": rpc error: code = NotFound desc = could not find container \"8ff4b17773fa063e8670eee55a7519c35530df16c7ad926a95ce370d31858f91\": container with ID starting with 8ff4b17773fa063e8670eee55a7519c35530df16c7ad926a95ce370d31858f91 not found: ID does not exist"
Nov 22 03:06:28 crc kubenswrapper[4922]: I1122 03:06:28.854359 4922 scope.go:117] "RemoveContainer" containerID="fe04469afa3d52665831b25413694c449f03a3e4b9d85a30b95ed65bc63d5589"
Nov 22 03:06:28 crc kubenswrapper[4922]: E1122 03:06:28.855183 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe04469afa3d52665831b25413694c449f03a3e4b9d85a30b95ed65bc63d5589\": container with ID starting with fe04469afa3d52665831b25413694c449f03a3e4b9d85a30b95ed65bc63d5589 not found: ID does not exist" containerID="fe04469afa3d52665831b25413694c449f03a3e4b9d85a30b95ed65bc63d5589"
Nov 22 03:06:28 crc kubenswrapper[4922]: I1122 03:06:28.855205 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe04469afa3d52665831b25413694c449f03a3e4b9d85a30b95ed65bc63d5589"} err="failed to get container status \"fe04469afa3d52665831b25413694c449f03a3e4b9d85a30b95ed65bc63d5589\": rpc error: code = NotFound desc = could not find container \"fe04469afa3d52665831b25413694c449f03a3e4b9d85a30b95ed65bc63d5589\": container with ID starting with fe04469afa3d52665831b25413694c449f03a3e4b9d85a30b95ed65bc63d5589 not found: ID does not exist"
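The E-level "ContainerStatus from runtime service failed ... NotFound" lines here are a benign race: the containers were already removed, so the kubelet logs the error and moves on. A generic sketch of that tolerance pattern against a gRPC API such as CRI (the removeContainer stub is hypothetical; the codes.NotFound check is the point):

```go
package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// removeContainer stands in for a CRI RemoveContainer RPC; here it always
// reports NotFound, as the runtime did for the container IDs logged above.
func removeContainer(id string) error {
	return status.Errorf(codes.NotFound, "could not find container %q", id)
}

// cleanup treats NotFound as success: the container is already gone.
func cleanup(id string) error {
	if err := removeContainer(id); err != nil && status.Code(err) != codes.NotFound {
		return err
	}
	return nil
}

func main() {
	fmt.Println(cleanup("example-container-id")) // prints <nil>
}
```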
Nov 22 03:06:28 crc kubenswrapper[4922]: I1122 03:06:28.855218 4922 scope.go:117] "RemoveContainer" containerID="d1c08db8286819ef3969c393c726a9aaca12d9e0e3bc768df8fcf57ea0ffe173"
Nov 22 03:06:28 crc kubenswrapper[4922]: E1122 03:06:28.855723 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1c08db8286819ef3969c393c726a9aaca12d9e0e3bc768df8fcf57ea0ffe173\": container with ID starting with d1c08db8286819ef3969c393c726a9aaca12d9e0e3bc768df8fcf57ea0ffe173 not found: ID does not exist" containerID="d1c08db8286819ef3969c393c726a9aaca12d9e0e3bc768df8fcf57ea0ffe173"
Nov 22 03:06:28 crc kubenswrapper[4922]: I1122 03:06:28.855791 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1c08db8286819ef3969c393c726a9aaca12d9e0e3bc768df8fcf57ea0ffe173"} err="failed to get container status \"d1c08db8286819ef3969c393c726a9aaca12d9e0e3bc768df8fcf57ea0ffe173\": rpc error: code = NotFound desc = could not find container \"d1c08db8286819ef3969c393c726a9aaca12d9e0e3bc768df8fcf57ea0ffe173\": container with ID starting with d1c08db8286819ef3969c393c726a9aaca12d9e0e3bc768df8fcf57ea0ffe173 not found: ID does not exist"
Nov 22 03:06:28 crc kubenswrapper[4922]: I1122 03:06:28.855834 4922 scope.go:117] "RemoveContainer" containerID="3b37698f288bb4918ca6c221cc582abecde8673e50b09b034489fa1fc9c1f5ad"
Nov 22 03:06:28 crc kubenswrapper[4922]: I1122 03:06:28.874512 4922 scope.go:117] "RemoveContainer" containerID="a8340c91cd433852c130f0bf0d56fa6523bb6a9eb4661b3746c87824e5b0e417"
Nov 22 03:06:28 crc kubenswrapper[4922]: I1122 03:06:28.899947 4922 scope.go:117] "RemoveContainer" containerID="ef4b5bb5384defe75059dbd2e1c7d3d028bf8e81424abea1f7df5cb62daf2633"
Nov 22 03:06:29 crc kubenswrapper[4922]: I1122 03:06:29.315004 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3abd79d4-bbcb-4d1f-a015-8a385d7846d1" path="/var/lib/kubelet/pods/3abd79d4-bbcb-4d1f-a015-8a385d7846d1/volumes"
Nov 22 03:06:29 crc kubenswrapper[4922]: I1122 03:06:29.316281 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8986cde2-d986-4da6-b75a-04fa39a5c4e8" path="/var/lib/kubelet/pods/8986cde2-d986-4da6-b75a-04fa39a5c4e8/volumes"
Nov 22 03:06:54 crc kubenswrapper[4922]: I1122 03:06:54.855783 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5bfbbb859d-4lfxl"]
Nov 22 03:06:54 crc kubenswrapper[4922]: E1122 03:06:54.856784 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3abd79d4-bbcb-4d1f-a015-8a385d7846d1" containerName="extract-utilities"
Nov 22 03:06:54 crc kubenswrapper[4922]: I1122 03:06:54.856799 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="3abd79d4-bbcb-4d1f-a015-8a385d7846d1" containerName="extract-utilities"
Nov 22 03:06:54 crc kubenswrapper[4922]: E1122 03:06:54.856815 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8986cde2-d986-4da6-b75a-04fa39a5c4e8" containerName="registry-server"
Nov 22 03:06:54 crc kubenswrapper[4922]: I1122 03:06:54.856823 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="8986cde2-d986-4da6-b75a-04fa39a5c4e8" containerName="registry-server"
Nov 22 03:06:54 crc kubenswrapper[4922]: E1122 03:06:54.856838 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8986cde2-d986-4da6-b75a-04fa39a5c4e8" containerName="extract-utilities"
containerName="extract-utilities" Nov 22 03:06:54 crc kubenswrapper[4922]: I1122 03:06:54.856866 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="8986cde2-d986-4da6-b75a-04fa39a5c4e8" containerName="extract-utilities" Nov 22 03:06:54 crc kubenswrapper[4922]: E1122 03:06:54.856883 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3abd79d4-bbcb-4d1f-a015-8a385d7846d1" containerName="registry-server" Nov 22 03:06:54 crc kubenswrapper[4922]: I1122 03:06:54.856891 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="3abd79d4-bbcb-4d1f-a015-8a385d7846d1" containerName="registry-server" Nov 22 03:06:54 crc kubenswrapper[4922]: E1122 03:06:54.856901 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3abd79d4-bbcb-4d1f-a015-8a385d7846d1" containerName="extract-content" Nov 22 03:06:54 crc kubenswrapper[4922]: I1122 03:06:54.856910 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="3abd79d4-bbcb-4d1f-a015-8a385d7846d1" containerName="extract-content" Nov 22 03:06:54 crc kubenswrapper[4922]: E1122 03:06:54.856920 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8986cde2-d986-4da6-b75a-04fa39a5c4e8" containerName="extract-content" Nov 22 03:06:54 crc kubenswrapper[4922]: I1122 03:06:54.856928 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="8986cde2-d986-4da6-b75a-04fa39a5c4e8" containerName="extract-content" Nov 22 03:06:54 crc kubenswrapper[4922]: I1122 03:06:54.857058 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="3abd79d4-bbcb-4d1f-a015-8a385d7846d1" containerName="registry-server" Nov 22 03:06:54 crc kubenswrapper[4922]: I1122 03:06:54.857077 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="8986cde2-d986-4da6-b75a-04fa39a5c4e8" containerName="registry-server" Nov 22 03:06:54 crc kubenswrapper[4922]: I1122 03:06:54.860204 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-5bfbbb859d-4lfxl" Nov 22 03:06:54 crc kubenswrapper[4922]: I1122 03:06:54.865248 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-zzfp7" Nov 22 03:06:54 crc kubenswrapper[4922]: I1122 03:06:54.872144 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-864d88ccf8-vc6p7"] Nov 22 03:06:54 crc kubenswrapper[4922]: I1122 03:06:54.873698 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-864d88ccf8-vc6p7" Nov 22 03:06:54 crc kubenswrapper[4922]: I1122 03:06:54.880583 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6788cc6d75-dkfth"] Nov 22 03:06:54 crc kubenswrapper[4922]: I1122 03:06:54.884698 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-wgfpb" Nov 22 03:06:54 crc kubenswrapper[4922]: I1122 03:06:54.892776 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5bfbbb859d-4lfxl"] Nov 22 03:06:54 crc kubenswrapper[4922]: I1122 03:06:54.892910 4922 util.go:30] "No sandbox for pod can be found. 
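The cpu_manager/memory_manager "RemoveStaleState" burst above fires when the first pod ADD arrives roughly 25 seconds after the marketplace pods were removed: before admitting new containers, the resource managers drop per-container assignments belonging to pods that no longer exist. A generic sketch of that reconciliation pattern (types and data invented for illustration; not the kubelet's actual state structures):

```go
package main

import "fmt"

type key struct{ podUID, container string }

// removeStaleState drops assignments whose pod UID is no longer active,
// in the spirit of the RemoveStaleState entries above.
func removeStaleState(assignments map[key][]int, activePods map[string]bool) {
	for k := range assignments { // deleting during range is safe in Go
		if !activePods[k.podUID] {
			fmt.Printf("RemoveStaleState: removing container %q of pod %q\n", k.container, k.podUID)
			delete(assignments, k)
		}
	}
}

func main() {
	assignments := map[key][]int{
		{"3abd79d4-bbcb-4d1f-a015-8a385d7846d1", "registry-server"}: {2, 3},
	}
	removeStaleState(assignments, map[string]bool{}) // no pods active anymore
	fmt.Println("remaining assignments:", len(assignments))
}
```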
Nov 22 03:06:54 crc kubenswrapper[4922]: I1122 03:06:54.902225 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-9bm8l"
Nov 22 03:06:54 crc kubenswrapper[4922]: I1122 03:06:54.917699 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-864d88ccf8-vc6p7"]
Nov 22 03:06:54 crc kubenswrapper[4922]: I1122 03:06:54.917787 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-6bd966bbd4-g9fbb"]
Nov 22 03:06:54 crc kubenswrapper[4922]: I1122 03:06:54.927901 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6788cc6d75-dkfth"]
Nov 22 03:06:54 crc kubenswrapper[4922]: I1122 03:06:54.928017 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-6bd966bbd4-g9fbb"
Nov 22 03:06:54 crc kubenswrapper[4922]: I1122 03:06:54.939001 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-5nxc7"
Nov 22 03:06:54 crc kubenswrapper[4922]: I1122 03:06:54.958367 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-6bd966bbd4-g9fbb"]
Nov 22 03:06:54 crc kubenswrapper[4922]: I1122 03:06:54.966779 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-698d6fd7d6-rxzs6"]
Nov 22 03:06:54 crc kubenswrapper[4922]: I1122 03:06:54.967797 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-698d6fd7d6-rxzs6"
Nov 22 03:06:54 crc kubenswrapper[4922]: I1122 03:06:54.976957 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-698d6fd7d6-rxzs6"]
Nov 22 03:06:54 crc kubenswrapper[4922]: I1122 03:06:54.987146 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-7d5d9fd47f-v6hmr"]
Nov 22 03:06:54 crc kubenswrapper[4922]: I1122 03:06:54.988175 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-7d5d9fd47f-v6hmr"
Nov 22 03:06:54 crc kubenswrapper[4922]: I1122 03:06:54.990708 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-kkz49"
Nov 22 03:06:54 crc kubenswrapper[4922]: I1122 03:06:54.991536 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-n7p7t"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.006293 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9t2n\" (UniqueName: \"kubernetes.io/projected/181257ea-c4b9-4370-80b5-7f52ed557c33-kube-api-access-x9t2n\") pod \"barbican-operator-controller-manager-5bfbbb859d-4lfxl\" (UID: \"181257ea-c4b9-4370-80b5-7f52ed557c33\") " pod="openstack-operators/barbican-operator-controller-manager-5bfbbb859d-4lfxl"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.006391 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6bw4\" (UniqueName: \"kubernetes.io/projected/8b206938-2e76-40b1-b39c-ff333430e8f6-kube-api-access-b6bw4\") pod \"cinder-operator-controller-manager-864d88ccf8-vc6p7\" (UID: \"8b206938-2e76-40b1-b39c-ff333430e8f6\") " pod="openstack-operators/cinder-operator-controller-manager-864d88ccf8-vc6p7"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.006448 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7dnv\" (UniqueName: \"kubernetes.io/projected/5db39a57-6021-466f-84e0-1fc2e8cf0da9-kube-api-access-m7dnv\") pod \"designate-operator-controller-manager-6788cc6d75-dkfth\" (UID: \"5db39a57-6021-466f-84e0-1fc2e8cf0da9\") " pod="openstack-operators/designate-operator-controller-manager-6788cc6d75-dkfth"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.009200 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-6c55d8d69b-twhcc"]
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.010696 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-6c55d8d69b-twhcc"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.014666 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.014941 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-hrkjw"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.020726 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-7d5d9fd47f-v6hmr"]
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.035566 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-54485f899-vg7qw"]
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.039520 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-54485f899-vg7qw"
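The reflector.go:368 "Caches populated for *v1.Secret" entries show the kubelet's reflectors list-watching the Secrets (dockercfg pull secrets, webhook certs) that each newly added pod references. The same list-watch/cache machinery is exposed through client-go informers; a minimal sketch of watching those Secrets from an external client (kubeconfig path and the 10-minute resync interval are assumptions, not values from this log):

```go
package main

import (
	"context"
	"fmt"
	"time"

	corev1 "k8s.io/api/core/v1"
	"k8s.io/client-go/informers"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/cache"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	clientset := kubernetes.NewForConfigOrDie(cfg)

	// A shared informer scoped to the namespace seen in the log.
	factory := informers.NewSharedInformerFactoryWithOptions(
		clientset, 10*time.Minute, informers.WithNamespace("openstack-operators"))
	secrets := factory.Core().V1().Secrets().Informer()
	secrets.AddEventHandler(cache.ResourceEventHandlerFuncs{
		AddFunc: func(obj interface{}) {
			fmt.Println("cache populated for secret:", obj.(*corev1.Secret).Name)
		},
	})

	ctx, cancel := context.WithTimeout(context.Background(), 30*time.Second)
	defer cancel()
	factory.Start(ctx.Done())
	cache.WaitForCacheSync(ctx.Done(), secrets.HasSynced)
}
```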
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.044691 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-td29t"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.053006 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7d6f5d799-5szbr"]
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.054091 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7d6f5d799-5szbr"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.057561 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-9lq5d"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.101087 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-646fd589f9-f7bz2"]
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.112789 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2t79k\" (UniqueName: \"kubernetes.io/projected/1cc9e6ed-fd1a-4280-8867-c8fbd326ca14-kube-api-access-2t79k\") pod \"glance-operator-controller-manager-6bd966bbd4-g9fbb\" (UID: \"1cc9e6ed-fd1a-4280-8867-c8fbd326ca14\") " pod="openstack-operators/glance-operator-controller-manager-6bd966bbd4-g9fbb"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.112888 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9t2n\" (UniqueName: \"kubernetes.io/projected/181257ea-c4b9-4370-80b5-7f52ed557c33-kube-api-access-x9t2n\") pod \"barbican-operator-controller-manager-5bfbbb859d-4lfxl\" (UID: \"181257ea-c4b9-4370-80b5-7f52ed557c33\") " pod="openstack-operators/barbican-operator-controller-manager-5bfbbb859d-4lfxl"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.112924 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z296b\" (UniqueName: \"kubernetes.io/projected/ec6a579d-cf65-4b02-a891-ca17161e6585-kube-api-access-z296b\") pod \"infra-operator-controller-manager-6c55d8d69b-twhcc\" (UID: \"ec6a579d-cf65-4b02-a891-ca17161e6585\") " pod="openstack-operators/infra-operator-controller-manager-6c55d8d69b-twhcc"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.112974 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6bw4\" (UniqueName: \"kubernetes.io/projected/8b206938-2e76-40b1-b39c-ff333430e8f6-kube-api-access-b6bw4\") pod \"cinder-operator-controller-manager-864d88ccf8-vc6p7\" (UID: \"8b206938-2e76-40b1-b39c-ff333430e8f6\") " pod="openstack-operators/cinder-operator-controller-manager-864d88ccf8-vc6p7"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.113087 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxg7t\" (UniqueName: \"kubernetes.io/projected/3e52f719-cfcb-48d8-a83f-1bcddb08e6bd-kube-api-access-rxg7t\") pod \"heat-operator-controller-manager-698d6fd7d6-rxzs6\" (UID: \"3e52f719-cfcb-48d8-a83f-1bcddb08e6bd\") " pod="openstack-operators/heat-operator-controller-manager-698d6fd7d6-rxzs6"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.113139 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7dnv\" (UniqueName: \"kubernetes.io/projected/5db39a57-6021-466f-84e0-1fc2e8cf0da9-kube-api-access-m7dnv\") pod \"designate-operator-controller-manager-6788cc6d75-dkfth\" (UID: \"5db39a57-6021-466f-84e0-1fc2e8cf0da9\") " pod="openstack-operators/designate-operator-controller-manager-6788cc6d75-dkfth"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.113198 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec6a579d-cf65-4b02-a891-ca17161e6585-cert\") pod \"infra-operator-controller-manager-6c55d8d69b-twhcc\" (UID: \"ec6a579d-cf65-4b02-a891-ca17161e6585\") " pod="openstack-operators/infra-operator-controller-manager-6c55d8d69b-twhcc"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.113282 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jr9b8\" (UniqueName: \"kubernetes.io/projected/8262c40a-af33-42fb-9347-e5d84f97a20d-kube-api-access-jr9b8\") pod \"horizon-operator-controller-manager-7d5d9fd47f-v6hmr\" (UID: \"8262c40a-af33-42fb-9347-e5d84f97a20d\") " pod="openstack-operators/horizon-operator-controller-manager-7d5d9fd47f-v6hmr"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.152639 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-646fd589f9-f7bz2"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.175998 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-6c55d8d69b-twhcc"]
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.176573 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-nm4km"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.177203 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7dnv\" (UniqueName: \"kubernetes.io/projected/5db39a57-6021-466f-84e0-1fc2e8cf0da9-kube-api-access-m7dnv\") pod \"designate-operator-controller-manager-6788cc6d75-dkfth\" (UID: \"5db39a57-6021-466f-84e0-1fc2e8cf0da9\") " pod="openstack-operators/designate-operator-controller-manager-6788cc6d75-dkfth"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.181223 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-54485f899-vg7qw"]
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.186117 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6bw4\" (UniqueName: \"kubernetes.io/projected/8b206938-2e76-40b1-b39c-ff333430e8f6-kube-api-access-b6bw4\") pod \"cinder-operator-controller-manager-864d88ccf8-vc6p7\" (UID: \"8b206938-2e76-40b1-b39c-ff333430e8f6\") " pod="openstack-operators/cinder-operator-controller-manager-864d88ccf8-vc6p7"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.191677 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9t2n\" (UniqueName: \"kubernetes.io/projected/181257ea-c4b9-4370-80b5-7f52ed557c33-kube-api-access-x9t2n\") pod \"barbican-operator-controller-manager-5bfbbb859d-4lfxl\" (UID: \"181257ea-c4b9-4370-80b5-7f52ed557c33\") " pod="openstack-operators/barbican-operator-controller-manager-5bfbbb859d-4lfxl"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.191746 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-64d7c556cd-h7jpp"]
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.193118 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-h7jpp"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.199186 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-25466"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.204426 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-864d88ccf8-vc6p7"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.214368 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftz4p\" (UniqueName: \"kubernetes.io/projected/ddb44566-f024-43bd-8bc8-7b497606baa7-kube-api-access-ftz4p\") pod \"manila-operator-controller-manager-646fd589f9-f7bz2\" (UID: \"ddb44566-f024-43bd-8bc8-7b497606baa7\") " pod="openstack-operators/manila-operator-controller-manager-646fd589f9-f7bz2"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.214423 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec6a579d-cf65-4b02-a891-ca17161e6585-cert\") pod \"infra-operator-controller-manager-6c55d8d69b-twhcc\" (UID: \"ec6a579d-cf65-4b02-a891-ca17161e6585\") " pod="openstack-operators/infra-operator-controller-manager-6c55d8d69b-twhcc"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.214460 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jr9b8\" (UniqueName: \"kubernetes.io/projected/8262c40a-af33-42fb-9347-e5d84f97a20d-kube-api-access-jr9b8\") pod \"horizon-operator-controller-manager-7d5d9fd47f-v6hmr\" (UID: \"8262c40a-af33-42fb-9347-e5d84f97a20d\") " pod="openstack-operators/horizon-operator-controller-manager-7d5d9fd47f-v6hmr"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.214500 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2t79k\" (UniqueName: \"kubernetes.io/projected/1cc9e6ed-fd1a-4280-8867-c8fbd326ca14-kube-api-access-2t79k\") pod \"glance-operator-controller-manager-6bd966bbd4-g9fbb\" (UID: \"1cc9e6ed-fd1a-4280-8867-c8fbd326ca14\") " pod="openstack-operators/glance-operator-controller-manager-6bd966bbd4-g9fbb"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.214524 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z296b\" (UniqueName: \"kubernetes.io/projected/ec6a579d-cf65-4b02-a891-ca17161e6585-kube-api-access-z296b\") pod \"infra-operator-controller-manager-6c55d8d69b-twhcc\" (UID: \"ec6a579d-cf65-4b02-a891-ca17161e6585\") " pod="openstack-operators/infra-operator-controller-manager-6c55d8d69b-twhcc"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.214546 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zzqn\" (UniqueName: \"kubernetes.io/projected/4bc61f3e-c538-4a90-84da-7cd4760621f1-kube-api-access-4zzqn\") pod \"keystone-operator-controller-manager-7d6f5d799-5szbr\" (UID: \"4bc61f3e-c538-4a90-84da-7cd4760621f1\") " pod="openstack-operators/keystone-operator-controller-manager-7d6f5d799-5szbr"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.214570 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rrht\" (UniqueName: \"kubernetes.io/projected/0f9c4cd6-8ab3-4895-ab12-74dce3828cf8-kube-api-access-8rrht\") pod \"ironic-operator-controller-manager-54485f899-vg7qw\" (UID: \"0f9c4cd6-8ab3-4895-ab12-74dce3828cf8\") " pod="openstack-operators/ironic-operator-controller-manager-54485f899-vg7qw"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.214607 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxg7t\" (UniqueName: \"kubernetes.io/projected/3e52f719-cfcb-48d8-a83f-1bcddb08e6bd-kube-api-access-rxg7t\") pod \"heat-operator-controller-manager-698d6fd7d6-rxzs6\" (UID: \"3e52f719-cfcb-48d8-a83f-1bcddb08e6bd\") " pod="openstack-operators/heat-operator-controller-manager-698d6fd7d6-rxzs6"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.229473 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6788cc6d75-dkfth"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.239271 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec6a579d-cf65-4b02-a891-ca17161e6585-cert\") pod \"infra-operator-controller-manager-6c55d8d69b-twhcc\" (UID: \"ec6a579d-cf65-4b02-a891-ca17161e6585\") " pod="openstack-operators/infra-operator-controller-manager-6c55d8d69b-twhcc"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.240483 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7d6f5d799-5szbr"]
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.250494 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxg7t\" (UniqueName: \"kubernetes.io/projected/3e52f719-cfcb-48d8-a83f-1bcddb08e6bd-kube-api-access-rxg7t\") pod \"heat-operator-controller-manager-698d6fd7d6-rxzs6\" (UID: \"3e52f719-cfcb-48d8-a83f-1bcddb08e6bd\") " pod="openstack-operators/heat-operator-controller-manager-698d6fd7d6-rxzs6"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.250568 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-646fd589f9-f7bz2"]
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.259979 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-64d7c556cd-h7jpp"]
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.272512 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2t79k\" (UniqueName: \"kubernetes.io/projected/1cc9e6ed-fd1a-4280-8867-c8fbd326ca14-kube-api-access-2t79k\") pod \"glance-operator-controller-manager-6bd966bbd4-g9fbb\" (UID: \"1cc9e6ed-fd1a-4280-8867-c8fbd326ca14\") " pod="openstack-operators/glance-operator-controller-manager-6bd966bbd4-g9fbb"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.273394 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z296b\" (UniqueName: \"kubernetes.io/projected/ec6a579d-cf65-4b02-a891-ca17161e6585-kube-api-access-z296b\") pod \"infra-operator-controller-manager-6c55d8d69b-twhcc\" (UID: \"ec6a579d-cf65-4b02-a891-ca17161e6585\") " pod="openstack-operators/infra-operator-controller-manager-6c55d8d69b-twhcc"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.294423 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jr9b8\" (UniqueName: \"kubernetes.io/projected/8262c40a-af33-42fb-9347-e5d84f97a20d-kube-api-access-jr9b8\") pod \"horizon-operator-controller-manager-7d5d9fd47f-v6hmr\" (UID: \"8262c40a-af33-42fb-9347-e5d84f97a20d\") " pod="openstack-operators/horizon-operator-controller-manager-7d5d9fd47f-v6hmr"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.297967 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-79d658b66d-2ldtk"]
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.303288 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-79d658b66d-2ldtk"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.310973 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-698d6fd7d6-rxzs6"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.325279 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-4857c"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.327222 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftz4p\" (UniqueName: \"kubernetes.io/projected/ddb44566-f024-43bd-8bc8-7b497606baa7-kube-api-access-ftz4p\") pod \"manila-operator-controller-manager-646fd589f9-f7bz2\" (UID: \"ddb44566-f024-43bd-8bc8-7b497606baa7\") " pod="openstack-operators/manila-operator-controller-manager-646fd589f9-f7bz2"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.327312 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwg56\" (UniqueName: \"kubernetes.io/projected/459b1df7-6ed9-4ef4-bb71-aa7e82001d5a-kube-api-access-hwg56\") pod \"mariadb-operator-controller-manager-64d7c556cd-h7jpp\" (UID: \"459b1df7-6ed9-4ef4-bb71-aa7e82001d5a\") " pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-h7jpp"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.327394 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zzqn\" (UniqueName: \"kubernetes.io/projected/4bc61f3e-c538-4a90-84da-7cd4760621f1-kube-api-access-4zzqn\") pod \"keystone-operator-controller-manager-7d6f5d799-5szbr\" (UID: \"4bc61f3e-c538-4a90-84da-7cd4760621f1\") " pod="openstack-operators/keystone-operator-controller-manager-7d6f5d799-5szbr"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.327421 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rrht\" (UniqueName: \"kubernetes.io/projected/0f9c4cd6-8ab3-4895-ab12-74dce3828cf8-kube-api-access-8rrht\") pod \"ironic-operator-controller-manager-54485f899-vg7qw\" (UID: \"0f9c4cd6-8ab3-4895-ab12-74dce3828cf8\") " pod="openstack-operators/ironic-operator-controller-manager-54485f899-vg7qw"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.331825 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6b6c55ffd5-nlwm2"]
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.333412 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6b6c55ffd5-nlwm2"
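Every kube-api-access-* volume being verified and mounted in these entries is the projected service-account volume each pod receives; inside the running container it surfaces as the standard in-cluster files. A small sketch that reads them from within a pod (the paths are the conventional in-cluster ones, not values taken from this log):

```go
package main

import (
	"fmt"
	"os"
)

func main() {
	// Standard mount point of the projected kube-api-access volume.
	base := "/var/run/secrets/kubernetes.io/serviceaccount"
	for _, f := range []string{"token", "ca.crt", "namespace"} {
		b, err := os.ReadFile(base + "/" + f)
		if err != nil {
			fmt.Println(f, "->", err) // not running inside a pod
			continue
		}
		fmt.Println(f, "->", len(b), "bytes")
	}
}
```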
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.335497 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-sspwl"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.340820 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-7d5d9fd47f-v6hmr"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.347095 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-79d658b66d-2ldtk"]
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.349368 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftz4p\" (UniqueName: \"kubernetes.io/projected/ddb44566-f024-43bd-8bc8-7b497606baa7-kube-api-access-ftz4p\") pod \"manila-operator-controller-manager-646fd589f9-f7bz2\" (UID: \"ddb44566-f024-43bd-8bc8-7b497606baa7\") " pod="openstack-operators/manila-operator-controller-manager-646fd589f9-f7bz2"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.353346 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6b6c55ffd5-nlwm2"]
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.354523 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rrht\" (UniqueName: \"kubernetes.io/projected/0f9c4cd6-8ab3-4895-ab12-74dce3828cf8-kube-api-access-8rrht\") pod \"ironic-operator-controller-manager-54485f899-vg7qw\" (UID: \"0f9c4cd6-8ab3-4895-ab12-74dce3828cf8\") " pod="openstack-operators/ironic-operator-controller-manager-54485f899-vg7qw"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.354541 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zzqn\" (UniqueName: \"kubernetes.io/projected/4bc61f3e-c538-4a90-84da-7cd4760621f1-kube-api-access-4zzqn\") pod \"keystone-operator-controller-manager-7d6f5d799-5szbr\" (UID: \"4bc61f3e-c538-4a90-84da-7cd4760621f1\") " pod="openstack-operators/keystone-operator-controller-manager-7d6f5d799-5szbr"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.358755 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-6c55d8d69b-twhcc"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.393878 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-54485f899-vg7qw"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.394496 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7979c68bc7-7fxg5"]
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.396923 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7979c68bc7-7fxg5"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.399189 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7979c68bc7-7fxg5"]
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.399497 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-djpcq"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.422495 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-dsmmt"]
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.424277 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-dsmmt"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.427234 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-nqqbd"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.428743 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.428804 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwg56\" (UniqueName: \"kubernetes.io/projected/459b1df7-6ed9-4ef4-bb71-aa7e82001d5a-kube-api-access-hwg56\") pod \"mariadb-operator-controller-manager-64d7c556cd-h7jpp\" (UID: \"459b1df7-6ed9-4ef4-bb71-aa7e82001d5a\") " pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-h7jpp"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.428867 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqsg4\" (UniqueName: \"kubernetes.io/projected/cf5db45c-dbc8-4774-ad28-dc4aaa24b9fd-kube-api-access-pqsg4\") pod \"nova-operator-controller-manager-79d658b66d-2ldtk\" (UID: \"cf5db45c-dbc8-4774-ad28-dc4aaa24b9fd\") " pod="openstack-operators/nova-operator-controller-manager-79d658b66d-2ldtk"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.428898 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bb87h\" (UniqueName: \"kubernetes.io/projected/c4755f19-1e55-41bb-be1e-b4b868c48cc1-kube-api-access-bb87h\") pod \"neutron-operator-controller-manager-6b6c55ffd5-nlwm2\" (UID: \"c4755f19-1e55-41bb-be1e-b4b868c48cc1\") " pod="openstack-operators/neutron-operator-controller-manager-6b6c55ffd5-nlwm2"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.444931 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-dsmmt"]
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.459193 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7d6f5d799-5szbr"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.471043 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-vkmzq"]
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.482559 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-vkmzq"]
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.482713 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-vkmzq"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.490585 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-whq5x"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.492047 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-5bfbbb859d-4lfxl"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.493864 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-867d87977b-5fntg"]
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.495236 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-867d87977b-5fntg"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.498374 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-thzpg"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.499008 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-cc9f5bc5c-h7svq"]
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.499956 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-cc9f5bc5c-h7svq"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.499976 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwg56\" (UniqueName: \"kubernetes.io/projected/459b1df7-6ed9-4ef4-bb71-aa7e82001d5a-kube-api-access-hwg56\") pod \"mariadb-operator-controller-manager-64d7c556cd-h7jpp\" (UID: \"459b1df7-6ed9-4ef4-bb71-aa7e82001d5a\") " pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-h7jpp"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.501508 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-xttlz"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.509392 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-867d87977b-5fntg"]
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.524216 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-cc9f5bc5c-h7svq"]
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.530377 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nwtd\" (UniqueName: \"kubernetes.io/projected/4134b0be-83c2-452c-a09e-6a699543d2c0-kube-api-access-6nwtd\") pod \"openstack-baremetal-operator-controller-manager-77868f484-dsmmt\" (UID: \"4134b0be-83c2-452c-a09e-6a699543d2c0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-dsmmt"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.530440 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqsg4\" (UniqueName: \"kubernetes.io/projected/cf5db45c-dbc8-4774-ad28-dc4aaa24b9fd-kube-api-access-pqsg4\") pod \"nova-operator-controller-manager-79d658b66d-2ldtk\" (UID: \"cf5db45c-dbc8-4774-ad28-dc4aaa24b9fd\") " pod="openstack-operators/nova-operator-controller-manager-79d658b66d-2ldtk"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.539533 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bb87h\" (UniqueName: \"kubernetes.io/projected/c4755f19-1e55-41bb-be1e-b4b868c48cc1-kube-api-access-bb87h\") pod \"neutron-operator-controller-manager-6b6c55ffd5-nlwm2\" (UID: \"c4755f19-1e55-41bb-be1e-b4b868c48cc1\") " pod="openstack-operators/neutron-operator-controller-manager-6b6c55ffd5-nlwm2"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.539682 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27j7c\" (UniqueName: \"kubernetes.io/projected/225f6b3a-93d7-46d9-99a1-d9787b4921fb-kube-api-access-27j7c\") pod \"octavia-operator-controller-manager-7979c68bc7-7fxg5\" (UID: \"225f6b3a-93d7-46d9-99a1-d9787b4921fb\") " pod="openstack-operators/octavia-operator-controller-manager-7979c68bc7-7fxg5"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.539702 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4134b0be-83c2-452c-a09e-6a699543d2c0-cert\") pod \"openstack-baremetal-operator-controller-manager-77868f484-dsmmt\" (UID: \"4134b0be-83c2-452c-a09e-6a699543d2c0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-dsmmt"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.550584 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-6bd966bbd4-g9fbb"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.567767 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-58487d9bf4-tg4ph"]
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.569553 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-58487d9bf4-tg4ph"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.573708 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-w6fpc"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.574227 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-58487d9bf4-tg4ph"]
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.582167 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-77db6bf9c-hn2fj"]
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.582456 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bb87h\" (UniqueName: \"kubernetes.io/projected/c4755f19-1e55-41bb-be1e-b4b868c48cc1-kube-api-access-bb87h\") pod \"neutron-operator-controller-manager-6b6c55ffd5-nlwm2\" (UID: \"c4755f19-1e55-41bb-be1e-b4b868c48cc1\") " pod="openstack-operators/neutron-operator-controller-manager-6b6c55ffd5-nlwm2"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.585168 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-77db6bf9c-hn2fj"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.587579 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqsg4\" (UniqueName: \"kubernetes.io/projected/cf5db45c-dbc8-4774-ad28-dc4aaa24b9fd-kube-api-access-pqsg4\") pod \"nova-operator-controller-manager-79d658b66d-2ldtk\" (UID: \"cf5db45c-dbc8-4774-ad28-dc4aaa24b9fd\") " pod="openstack-operators/nova-operator-controller-manager-79d658b66d-2ldtk"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.588301 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-ns4sd"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.617278 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-646fd589f9-f7bz2"
Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.618533 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-h7jpp"
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-h7jpp" Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.644262 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlkkn\" (UniqueName: \"kubernetes.io/projected/562a15b4-c659-490c-88a4-1db388e0224f-kube-api-access-hlkkn\") pod \"placement-operator-controller-manager-867d87977b-5fntg\" (UID: \"562a15b4-c659-490c-88a4-1db388e0224f\") " pod="openstack-operators/placement-operator-controller-manager-867d87977b-5fntg" Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.644434 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nwtd\" (UniqueName: \"kubernetes.io/projected/4134b0be-83c2-452c-a09e-6a699543d2c0-kube-api-access-6nwtd\") pod \"openstack-baremetal-operator-controller-manager-77868f484-dsmmt\" (UID: \"4134b0be-83c2-452c-a09e-6a699543d2c0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-dsmmt" Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.644473 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4q825\" (UniqueName: \"kubernetes.io/projected/301d0455-5622-4810-847e-b354cf6f9c00-kube-api-access-4q825\") pod \"ovn-operator-controller-manager-5b67cfc8fb-vkmzq\" (UID: \"301d0455-5622-4810-847e-b354cf6f9c00\") " pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-vkmzq" Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.644510 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qd46c\" (UniqueName: \"kubernetes.io/projected/ef5e108a-748a-47ab-b0fc-0a3e303a09ba-kube-api-access-qd46c\") pod \"telemetry-operator-controller-manager-58487d9bf4-tg4ph\" (UID: \"ef5e108a-748a-47ab-b0fc-0a3e303a09ba\") " pod="openstack-operators/telemetry-operator-controller-manager-58487d9bf4-tg4ph" Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.644713 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kwj5\" (UniqueName: \"kubernetes.io/projected/e92bf172-fec5-4847-9ba1-9e3ddc58c7c3-kube-api-access-5kwj5\") pod \"swift-operator-controller-manager-cc9f5bc5c-h7svq\" (UID: \"e92bf172-fec5-4847-9ba1-9e3ddc58c7c3\") " pod="openstack-operators/swift-operator-controller-manager-cc9f5bc5c-h7svq" Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.644883 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27j7c\" (UniqueName: \"kubernetes.io/projected/225f6b3a-93d7-46d9-99a1-d9787b4921fb-kube-api-access-27j7c\") pod \"octavia-operator-controller-manager-7979c68bc7-7fxg5\" (UID: \"225f6b3a-93d7-46d9-99a1-d9787b4921fb\") " pod="openstack-operators/octavia-operator-controller-manager-7979c68bc7-7fxg5" Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.644992 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4134b0be-83c2-452c-a09e-6a699543d2c0-cert\") pod \"openstack-baremetal-operator-controller-manager-77868f484-dsmmt\" (UID: \"4134b0be-83c2-452c-a09e-6a699543d2c0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-dsmmt" Nov 22 03:06:55 crc kubenswrapper[4922]: E1122 03:06:55.645312 4922 secret.go:188] Couldn't get secret 
openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 22 03:06:55 crc kubenswrapper[4922]: E1122 03:06:55.645910 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4134b0be-83c2-452c-a09e-6a699543d2c0-cert podName:4134b0be-83c2-452c-a09e-6a699543d2c0 nodeName:}" failed. No retries permitted until 2025-11-22 03:06:56.145883161 +0000 UTC m=+852.184405043 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4134b0be-83c2-452c-a09e-6a699543d2c0-cert") pod "openstack-baremetal-operator-controller-manager-77868f484-dsmmt" (UID: "4134b0be-83c2-452c-a09e-6a699543d2c0") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.659062 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-79d658b66d-2ldtk" Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.660069 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6b6c55ffd5-nlwm2" Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.677948 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27j7c\" (UniqueName: \"kubernetes.io/projected/225f6b3a-93d7-46d9-99a1-d9787b4921fb-kube-api-access-27j7c\") pod \"octavia-operator-controller-manager-7979c68bc7-7fxg5\" (UID: \"225f6b3a-93d7-46d9-99a1-d9787b4921fb\") " pod="openstack-operators/octavia-operator-controller-manager-7979c68bc7-7fxg5" Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.682063 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nwtd\" (UniqueName: \"kubernetes.io/projected/4134b0be-83c2-452c-a09e-6a699543d2c0-kube-api-access-6nwtd\") pod \"openstack-baremetal-operator-controller-manager-77868f484-dsmmt\" (UID: \"4134b0be-83c2-452c-a09e-6a699543d2c0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-dsmmt" Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.729131 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-77db6bf9c-hn2fj"] Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.730151 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7979c68bc7-7fxg5" Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.746196 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kwj5\" (UniqueName: \"kubernetes.io/projected/e92bf172-fec5-4847-9ba1-9e3ddc58c7c3-kube-api-access-5kwj5\") pod \"swift-operator-controller-manager-cc9f5bc5c-h7svq\" (UID: \"e92bf172-fec5-4847-9ba1-9e3ddc58c7c3\") " pod="openstack-operators/swift-operator-controller-manager-cc9f5bc5c-h7svq" Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.746265 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlkkn\" (UniqueName: \"kubernetes.io/projected/562a15b4-c659-490c-88a4-1db388e0224f-kube-api-access-hlkkn\") pod \"placement-operator-controller-manager-867d87977b-5fntg\" (UID: \"562a15b4-c659-490c-88a4-1db388e0224f\") " pod="openstack-operators/placement-operator-controller-manager-867d87977b-5fntg" Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.746300 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5t7g8\" (UniqueName: \"kubernetes.io/projected/b5d110f6-5ffb-46bc-b263-e7142f463974-kube-api-access-5t7g8\") pod \"test-operator-controller-manager-77db6bf9c-hn2fj\" (UID: \"b5d110f6-5ffb-46bc-b263-e7142f463974\") " pod="openstack-operators/test-operator-controller-manager-77db6bf9c-hn2fj" Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.746355 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4q825\" (UniqueName: \"kubernetes.io/projected/301d0455-5622-4810-847e-b354cf6f9c00-kube-api-access-4q825\") pod \"ovn-operator-controller-manager-5b67cfc8fb-vkmzq\" (UID: \"301d0455-5622-4810-847e-b354cf6f9c00\") " pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-vkmzq" Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.746377 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qd46c\" (UniqueName: \"kubernetes.io/projected/ef5e108a-748a-47ab-b0fc-0a3e303a09ba-kube-api-access-qd46c\") pod \"telemetry-operator-controller-manager-58487d9bf4-tg4ph\" (UID: \"ef5e108a-748a-47ab-b0fc-0a3e303a09ba\") " pod="openstack-operators/telemetry-operator-controller-manager-58487d9bf4-tg4ph" Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.771363 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4q825\" (UniqueName: \"kubernetes.io/projected/301d0455-5622-4810-847e-b354cf6f9c00-kube-api-access-4q825\") pod \"ovn-operator-controller-manager-5b67cfc8fb-vkmzq\" (UID: \"301d0455-5622-4810-847e-b354cf6f9c00\") " pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-vkmzq" Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.772746 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlkkn\" (UniqueName: \"kubernetes.io/projected/562a15b4-c659-490c-88a4-1db388e0224f-kube-api-access-hlkkn\") pod \"placement-operator-controller-manager-867d87977b-5fntg\" (UID: \"562a15b4-c659-490c-88a4-1db388e0224f\") " pod="openstack-operators/placement-operator-controller-manager-867d87977b-5fntg" Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.785288 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6b56b8849f-m5xsr"] Nov 22 03:06:55 crc kubenswrapper[4922]: 
I1122 03:06:55.786497 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6b56b8849f-m5xsr" Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.790227 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kwj5\" (UniqueName: \"kubernetes.io/projected/e92bf172-fec5-4847-9ba1-9e3ddc58c7c3-kube-api-access-5kwj5\") pod \"swift-operator-controller-manager-cc9f5bc5c-h7svq\" (UID: \"e92bf172-fec5-4847-9ba1-9e3ddc58c7c3\") " pod="openstack-operators/swift-operator-controller-manager-cc9f5bc5c-h7svq" Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.795609 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-w86mb" Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.803906 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qd46c\" (UniqueName: \"kubernetes.io/projected/ef5e108a-748a-47ab-b0fc-0a3e303a09ba-kube-api-access-qd46c\") pod \"telemetry-operator-controller-manager-58487d9bf4-tg4ph\" (UID: \"ef5e108a-748a-47ab-b0fc-0a3e303a09ba\") " pod="openstack-operators/telemetry-operator-controller-manager-58487d9bf4-tg4ph" Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.805583 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6b56b8849f-m5xsr"] Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.853299 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5t7g8\" (UniqueName: \"kubernetes.io/projected/b5d110f6-5ffb-46bc-b263-e7142f463974-kube-api-access-5t7g8\") pod \"test-operator-controller-manager-77db6bf9c-hn2fj\" (UID: \"b5d110f6-5ffb-46bc-b263-e7142f463974\") " pod="openstack-operators/test-operator-controller-manager-77db6bf9c-hn2fj" Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.853891 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwbz8\" (UniqueName: \"kubernetes.io/projected/0bcf2061-04d3-4819-b07e-0eaaf4bb6287-kube-api-access-gwbz8\") pod \"watcher-operator-controller-manager-6b56b8849f-m5xsr\" (UID: \"0bcf2061-04d3-4819-b07e-0eaaf4bb6287\") " pod="openstack-operators/watcher-operator-controller-manager-6b56b8849f-m5xsr" Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.873037 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-748d5b5d8d-h2b67"] Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.875462 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-748d5b5d8d-h2b67" Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.879798 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.880743 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-5tv2j" Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.881252 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5t7g8\" (UniqueName: \"kubernetes.io/projected/b5d110f6-5ffb-46bc-b263-e7142f463974-kube-api-access-5t7g8\") pod \"test-operator-controller-manager-77db6bf9c-hn2fj\" (UID: \"b5d110f6-5ffb-46bc-b263-e7142f463974\") " pod="openstack-operators/test-operator-controller-manager-77db6bf9c-hn2fj" Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.886032 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-748d5b5d8d-h2b67"] Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.893173 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-cd74r"] Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.894254 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-cd74r" Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.896385 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-b7c6k" Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.914280 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-cd74r"] Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.934820 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6788cc6d75-dkfth"] Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.942987 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-vkmzq" Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.961920 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwbz8\" (UniqueName: \"kubernetes.io/projected/0bcf2061-04d3-4819-b07e-0eaaf4bb6287-kube-api-access-gwbz8\") pod \"watcher-operator-controller-manager-6b56b8849f-m5xsr\" (UID: \"0bcf2061-04d3-4819-b07e-0eaaf4bb6287\") " pod="openstack-operators/watcher-operator-controller-manager-6b56b8849f-m5xsr" Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.962068 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wt8l\" (UniqueName: \"kubernetes.io/projected/302e7f75-5537-48f3-9a19-0540310929da-kube-api-access-5wt8l\") pod \"openstack-operator-controller-manager-748d5b5d8d-h2b67\" (UID: \"302e7f75-5537-48f3-9a19-0540310929da\") " pod="openstack-operators/openstack-operator-controller-manager-748d5b5d8d-h2b67" Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.962105 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/302e7f75-5537-48f3-9a19-0540310929da-cert\") pod \"openstack-operator-controller-manager-748d5b5d8d-h2b67\" (UID: \"302e7f75-5537-48f3-9a19-0540310929da\") " pod="openstack-operators/openstack-operator-controller-manager-748d5b5d8d-h2b67" Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.984274 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-867d87977b-5fntg" Nov 22 03:06:55 crc kubenswrapper[4922]: I1122 03:06:55.990792 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwbz8\" (UniqueName: \"kubernetes.io/projected/0bcf2061-04d3-4819-b07e-0eaaf4bb6287-kube-api-access-gwbz8\") pod \"watcher-operator-controller-manager-6b56b8849f-m5xsr\" (UID: \"0bcf2061-04d3-4819-b07e-0eaaf4bb6287\") " pod="openstack-operators/watcher-operator-controller-manager-6b56b8849f-m5xsr" Nov 22 03:06:56 crc kubenswrapper[4922]: I1122 03:06:56.024945 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6788cc6d75-dkfth" event={"ID":"5db39a57-6021-466f-84e0-1fc2e8cf0da9","Type":"ContainerStarted","Data":"a497158dedd4392189a4cf640d59ad630f9c7be8e976a824d6ea02d9b3335369"} Nov 22 03:06:56 crc kubenswrapper[4922]: I1122 03:06:56.026907 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-cc9f5bc5c-h7svq" Nov 22 03:06:56 crc kubenswrapper[4922]: I1122 03:06:56.045271 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-58487d9bf4-tg4ph" Nov 22 03:06:56 crc kubenswrapper[4922]: I1122 03:06:56.059714 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-77db6bf9c-hn2fj" Nov 22 03:06:56 crc kubenswrapper[4922]: I1122 03:06:56.063229 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wt8l\" (UniqueName: \"kubernetes.io/projected/302e7f75-5537-48f3-9a19-0540310929da-kube-api-access-5wt8l\") pod \"openstack-operator-controller-manager-748d5b5d8d-h2b67\" (UID: \"302e7f75-5537-48f3-9a19-0540310929da\") " pod="openstack-operators/openstack-operator-controller-manager-748d5b5d8d-h2b67" Nov 22 03:06:56 crc kubenswrapper[4922]: I1122 03:06:56.063274 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/302e7f75-5537-48f3-9a19-0540310929da-cert\") pod \"openstack-operator-controller-manager-748d5b5d8d-h2b67\" (UID: \"302e7f75-5537-48f3-9a19-0540310929da\") " pod="openstack-operators/openstack-operator-controller-manager-748d5b5d8d-h2b67" Nov 22 03:06:56 crc kubenswrapper[4922]: I1122 03:06:56.063342 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lx6sb\" (UniqueName: \"kubernetes.io/projected/261901bb-e399-412c-a57f-4fefa2a3bfc0-kube-api-access-lx6sb\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-cd74r\" (UID: \"261901bb-e399-412c-a57f-4fefa2a3bfc0\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-cd74r" Nov 22 03:06:56 crc kubenswrapper[4922]: E1122 03:06:56.063669 4922 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 22 03:06:56 crc kubenswrapper[4922]: E1122 03:06:56.063707 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/302e7f75-5537-48f3-9a19-0540310929da-cert podName:302e7f75-5537-48f3-9a19-0540310929da nodeName:}" failed. No retries permitted until 2025-11-22 03:06:56.563691303 +0000 UTC m=+852.602213195 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/302e7f75-5537-48f3-9a19-0540310929da-cert") pod "openstack-operator-controller-manager-748d5b5d8d-h2b67" (UID: "302e7f75-5537-48f3-9a19-0540310929da") : secret "webhook-server-cert" not found Nov 22 03:06:56 crc kubenswrapper[4922]: I1122 03:06:56.069297 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-864d88ccf8-vc6p7"] Nov 22 03:06:56 crc kubenswrapper[4922]: I1122 03:06:56.085051 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wt8l\" (UniqueName: \"kubernetes.io/projected/302e7f75-5537-48f3-9a19-0540310929da-kube-api-access-5wt8l\") pod \"openstack-operator-controller-manager-748d5b5d8d-h2b67\" (UID: \"302e7f75-5537-48f3-9a19-0540310929da\") " pod="openstack-operators/openstack-operator-controller-manager-748d5b5d8d-h2b67" Nov 22 03:06:56 crc kubenswrapper[4922]: I1122 03:06:56.123790 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6b56b8849f-m5xsr" Nov 22 03:06:56 crc kubenswrapper[4922]: I1122 03:06:56.165864 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4134b0be-83c2-452c-a09e-6a699543d2c0-cert\") pod \"openstack-baremetal-operator-controller-manager-77868f484-dsmmt\" (UID: \"4134b0be-83c2-452c-a09e-6a699543d2c0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-dsmmt" Nov 22 03:06:56 crc kubenswrapper[4922]: I1122 03:06:56.165946 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lx6sb\" (UniqueName: \"kubernetes.io/projected/261901bb-e399-412c-a57f-4fefa2a3bfc0-kube-api-access-lx6sb\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-cd74r\" (UID: \"261901bb-e399-412c-a57f-4fefa2a3bfc0\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-cd74r" Nov 22 03:06:56 crc kubenswrapper[4922]: I1122 03:06:56.174670 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4134b0be-83c2-452c-a09e-6a699543d2c0-cert\") pod \"openstack-baremetal-operator-controller-manager-77868f484-dsmmt\" (UID: \"4134b0be-83c2-452c-a09e-6a699543d2c0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-dsmmt" Nov 22 03:06:56 crc kubenswrapper[4922]: I1122 03:06:56.182577 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lx6sb\" (UniqueName: \"kubernetes.io/projected/261901bb-e399-412c-a57f-4fefa2a3bfc0-kube-api-access-lx6sb\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-cd74r\" (UID: \"261901bb-e399-412c-a57f-4fefa2a3bfc0\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-cd74r" Nov 22 03:06:56 crc kubenswrapper[4922]: I1122 03:06:56.276613 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-cd74r" Nov 22 03:06:56 crc kubenswrapper[4922]: I1122 03:06:56.323415 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-698d6fd7d6-rxzs6"] Nov 22 03:06:56 crc kubenswrapper[4922]: W1122 03:06:56.337487 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e52f719_cfcb_48d8_a83f_1bcddb08e6bd.slice/crio-34dbe418d50b8a07fcee4c1ae813ef506f482f4649a98a63679350937c94555c WatchSource:0}: Error finding container 34dbe418d50b8a07fcee4c1ae813ef506f482f4649a98a63679350937c94555c: Status 404 returned error can't find the container with id 34dbe418d50b8a07fcee4c1ae813ef506f482f4649a98a63679350937c94555c Nov 22 03:06:56 crc kubenswrapper[4922]: I1122 03:06:56.411463 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-dsmmt" Nov 22 03:06:56 crc kubenswrapper[4922]: I1122 03:06:56.573428 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/302e7f75-5537-48f3-9a19-0540310929da-cert\") pod \"openstack-operator-controller-manager-748d5b5d8d-h2b67\" (UID: \"302e7f75-5537-48f3-9a19-0540310929da\") " pod="openstack-operators/openstack-operator-controller-manager-748d5b5d8d-h2b67" Nov 22 03:06:56 crc kubenswrapper[4922]: I1122 03:06:56.577782 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-6c55d8d69b-twhcc"] Nov 22 03:06:56 crc kubenswrapper[4922]: W1122 03:06:56.579640 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec6a579d_cf65_4b02_a891_ca17161e6585.slice/crio-5fbe628795cfecf74f6cf19d2935235748e7bd39996a633b58c143cc74e3b382 WatchSource:0}: Error finding container 5fbe628795cfecf74f6cf19d2935235748e7bd39996a633b58c143cc74e3b382: Status 404 returned error can't find the container with id 5fbe628795cfecf74f6cf19d2935235748e7bd39996a633b58c143cc74e3b382 Nov 22 03:06:56 crc kubenswrapper[4922]: I1122 03:06:56.584261 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/302e7f75-5537-48f3-9a19-0540310929da-cert\") pod \"openstack-operator-controller-manager-748d5b5d8d-h2b67\" (UID: \"302e7f75-5537-48f3-9a19-0540310929da\") " pod="openstack-operators/openstack-operator-controller-manager-748d5b5d8d-h2b67" Nov 22 03:06:56 crc kubenswrapper[4922]: I1122 03:06:56.589261 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7d6f5d799-5szbr"] Nov 22 03:06:56 crc kubenswrapper[4922]: I1122 03:06:56.597359 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-646fd589f9-f7bz2"] Nov 22 03:06:56 crc kubenswrapper[4922]: W1122 03:06:56.605658 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4bc61f3e_c538_4a90_84da_7cd4760621f1.slice/crio-d9e69bc8b2b54ea85ba05fe37bbb6429b9f9b62953fd0b66db4c5583c97fd52f WatchSource:0}: Error finding container d9e69bc8b2b54ea85ba05fe37bbb6429b9f9b62953fd0b66db4c5583c97fd52f: Status 404 returned error can't find the container with id d9e69bc8b2b54ea85ba05fe37bbb6429b9f9b62953fd0b66db4c5583c97fd52f Nov 22 03:06:56 crc kubenswrapper[4922]: I1122 03:06:56.609621 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-7d5d9fd47f-v6hmr"] Nov 22 03:06:56 crc kubenswrapper[4922]: W1122 03:06:56.612761 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podddb44566_f024_43bd_8bc8_7b497606baa7.slice/crio-1eaa2c39937522ea7c73451cdb0f1b53c51f704987173b25507f97358f8c8b0d WatchSource:0}: Error finding container 1eaa2c39937522ea7c73451cdb0f1b53c51f704987173b25507f97358f8c8b0d: Status 404 returned error can't find the container with id 1eaa2c39937522ea7c73451cdb0f1b53c51f704987173b25507f97358f8c8b0d Nov 22 03:06:56 crc kubenswrapper[4922]: I1122 03:06:56.622650 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/barbican-operator-controller-manager-5bfbbb859d-4lfxl"] Nov 22 03:06:56 crc kubenswrapper[4922]: W1122 03:06:56.625288 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod181257ea_c4b9_4370_80b5_7f52ed557c33.slice/crio-b03e7f65bb33f642ef122171456f33d2dfaa9cfa65d7fd7f2050029e47d9de01 WatchSource:0}: Error finding container b03e7f65bb33f642ef122171456f33d2dfaa9cfa65d7fd7f2050029e47d9de01: Status 404 returned error can't find the container with id b03e7f65bb33f642ef122171456f33d2dfaa9cfa65d7fd7f2050029e47d9de01 Nov 22 03:06:56 crc kubenswrapper[4922]: W1122 03:06:56.626587 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1cc9e6ed_fd1a_4280_8867_c8fbd326ca14.slice/crio-28503ead2c38e9729a35a15e858e6c3ae770ae48f52152581472ed051921a46c WatchSource:0}: Error finding container 28503ead2c38e9729a35a15e858e6c3ae770ae48f52152581472ed051921a46c: Status 404 returned error can't find the container with id 28503ead2c38e9729a35a15e858e6c3ae770ae48f52152581472ed051921a46c Nov 22 03:06:56 crc kubenswrapper[4922]: I1122 03:06:56.630339 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-6bd966bbd4-g9fbb"] Nov 22 03:06:56 crc kubenswrapper[4922]: I1122 03:06:56.793665 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-64d7c556cd-h7jpp"] Nov 22 03:06:56 crc kubenswrapper[4922]: I1122 03:06:56.801190 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-79d658b66d-2ldtk"] Nov 22 03:06:56 crc kubenswrapper[4922]: I1122 03:06:56.811225 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-54485f899-vg7qw"] Nov 22 03:06:56 crc kubenswrapper[4922]: I1122 03:06:56.822611 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6b6c55ffd5-nlwm2"] Nov 22 03:06:56 crc kubenswrapper[4922]: W1122 03:06:56.828761 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf5db45c_dbc8_4774_ad28_dc4aaa24b9fd.slice/crio-3f88d763b946fc24c3973c5058c8ce4025523c8eea610838d210fa102ca2ceeb WatchSource:0}: Error finding container 3f88d763b946fc24c3973c5058c8ce4025523c8eea610838d210fa102ca2ceeb: Status 404 returned error can't find the container with id 3f88d763b946fc24c3973c5058c8ce4025523c8eea610838d210fa102ca2ceeb Nov 22 03:06:56 crc kubenswrapper[4922]: I1122 03:06:56.829574 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7979c68bc7-7fxg5"] Nov 22 03:06:56 crc kubenswrapper[4922]: W1122 03:06:56.830218 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod225f6b3a_93d7_46d9_99a1_d9787b4921fb.slice/crio-90152b9c83f92ab4da88e4f9a4ac7f9e3f3edc6b8741003de1ccb3c8907f32cb WatchSource:0}: Error finding container 90152b9c83f92ab4da88e4f9a4ac7f9e3f3edc6b8741003de1ccb3c8907f32cb: Status 404 returned error can't find the container with id 90152b9c83f92ab4da88e4f9a4ac7f9e3f3edc6b8741003de1ccb3c8907f32cb Nov 22 03:06:56 crc kubenswrapper[4922]: W1122 03:06:56.831604 4922 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4755f19_1e55_41bb_be1e_b4b868c48cc1.slice/crio-d2beaf2c1d0e06fded4fa1f7b908c15fbefc561ecbb8f5c437f99dd76a719d2e WatchSource:0}: Error finding container d2beaf2c1d0e06fded4fa1f7b908c15fbefc561ecbb8f5c437f99dd76a719d2e: Status 404 returned error can't find the container with id d2beaf2c1d0e06fded4fa1f7b908c15fbefc561ecbb8f5c437f99dd76a719d2e Nov 22 03:06:56 crc kubenswrapper[4922]: W1122 03:06:56.835526 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f9c4cd6_8ab3_4895_ab12_74dce3828cf8.slice/crio-81287d085f4a6fb09f9b44bf0b05ddecaf4b8b6c5e6ee0d8c214f6942f3e0e8c WatchSource:0}: Error finding container 81287d085f4a6fb09f9b44bf0b05ddecaf4b8b6c5e6ee0d8c214f6942f3e0e8c: Status 404 returned error can't find the container with id 81287d085f4a6fb09f9b44bf0b05ddecaf4b8b6c5e6ee0d8c214f6942f3e0e8c Nov 22 03:06:56 crc kubenswrapper[4922]: I1122 03:06:56.851659 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-748d5b5d8d-h2b67" Nov 22 03:06:57 crc kubenswrapper[4922]: I1122 03:06:57.035586 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-6c55d8d69b-twhcc" event={"ID":"ec6a579d-cf65-4b02-a891-ca17161e6585","Type":"ContainerStarted","Data":"5fbe628795cfecf74f6cf19d2935235748e7bd39996a633b58c143cc74e3b382"} Nov 22 03:06:57 crc kubenswrapper[4922]: I1122 03:06:57.037261 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7d6f5d799-5szbr" event={"ID":"4bc61f3e-c538-4a90-84da-7cd4760621f1","Type":"ContainerStarted","Data":"d9e69bc8b2b54ea85ba05fe37bbb6429b9f9b62953fd0b66db4c5583c97fd52f"} Nov 22 03:06:57 crc kubenswrapper[4922]: I1122 03:06:57.039383 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5bfbbb859d-4lfxl" event={"ID":"181257ea-c4b9-4370-80b5-7f52ed557c33","Type":"ContainerStarted","Data":"b03e7f65bb33f642ef122171456f33d2dfaa9cfa65d7fd7f2050029e47d9de01"} Nov 22 03:06:57 crc kubenswrapper[4922]: I1122 03:06:57.041027 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6b6c55ffd5-nlwm2" event={"ID":"c4755f19-1e55-41bb-be1e-b4b868c48cc1","Type":"ContainerStarted","Data":"d2beaf2c1d0e06fded4fa1f7b908c15fbefc561ecbb8f5c437f99dd76a719d2e"} Nov 22 03:06:57 crc kubenswrapper[4922]: I1122 03:06:57.042481 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-54485f899-vg7qw" event={"ID":"0f9c4cd6-8ab3-4895-ab12-74dce3828cf8","Type":"ContainerStarted","Data":"81287d085f4a6fb09f9b44bf0b05ddecaf4b8b6c5e6ee0d8c214f6942f3e0e8c"} Nov 22 03:06:57 crc kubenswrapper[4922]: I1122 03:06:57.044006 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-698d6fd7d6-rxzs6" event={"ID":"3e52f719-cfcb-48d8-a83f-1bcddb08e6bd","Type":"ContainerStarted","Data":"34dbe418d50b8a07fcee4c1ae813ef506f482f4649a98a63679350937c94555c"} Nov 22 03:06:57 crc kubenswrapper[4922]: I1122 03:06:57.046760 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-864d88ccf8-vc6p7" 
event={"ID":"8b206938-2e76-40b1-b39c-ff333430e8f6","Type":"ContainerStarted","Data":"70d1035b8f1066e95a0f82f00559d48f7baf8f4224cfa2b6a57b1f822cb94489"} Nov 22 03:06:57 crc kubenswrapper[4922]: I1122 03:06:57.048586 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7979c68bc7-7fxg5" event={"ID":"225f6b3a-93d7-46d9-99a1-d9787b4921fb","Type":"ContainerStarted","Data":"90152b9c83f92ab4da88e4f9a4ac7f9e3f3edc6b8741003de1ccb3c8907f32cb"} Nov 22 03:06:57 crc kubenswrapper[4922]: I1122 03:06:57.049757 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-6bd966bbd4-g9fbb" event={"ID":"1cc9e6ed-fd1a-4280-8867-c8fbd326ca14","Type":"ContainerStarted","Data":"28503ead2c38e9729a35a15e858e6c3ae770ae48f52152581472ed051921a46c"} Nov 22 03:06:57 crc kubenswrapper[4922]: I1122 03:06:57.052772 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79d658b66d-2ldtk" event={"ID":"cf5db45c-dbc8-4774-ad28-dc4aaa24b9fd","Type":"ContainerStarted","Data":"3f88d763b946fc24c3973c5058c8ce4025523c8eea610838d210fa102ca2ceeb"} Nov 22 03:06:57 crc kubenswrapper[4922]: I1122 03:06:57.056704 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-7d5d9fd47f-v6hmr" event={"ID":"8262c40a-af33-42fb-9347-e5d84f97a20d","Type":"ContainerStarted","Data":"f5a96e606569f052055a0b2046f3b621bb0664d2689dc169884ec3e0b674b494"} Nov 22 03:06:57 crc kubenswrapper[4922]: I1122 03:06:57.058202 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-h7jpp" event={"ID":"459b1df7-6ed9-4ef4-bb71-aa7e82001d5a","Type":"ContainerStarted","Data":"382f79066917d3c2975281b7b8f27e871d690a3a8336f4a295d02298eb0edc9d"} Nov 22 03:06:57 crc kubenswrapper[4922]: I1122 03:06:57.061636 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-646fd589f9-f7bz2" event={"ID":"ddb44566-f024-43bd-8bc8-7b497606baa7","Type":"ContainerStarted","Data":"1eaa2c39937522ea7c73451cdb0f1b53c51f704987173b25507f97358f8c8b0d"} Nov 22 03:06:57 crc kubenswrapper[4922]: I1122 03:06:57.193073 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-cc9f5bc5c-h7svq"] Nov 22 03:06:57 crc kubenswrapper[4922]: I1122 03:06:57.205733 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-867d87977b-5fntg"] Nov 22 03:06:57 crc kubenswrapper[4922]: I1122 03:06:57.210690 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-cd74r"] Nov 22 03:06:57 crc kubenswrapper[4922]: I1122 03:06:57.222323 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-vkmzq"] Nov 22 03:06:57 crc kubenswrapper[4922]: I1122 03:06:57.227390 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6b56b8849f-m5xsr"] Nov 22 03:06:57 crc kubenswrapper[4922]: I1122 03:06:57.233903 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-58487d9bf4-tg4ph"] Nov 22 03:06:57 crc kubenswrapper[4922]: I1122 03:06:57.237881 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/test-operator-controller-manager-77db6bf9c-hn2fj"] Nov 22 03:06:57 crc kubenswrapper[4922]: E1122 03:06:57.247473 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lx6sb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-5f97d8c699-cd74r_openstack-operators(261901bb-e399-412c-a57f-4fefa2a3bfc0): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 22 03:06:57 crc kubenswrapper[4922]: I1122 03:06:57.248585 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-dsmmt"] Nov 22 03:06:57 crc kubenswrapper[4922]: E1122 03:06:57.248655 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-cd74r" podUID="261901bb-e399-412c-a57f-4fefa2a3bfc0" Nov 22 03:06:57 crc kubenswrapper[4922]: E1122 03:06:57.261616 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:624b77b1b44f5e72a6c7d5910b04eb8070c499f83dcf364fb9dc5f2f8cb83c85,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} 
{} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5t7g8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-77db6bf9c-hn2fj_openstack-operators(b5d110f6-5ffb-46bc-b263-e7142f463974): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 22 03:06:57 crc kubenswrapper[4922]: E1122 03:06:57.263577 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:66928f0eae5206f671ac7b21f79953e37009c54187d768dc6e03fe0a3d202b3b,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter:v0.15.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLOUDKITTY_API_IMAGE_URL_DEFAULT,Value:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLOUDKITTY_PROC_IMAGE_URL_DEFAULT,Value:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-processor:current,ValueFrom:nil,},
EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler:release-0.7.12,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter:v1.5.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter:v1.10.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn:current-podified,ValueFrom:nil,},
EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api:current-podified,ValueFrom:nil,},
EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine:current-podified,ValueFrom:nil,},},
Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6nwtd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-77868f484-dsmmt_openstack-operators(4134b0be-83c2-452c-a09e-6a699543d2c0): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Nov 22 03:06:57 crc kubenswrapper[4922]: E1122 03:06:57.264238 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:7477e2fea70c83cfca71e1ece83bc6fdab55e890db711b0110817a5afd97c591,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qd46c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-58487d9bf4-tg4ph_openstack-operators(ef5e108a-748a-47ab-b0fc-0a3e303a09ba): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 22 03:06:57 crc kubenswrapper[4922]: E1122 03:06:57.264420 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:fd917de0cf800ec284ee0c3f2906a06d85ea18cb75a5b06c8eb305750467986d,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hlkkn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-867d87977b-5fntg_openstack-operators(562a15b4-c659-490c-88a4-1db388e0224f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 22 03:06:57 crc kubenswrapper[4922]: E1122 03:06:57.292395 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:bc58f62c7171e9c9216fdeafbd170917b638e6c3f842031ee254f1389c57a09e,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5kwj5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-cc9f5bc5c-h7svq_openstack-operators(e92bf172-fec5-4847-9ba1-9e3ddc58c7c3): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 22 03:06:57 crc kubenswrapper[4922]: I1122 03:06:57.346219 4922 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-748d5b5d8d-h2b67"] Nov 22 03:06:57 crc kubenswrapper[4922]: W1122 03:06:57.370283 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod302e7f75_5537_48f3_9a19_0540310929da.slice/crio-10d5f8e301b625dbf363d97b91ab54da53783f9f9231c919c738ef5321492446 WatchSource:0}: Error finding container 10d5f8e301b625dbf363d97b91ab54da53783f9f9231c919c738ef5321492446: Status 404 returned error can't find the container with id 10d5f8e301b625dbf363d97b91ab54da53783f9f9231c919c738ef5321492446 Nov 22 03:06:57 crc kubenswrapper[4922]: E1122 03:06:57.681025 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-58487d9bf4-tg4ph" podUID="ef5e108a-748a-47ab-b0fc-0a3e303a09ba" Nov 22 03:06:57 crc kubenswrapper[4922]: E1122 03:06:57.681047 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-dsmmt" podUID="4134b0be-83c2-452c-a09e-6a699543d2c0" Nov 22 03:06:57 crc kubenswrapper[4922]: E1122 03:06:57.684686 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-cc9f5bc5c-h7svq" podUID="e92bf172-fec5-4847-9ba1-9e3ddc58c7c3" Nov 22 03:06:57 crc kubenswrapper[4922]: E1122 03:06:57.694636 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-77db6bf9c-hn2fj" podUID="b5d110f6-5ffb-46bc-b263-e7142f463974" Nov 22 03:06:57 crc kubenswrapper[4922]: E1122 03:06:57.707506 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-867d87977b-5fntg" podUID="562a15b4-c659-490c-88a4-1db388e0224f" Nov 22 03:06:58 crc kubenswrapper[4922]: I1122 03:06:58.085804 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-58487d9bf4-tg4ph" event={"ID":"ef5e108a-748a-47ab-b0fc-0a3e303a09ba","Type":"ContainerStarted","Data":"1850a3a0a1e1f9ba27fddb5a477191186de0322ed8df4cb73242b7da45000846"} Nov 22 03:06:58 crc kubenswrapper[4922]: I1122 03:06:58.085864 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-58487d9bf4-tg4ph" event={"ID":"ef5e108a-748a-47ab-b0fc-0a3e303a09ba","Type":"ContainerStarted","Data":"6dc6a352dd6bb7928a1ebaef035a65869226aa81dfb560144063cbbf06db7d34"} Nov 22 03:06:58 crc kubenswrapper[4922]: E1122 03:06:58.090040 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7477e2fea70c83cfca71e1ece83bc6fdab55e890db711b0110817a5afd97c591\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-58487d9bf4-tg4ph" podUID="ef5e108a-748a-47ab-b0fc-0a3e303a09ba" Nov 22 03:06:58 
crc kubenswrapper[4922]: I1122 03:06:58.094351 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-dsmmt" event={"ID":"4134b0be-83c2-452c-a09e-6a699543d2c0","Type":"ContainerStarted","Data":"53cbaf481257dfa5a9e4df4f09fb0306a1912aad93e8af0398fa9c36e9f05d6f"} Nov 22 03:06:58 crc kubenswrapper[4922]: I1122 03:06:58.094408 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-dsmmt" event={"ID":"4134b0be-83c2-452c-a09e-6a699543d2c0","Type":"ContainerStarted","Data":"efca479e15fa66bc0c4cc2865441584449ebc57a8264a02809ad2e50c951a578"} Nov 22 03:06:58 crc kubenswrapper[4922]: E1122 03:06:58.096637 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:66928f0eae5206f671ac7b21f79953e37009c54187d768dc6e03fe0a3d202b3b\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-dsmmt" podUID="4134b0be-83c2-452c-a09e-6a699543d2c0" Nov 22 03:06:58 crc kubenswrapper[4922]: I1122 03:06:58.098600 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-867d87977b-5fntg" event={"ID":"562a15b4-c659-490c-88a4-1db388e0224f","Type":"ContainerStarted","Data":"8ba21876ad1b569b9f9bc47f24a1d66be14630de604da7d655b46cecdf708d1a"} Nov 22 03:06:58 crc kubenswrapper[4922]: I1122 03:06:58.098638 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-867d87977b-5fntg" event={"ID":"562a15b4-c659-490c-88a4-1db388e0224f","Type":"ContainerStarted","Data":"51dd1363773d9956c5bfba9133841662c738610a0d15d4aeec5241ce264c578a"} Nov 22 03:06:58 crc kubenswrapper[4922]: E1122 03:06:58.100577 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:fd917de0cf800ec284ee0c3f2906a06d85ea18cb75a5b06c8eb305750467986d\\\"\"" pod="openstack-operators/placement-operator-controller-manager-867d87977b-5fntg" podUID="562a15b4-c659-490c-88a4-1db388e0224f" Nov 22 03:06:58 crc kubenswrapper[4922]: I1122 03:06:58.111954 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-vkmzq" event={"ID":"301d0455-5622-4810-847e-b354cf6f9c00","Type":"ContainerStarted","Data":"94045fad92bc74b6281f0f7e08adec64f211915317089b4e444d333b0335c141"} Nov 22 03:06:58 crc kubenswrapper[4922]: I1122 03:06:58.121295 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6b56b8849f-m5xsr" event={"ID":"0bcf2061-04d3-4819-b07e-0eaaf4bb6287","Type":"ContainerStarted","Data":"b1eed7fb844220d3fc6b127390e04e77a460090c2ae038356a527cf14b6931c9"} Nov 22 03:06:58 crc kubenswrapper[4922]: I1122 03:06:58.128022 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-748d5b5d8d-h2b67" event={"ID":"302e7f75-5537-48f3-9a19-0540310929da","Type":"ContainerStarted","Data":"ec1b28c8ce30170e1cbb5ef0ea667377b1ef25e89a930676182fc86423da4c23"} Nov 22 03:06:58 crc kubenswrapper[4922]: I1122 03:06:58.128061 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-controller-manager-748d5b5d8d-h2b67" event={"ID":"302e7f75-5537-48f3-9a19-0540310929da","Type":"ContainerStarted","Data":"20c469c08b92cf12a7c94dfc3d61db8c4acd137210968ced7275bc4a4971e0be"} Nov 22 03:06:58 crc kubenswrapper[4922]: I1122 03:06:58.128071 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-748d5b5d8d-h2b67" event={"ID":"302e7f75-5537-48f3-9a19-0540310929da","Type":"ContainerStarted","Data":"10d5f8e301b625dbf363d97b91ab54da53783f9f9231c919c738ef5321492446"} Nov 22 03:06:58 crc kubenswrapper[4922]: I1122 03:06:58.128825 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-748d5b5d8d-h2b67" Nov 22 03:06:58 crc kubenswrapper[4922]: I1122 03:06:58.133248 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-77db6bf9c-hn2fj" event={"ID":"b5d110f6-5ffb-46bc-b263-e7142f463974","Type":"ContainerStarted","Data":"f300a58b1692d4ee27a2fa34d5f16b640123987e5e7652274276901f170e7df5"} Nov 22 03:06:58 crc kubenswrapper[4922]: I1122 03:06:58.133280 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-77db6bf9c-hn2fj" event={"ID":"b5d110f6-5ffb-46bc-b263-e7142f463974","Type":"ContainerStarted","Data":"59df2bcbd2bcd5a6c2c7aa06b12d4987810130960361709a88e1ebf526ab1115"} Nov 22 03:06:58 crc kubenswrapper[4922]: E1122 03:06:58.134616 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:624b77b1b44f5e72a6c7d5910b04eb8070c499f83dcf364fb9dc5f2f8cb83c85\\\"\"" pod="openstack-operators/test-operator-controller-manager-77db6bf9c-hn2fj" podUID="b5d110f6-5ffb-46bc-b263-e7142f463974" Nov 22 03:06:58 crc kubenswrapper[4922]: I1122 03:06:58.154350 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-cc9f5bc5c-h7svq" event={"ID":"e92bf172-fec5-4847-9ba1-9e3ddc58c7c3","Type":"ContainerStarted","Data":"5c9e9c7f4e5cda2a8310aaf8c748880815c3a1c766b338d7ca23d6f5fcc62b46"} Nov 22 03:06:58 crc kubenswrapper[4922]: I1122 03:06:58.154395 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-cc9f5bc5c-h7svq" event={"ID":"e92bf172-fec5-4847-9ba1-9e3ddc58c7c3","Type":"ContainerStarted","Data":"fff5d4fc249b532c7d7fb409fdd5416b3334537a8267fb1fba80422ed9b38f18"} Nov 22 03:06:58 crc kubenswrapper[4922]: E1122 03:06:58.167753 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:bc58f62c7171e9c9216fdeafbd170917b638e6c3f842031ee254f1389c57a09e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-cc9f5bc5c-h7svq" podUID="e92bf172-fec5-4847-9ba1-9e3ddc58c7c3" Nov 22 03:06:58 crc kubenswrapper[4922]: I1122 03:06:58.185743 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-cd74r" event={"ID":"261901bb-e399-412c-a57f-4fefa2a3bfc0","Type":"ContainerStarted","Data":"6e2878e47e58d71cdcec2e0bef98f05c8a2d1d08c3f9724a86add00c55106a74"} Nov 22 03:06:58 crc kubenswrapper[4922]: E1122 03:06:58.187098 4922 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-cd74r" podUID="261901bb-e399-412c-a57f-4fefa2a3bfc0" Nov 22 03:06:58 crc kubenswrapper[4922]: I1122 03:06:58.214592 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-748d5b5d8d-h2b67" podStartSLOduration=3.214573301 podStartE2EDuration="3.214573301s" podCreationTimestamp="2025-11-22 03:06:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:06:58.207120682 +0000 UTC m=+854.245642574" watchObservedRunningTime="2025-11-22 03:06:58.214573301 +0000 UTC m=+854.253095193" Nov 22 03:06:59 crc kubenswrapper[4922]: E1122 03:06:59.198201 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:66928f0eae5206f671ac7b21f79953e37009c54187d768dc6e03fe0a3d202b3b\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-dsmmt" podUID="4134b0be-83c2-452c-a09e-6a699543d2c0" Nov 22 03:06:59 crc kubenswrapper[4922]: E1122 03:06:59.198598 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:bc58f62c7171e9c9216fdeafbd170917b638e6c3f842031ee254f1389c57a09e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-cc9f5bc5c-h7svq" podUID="e92bf172-fec5-4847-9ba1-9e3ddc58c7c3" Nov 22 03:06:59 crc kubenswrapper[4922]: E1122 03:06:59.198640 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7477e2fea70c83cfca71e1ece83bc6fdab55e890db711b0110817a5afd97c591\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-58487d9bf4-tg4ph" podUID="ef5e108a-748a-47ab-b0fc-0a3e303a09ba" Nov 22 03:06:59 crc kubenswrapper[4922]: E1122 03:06:59.198674 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:fd917de0cf800ec284ee0c3f2906a06d85ea18cb75a5b06c8eb305750467986d\\\"\"" pod="openstack-operators/placement-operator-controller-manager-867d87977b-5fntg" podUID="562a15b4-c659-490c-88a4-1db388e0224f" Nov 22 03:06:59 crc kubenswrapper[4922]: E1122 03:06:59.198720 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-cd74r" podUID="261901bb-e399-412c-a57f-4fefa2a3bfc0" Nov 22 03:06:59 crc kubenswrapper[4922]: E1122 03:06:59.198755 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:624b77b1b44f5e72a6c7d5910b04eb8070c499f83dcf364fb9dc5f2f8cb83c85\\\"\"" pod="openstack-operators/test-operator-controller-manager-77db6bf9c-hn2fj" podUID="b5d110f6-5ffb-46bc-b263-e7142f463974" Nov 22 03:07:03 crc kubenswrapper[4922]: I1122 03:07:03.855638 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ldtxd"] Nov 22 03:07:03 crc kubenswrapper[4922]: I1122 03:07:03.858511 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ldtxd" Nov 22 03:07:03 crc kubenswrapper[4922]: I1122 03:07:03.860260 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ldtxd"] Nov 22 03:07:03 crc kubenswrapper[4922]: I1122 03:07:03.912423 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3748cc7-fc66-4422-b941-8ac33d7a1ac0-catalog-content\") pod \"redhat-operators-ldtxd\" (UID: \"d3748cc7-fc66-4422-b941-8ac33d7a1ac0\") " pod="openshift-marketplace/redhat-operators-ldtxd" Nov 22 03:07:03 crc kubenswrapper[4922]: I1122 03:07:03.912624 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3748cc7-fc66-4422-b941-8ac33d7a1ac0-utilities\") pod \"redhat-operators-ldtxd\" (UID: \"d3748cc7-fc66-4422-b941-8ac33d7a1ac0\") " pod="openshift-marketplace/redhat-operators-ldtxd" Nov 22 03:07:03 crc kubenswrapper[4922]: I1122 03:07:03.912666 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pzzq\" (UniqueName: \"kubernetes.io/projected/d3748cc7-fc66-4422-b941-8ac33d7a1ac0-kube-api-access-4pzzq\") pod \"redhat-operators-ldtxd\" (UID: \"d3748cc7-fc66-4422-b941-8ac33d7a1ac0\") " pod="openshift-marketplace/redhat-operators-ldtxd" Nov 22 03:07:04 crc kubenswrapper[4922]: I1122 03:07:04.014089 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3748cc7-fc66-4422-b941-8ac33d7a1ac0-utilities\") pod \"redhat-operators-ldtxd\" (UID: \"d3748cc7-fc66-4422-b941-8ac33d7a1ac0\") " pod="openshift-marketplace/redhat-operators-ldtxd" Nov 22 03:07:04 crc kubenswrapper[4922]: I1122 03:07:04.014168 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pzzq\" (UniqueName: \"kubernetes.io/projected/d3748cc7-fc66-4422-b941-8ac33d7a1ac0-kube-api-access-4pzzq\") pod \"redhat-operators-ldtxd\" (UID: \"d3748cc7-fc66-4422-b941-8ac33d7a1ac0\") " pod="openshift-marketplace/redhat-operators-ldtxd" Nov 22 03:07:04 crc kubenswrapper[4922]: I1122 03:07:04.014205 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3748cc7-fc66-4422-b941-8ac33d7a1ac0-catalog-content\") pod \"redhat-operators-ldtxd\" (UID: \"d3748cc7-fc66-4422-b941-8ac33d7a1ac0\") " pod="openshift-marketplace/redhat-operators-ldtxd" Nov 22 03:07:04 crc kubenswrapper[4922]: I1122 03:07:04.014787 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3748cc7-fc66-4422-b941-8ac33d7a1ac0-catalog-content\") pod \"redhat-operators-ldtxd\" (UID: 
\"d3748cc7-fc66-4422-b941-8ac33d7a1ac0\") " pod="openshift-marketplace/redhat-operators-ldtxd" Nov 22 03:07:04 crc kubenswrapper[4922]: I1122 03:07:04.015203 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3748cc7-fc66-4422-b941-8ac33d7a1ac0-utilities\") pod \"redhat-operators-ldtxd\" (UID: \"d3748cc7-fc66-4422-b941-8ac33d7a1ac0\") " pod="openshift-marketplace/redhat-operators-ldtxd" Nov 22 03:07:04 crc kubenswrapper[4922]: I1122 03:07:04.051179 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pzzq\" (UniqueName: \"kubernetes.io/projected/d3748cc7-fc66-4422-b941-8ac33d7a1ac0-kube-api-access-4pzzq\") pod \"redhat-operators-ldtxd\" (UID: \"d3748cc7-fc66-4422-b941-8ac33d7a1ac0\") " pod="openshift-marketplace/redhat-operators-ldtxd" Nov 22 03:07:04 crc kubenswrapper[4922]: I1122 03:07:04.188777 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ldtxd" Nov 22 03:07:06 crc kubenswrapper[4922]: I1122 03:07:06.860395 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-748d5b5d8d-h2b67" Nov 22 03:07:09 crc kubenswrapper[4922]: E1122 03:07:09.520149 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:1988aaf9cd245150cda123aaaa21718ccb552c47f1623b7d68804f13c47f2c6a" Nov 22 03:07:09 crc kubenswrapper[4922]: E1122 03:07:09.520705 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:1988aaf9cd245150cda123aaaa21718ccb552c47f1623b7d68804f13c47f2c6a,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gwbz8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6b56b8849f-m5xsr_openstack-operators(0bcf2061-04d3-4819-b07e-0eaaf4bb6287): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 22 03:07:10 crc kubenswrapper[4922]: E1122 03:07:10.300022 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.66:5001/openstack-k8s-operators/cinder-operator:db6ca023b9893fe1ac2b2c2a06357543da4ee150" Nov 22 03:07:10 crc kubenswrapper[4922]: E1122 03:07:10.300534 4922 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.66:5001/openstack-k8s-operators/cinder-operator:db6ca023b9893fe1ac2b2c2a06357543da4ee150" Nov 22 03:07:10 crc kubenswrapper[4922]: E1122 03:07:10.300792 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.66:5001/openstack-k8s-operators/cinder-operator:db6ca023b9893fe1ac2b2c2a06357543da4ee150,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-b6bw4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-864d88ccf8-vc6p7_openstack-operators(8b206938-2e76-40b1-b39c-ff333430e8f6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 22 03:07:10 crc kubenswrapper[4922]: E1122 03:07:10.613544 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-6b56b8849f-m5xsr" podUID="0bcf2061-04d3-4819-b07e-0eaaf4bb6287" Nov 22 03:07:10 crc kubenswrapper[4922]: I1122 03:07:10.790948 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ldtxd"] Nov 22 03:07:10 crc kubenswrapper[4922]: W1122 03:07:10.816995 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3748cc7_fc66_4422_b941_8ac33d7a1ac0.slice/crio-dc81d9981ab0261db7c43c5b789ab323897880d8ee51009c0712f010fff1440f WatchSource:0}: Error finding container dc81d9981ab0261db7c43c5b789ab323897880d8ee51009c0712f010fff1440f: Status 404 returned error can't find the container with id dc81d9981ab0261db7c43c5b789ab323897880d8ee51009c0712f010fff1440f Nov 22 03:07:11 crc kubenswrapper[4922]: I1122 03:07:11.109963 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 03:07:11 crc kubenswrapper[4922]: I1122 03:07:11.110081 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 03:07:11 crc kubenswrapper[4922]: E1122 03:07:11.330143 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:1988aaf9cd245150cda123aaaa21718ccb552c47f1623b7d68804f13c47f2c6a\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6b56b8849f-m5xsr" podUID="0bcf2061-04d3-4819-b07e-0eaaf4bb6287" Nov 22 03:07:11 crc kubenswrapper[4922]: I1122 03:07:11.337729 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6b56b8849f-m5xsr" 
event={"ID":"0bcf2061-04d3-4819-b07e-0eaaf4bb6287","Type":"ContainerStarted","Data":"68bd0a1a10a61f5ec4c3cc9e86cc021c9451184b35e22e321b31efadd6b0ceb5"} Nov 22 03:07:11 crc kubenswrapper[4922]: I1122 03:07:11.337857 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ldtxd" event={"ID":"d3748cc7-fc66-4422-b941-8ac33d7a1ac0","Type":"ContainerStarted","Data":"dc81d9981ab0261db7c43c5b789ab323897880d8ee51009c0712f010fff1440f"} Nov 22 03:07:11 crc kubenswrapper[4922]: I1122 03:07:11.357861 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-vkmzq" event={"ID":"301d0455-5622-4810-847e-b354cf6f9c00","Type":"ContainerStarted","Data":"b3d1476f87351bfb5e90054836cb0b8eb49757a5ef8904099bc1e4e566fd7964"} Nov 22 03:07:11 crc kubenswrapper[4922]: I1122 03:07:11.382138 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-h7jpp" event={"ID":"459b1df7-6ed9-4ef4-bb71-aa7e82001d5a","Type":"ContainerStarted","Data":"97c21d709149d8f75b6153bdd9b844916309f1d7208ff3c649a071fe4402ee83"} Nov 22 03:07:11 crc kubenswrapper[4922]: I1122 03:07:11.406435 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-54485f899-vg7qw" event={"ID":"0f9c4cd6-8ab3-4895-ab12-74dce3828cf8","Type":"ContainerStarted","Data":"5f99c40878077afc1675c7fc327bf909ca50761c855b4ed7f355e3880d5c0263"} Nov 22 03:07:11 crc kubenswrapper[4922]: I1122 03:07:11.417441 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-6c55d8d69b-twhcc" event={"ID":"ec6a579d-cf65-4b02-a891-ca17161e6585","Type":"ContainerStarted","Data":"274ef04d78471f4ce3fa8f658afda4ca344b59659965d74fa01113fff400ece9"} Nov 22 03:07:11 crc kubenswrapper[4922]: I1122 03:07:11.426580 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6788cc6d75-dkfth" event={"ID":"5db39a57-6021-466f-84e0-1fc2e8cf0da9","Type":"ContainerStarted","Data":"a558fe8b09a74d76d806a8fe074c9d58ae58b762e76dba56f70a5c14c861cde4"} Nov 22 03:07:11 crc kubenswrapper[4922]: I1122 03:07:11.434919 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-6bd966bbd4-g9fbb" event={"ID":"1cc9e6ed-fd1a-4280-8867-c8fbd326ca14","Type":"ContainerStarted","Data":"51f9fea6be7de24fa2b1fc66eeb353e018d6a1f51b8552283d8bd5c575b88425"} Nov 22 03:07:11 crc kubenswrapper[4922]: I1122 03:07:11.436323 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5bfbbb859d-4lfxl" event={"ID":"181257ea-c4b9-4370-80b5-7f52ed557c33","Type":"ContainerStarted","Data":"8430ece6ba78ae87d6c4799ba7968715865c76ca6260a2ccdf30886b619aa255"} Nov 22 03:07:11 crc kubenswrapper[4922]: I1122 03:07:11.447020 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-646fd589f9-f7bz2" event={"ID":"ddb44566-f024-43bd-8bc8-7b497606baa7","Type":"ContainerStarted","Data":"0d965dd5b7670b39cc3170bdde279c0fdef7ddb8142828a8448a1626c02e1680"} Nov 22 03:07:11 crc kubenswrapper[4922]: I1122 03:07:11.489220 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-698d6fd7d6-rxzs6" 
event={"ID":"3e52f719-cfcb-48d8-a83f-1bcddb08e6bd","Type":"ContainerStarted","Data":"3c99e53124ffb3f257ac6c4c112167cb2457748d9f544a05b995cbf490da3022"} Nov 22 03:07:12 crc kubenswrapper[4922]: I1122 03:07:12.499507 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7d6f5d799-5szbr" event={"ID":"4bc61f3e-c538-4a90-84da-7cd4760621f1","Type":"ContainerStarted","Data":"9f3e4a0ddccab5ba33fd3e9d12bb9dafa8c13ff25bd7231eb799d84f49133f5c"} Nov 22 03:07:12 crc kubenswrapper[4922]: I1122 03:07:12.501917 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79d658b66d-2ldtk" event={"ID":"cf5db45c-dbc8-4774-ad28-dc4aaa24b9fd","Type":"ContainerStarted","Data":"74dd5ce0c7a2b00816fbedcd28fb5cb0a7d2901c41aa24bff9f0e23390c18a38"} Nov 22 03:07:12 crc kubenswrapper[4922]: I1122 03:07:12.503513 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-7d5d9fd47f-v6hmr" event={"ID":"8262c40a-af33-42fb-9347-e5d84f97a20d","Type":"ContainerStarted","Data":"cd5539482a809443fc421a9e41923e4ea4ea18f34321b4801dbfdf9bb220c491"} Nov 22 03:07:12 crc kubenswrapper[4922]: I1122 03:07:12.506160 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6b6c55ffd5-nlwm2" event={"ID":"c4755f19-1e55-41bb-be1e-b4b868c48cc1","Type":"ContainerStarted","Data":"3aa66f1461c7b0b9196b14b40597f426f299f0476e89b82c4efb5c74d8594a25"} Nov 22 03:07:12 crc kubenswrapper[4922]: I1122 03:07:12.510674 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-864d88ccf8-vc6p7" event={"ID":"8b206938-2e76-40b1-b39c-ff333430e8f6","Type":"ContainerStarted","Data":"829a9c8a374c58a8a7813f32320a20b6f7196c5456bde146b6458dd7ba8dfde6"} Nov 22 03:07:12 crc kubenswrapper[4922]: E1122 03:07:12.514323 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:1988aaf9cd245150cda123aaaa21718ccb552c47f1623b7d68804f13c47f2c6a\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6b56b8849f-m5xsr" podUID="0bcf2061-04d3-4819-b07e-0eaaf4bb6287" Nov 22 03:07:14 crc kubenswrapper[4922]: I1122 03:07:14.530805 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-54485f899-vg7qw" event={"ID":"0f9c4cd6-8ab3-4895-ab12-74dce3828cf8","Type":"ContainerStarted","Data":"04dfcd457fa2937d6bc226b68cb76f103e7d0b7e76a3ea2868d5dc96a7b699df"} Nov 22 03:07:15 crc kubenswrapper[4922]: I1122 03:07:15.602732 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5ftvr"] Nov 22 03:07:15 crc kubenswrapper[4922]: I1122 03:07:15.605752 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5ftvr" Nov 22 03:07:15 crc kubenswrapper[4922]: I1122 03:07:15.620033 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5ftvr"] Nov 22 03:07:15 crc kubenswrapper[4922]: I1122 03:07:15.695345 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmjjh\" (UniqueName: \"kubernetes.io/projected/ecb528f1-c8bc-443c-bc6a-67444a6d4dc9-kube-api-access-lmjjh\") pod \"certified-operators-5ftvr\" (UID: \"ecb528f1-c8bc-443c-bc6a-67444a6d4dc9\") " pod="openshift-marketplace/certified-operators-5ftvr" Nov 22 03:07:15 crc kubenswrapper[4922]: I1122 03:07:15.695409 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecb528f1-c8bc-443c-bc6a-67444a6d4dc9-utilities\") pod \"certified-operators-5ftvr\" (UID: \"ecb528f1-c8bc-443c-bc6a-67444a6d4dc9\") " pod="openshift-marketplace/certified-operators-5ftvr" Nov 22 03:07:15 crc kubenswrapper[4922]: I1122 03:07:15.695460 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecb528f1-c8bc-443c-bc6a-67444a6d4dc9-catalog-content\") pod \"certified-operators-5ftvr\" (UID: \"ecb528f1-c8bc-443c-bc6a-67444a6d4dc9\") " pod="openshift-marketplace/certified-operators-5ftvr" Nov 22 03:07:15 crc kubenswrapper[4922]: I1122 03:07:15.797095 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmjjh\" (UniqueName: \"kubernetes.io/projected/ecb528f1-c8bc-443c-bc6a-67444a6d4dc9-kube-api-access-lmjjh\") pod \"certified-operators-5ftvr\" (UID: \"ecb528f1-c8bc-443c-bc6a-67444a6d4dc9\") " pod="openshift-marketplace/certified-operators-5ftvr" Nov 22 03:07:15 crc kubenswrapper[4922]: I1122 03:07:15.797161 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecb528f1-c8bc-443c-bc6a-67444a6d4dc9-utilities\") pod \"certified-operators-5ftvr\" (UID: \"ecb528f1-c8bc-443c-bc6a-67444a6d4dc9\") " pod="openshift-marketplace/certified-operators-5ftvr" Nov 22 03:07:15 crc kubenswrapper[4922]: I1122 03:07:15.797188 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecb528f1-c8bc-443c-bc6a-67444a6d4dc9-catalog-content\") pod \"certified-operators-5ftvr\" (UID: \"ecb528f1-c8bc-443c-bc6a-67444a6d4dc9\") " pod="openshift-marketplace/certified-operators-5ftvr" Nov 22 03:07:15 crc kubenswrapper[4922]: I1122 03:07:15.797941 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecb528f1-c8bc-443c-bc6a-67444a6d4dc9-utilities\") pod \"certified-operators-5ftvr\" (UID: \"ecb528f1-c8bc-443c-bc6a-67444a6d4dc9\") " pod="openshift-marketplace/certified-operators-5ftvr" Nov 22 03:07:15 crc kubenswrapper[4922]: I1122 03:07:15.798009 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecb528f1-c8bc-443c-bc6a-67444a6d4dc9-catalog-content\") pod \"certified-operators-5ftvr\" (UID: \"ecb528f1-c8bc-443c-bc6a-67444a6d4dc9\") " pod="openshift-marketplace/certified-operators-5ftvr" Nov 22 03:07:15 crc kubenswrapper[4922]: I1122 03:07:15.836206 4922 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-lmjjh\" (UniqueName: \"kubernetes.io/projected/ecb528f1-c8bc-443c-bc6a-67444a6d4dc9-kube-api-access-lmjjh\") pod \"certified-operators-5ftvr\" (UID: \"ecb528f1-c8bc-443c-bc6a-67444a6d4dc9\") " pod="openshift-marketplace/certified-operators-5ftvr" Nov 22 03:07:15 crc kubenswrapper[4922]: I1122 03:07:15.934009 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5ftvr" Nov 22 03:07:16 crc kubenswrapper[4922]: I1122 03:07:16.431587 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5ftvr"] Nov 22 03:07:16 crc kubenswrapper[4922]: E1122 03:07:16.486369 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-864d88ccf8-vc6p7" podUID="8b206938-2e76-40b1-b39c-ff333430e8f6" Nov 22 03:07:16 crc kubenswrapper[4922]: I1122 03:07:16.585728 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-h7jpp" event={"ID":"459b1df7-6ed9-4ef4-bb71-aa7e82001d5a","Type":"ContainerStarted","Data":"772f395a677a61f9d4820a427ae06847d0b6b1f32b61695a8cb647832ead1d26"} Nov 22 03:07:16 crc kubenswrapper[4922]: I1122 03:07:16.587054 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-h7jpp" Nov 22 03:07:16 crc kubenswrapper[4922]: I1122 03:07:16.594296 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-h7jpp" Nov 22 03:07:16 crc kubenswrapper[4922]: I1122 03:07:16.595460 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-6c55d8d69b-twhcc" event={"ID":"ec6a579d-cf65-4b02-a891-ca17161e6585","Type":"ContainerStarted","Data":"ea551029fdd30229a82060513789d94e57a8df6d69a376cc3ab0e6537bf46ce3"} Nov 22 03:07:16 crc kubenswrapper[4922]: I1122 03:07:16.596254 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-6c55d8d69b-twhcc" Nov 22 03:07:16 crc kubenswrapper[4922]: I1122 03:07:16.610863 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-h7jpp" podStartSLOduration=8.138791199 podStartE2EDuration="21.610818989s" podCreationTimestamp="2025-11-22 03:06:55 +0000 UTC" firstStartedPulling="2025-11-22 03:06:56.790342579 +0000 UTC m=+852.828864471" lastFinishedPulling="2025-11-22 03:07:10.262370359 +0000 UTC m=+866.300892261" observedRunningTime="2025-11-22 03:07:16.608179755 +0000 UTC m=+872.646701647" watchObservedRunningTime="2025-11-22 03:07:16.610818989 +0000 UTC m=+872.649340871" Nov 22 03:07:16 crc kubenswrapper[4922]: I1122 03:07:16.613226 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-6c55d8d69b-twhcc" Nov 22 03:07:16 crc kubenswrapper[4922]: I1122 03:07:16.620037 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-7d5d9fd47f-v6hmr" 
event={"ID":"8262c40a-af33-42fb-9347-e5d84f97a20d","Type":"ContainerStarted","Data":"d8a466c2888388be8d123299068d10a95c842407ff2c3bf323848b4a79817269"} Nov 22 03:07:16 crc kubenswrapper[4922]: I1122 03:07:16.620314 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-7d5d9fd47f-v6hmr" Nov 22 03:07:16 crc kubenswrapper[4922]: I1122 03:07:16.623109 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-7d5d9fd47f-v6hmr" Nov 22 03:07:16 crc kubenswrapper[4922]: I1122 03:07:16.624193 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6b6c55ffd5-nlwm2" event={"ID":"c4755f19-1e55-41bb-be1e-b4b868c48cc1","Type":"ContainerStarted","Data":"9623c20dcea371d2c21aaf3f1446d365734f5194168a7dbb7387bcd8f8d3c513"} Nov 22 03:07:16 crc kubenswrapper[4922]: I1122 03:07:16.625071 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-6b6c55ffd5-nlwm2" Nov 22 03:07:16 crc kubenswrapper[4922]: I1122 03:07:16.639643 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-6c55d8d69b-twhcc" podStartSLOduration=9.060644531 podStartE2EDuration="22.639613491s" podCreationTimestamp="2025-11-22 03:06:54 +0000 UTC" firstStartedPulling="2025-11-22 03:06:56.597261978 +0000 UTC m=+852.635783870" lastFinishedPulling="2025-11-22 03:07:10.176230928 +0000 UTC m=+866.214752830" observedRunningTime="2025-11-22 03:07:16.634395196 +0000 UTC m=+872.672917088" watchObservedRunningTime="2025-11-22 03:07:16.639613491 +0000 UTC m=+872.678135383" Nov 22 03:07:16 crc kubenswrapper[4922]: I1122 03:07:16.642734 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-6b6c55ffd5-nlwm2" Nov 22 03:07:16 crc kubenswrapper[4922]: I1122 03:07:16.646199 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5ftvr" event={"ID":"ecb528f1-c8bc-443c-bc6a-67444a6d4dc9","Type":"ContainerStarted","Data":"6bf33224b8b934211faacdae34e729ba9cd0fda37d650a6f3eb072cd580cd63a"} Nov 22 03:07:16 crc kubenswrapper[4922]: I1122 03:07:16.677109 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-vkmzq" event={"ID":"301d0455-5622-4810-847e-b354cf6f9c00","Type":"ContainerStarted","Data":"12ac91b89132b36a3899f65cfd603ee57af476786ee1291f60bf9597cb058adf"} Nov 22 03:07:16 crc kubenswrapper[4922]: I1122 03:07:16.679172 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-vkmzq" Nov 22 03:07:16 crc kubenswrapper[4922]: I1122 03:07:16.685160 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-vkmzq" Nov 22 03:07:16 crc kubenswrapper[4922]: I1122 03:07:16.699205 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-6b6c55ffd5-nlwm2" podStartSLOduration=8.236440236 podStartE2EDuration="21.699163922s" podCreationTimestamp="2025-11-22 03:06:55 +0000 UTC" firstStartedPulling="2025-11-22 03:06:56.834985862 +0000 UTC m=+852.873507754" lastFinishedPulling="2025-11-22 
03:07:10.297709548 +0000 UTC m=+866.336231440" observedRunningTime="2025-11-22 03:07:16.685941215 +0000 UTC m=+872.724463117" watchObservedRunningTime="2025-11-22 03:07:16.699163922 +0000 UTC m=+872.737685814" Nov 22 03:07:16 crc kubenswrapper[4922]: I1122 03:07:16.708206 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-6bd966bbd4-g9fbb" event={"ID":"1cc9e6ed-fd1a-4280-8867-c8fbd326ca14","Type":"ContainerStarted","Data":"50ccc20f0feda73cd31e574812274314af27029849968b7e632bbc8469d8aa21"} Nov 22 03:07:16 crc kubenswrapper[4922]: I1122 03:07:16.708343 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-6bd966bbd4-g9fbb" Nov 22 03:07:16 crc kubenswrapper[4922]: I1122 03:07:16.719953 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-7d5d9fd47f-v6hmr" podStartSLOduration=9.052083315 podStartE2EDuration="22.719935922s" podCreationTimestamp="2025-11-22 03:06:54 +0000 UTC" firstStartedPulling="2025-11-22 03:06:56.609766729 +0000 UTC m=+852.648288621" lastFinishedPulling="2025-11-22 03:07:10.277619336 +0000 UTC m=+866.316141228" observedRunningTime="2025-11-22 03:07:16.707152394 +0000 UTC m=+872.745674286" watchObservedRunningTime="2025-11-22 03:07:16.719935922 +0000 UTC m=+872.758457814" Nov 22 03:07:16 crc kubenswrapper[4922]: I1122 03:07:16.722112 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-6bd966bbd4-g9fbb" Nov 22 03:07:16 crc kubenswrapper[4922]: I1122 03:07:16.734334 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5bfbbb859d-4lfxl" event={"ID":"181257ea-c4b9-4370-80b5-7f52ed557c33","Type":"ContainerStarted","Data":"348343b5925e9ca87c40ad1cc37d73cf312e384ca05968fc9f033e7664196e4e"} Nov 22 03:07:16 crc kubenswrapper[4922]: I1122 03:07:16.736469 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-5bfbbb859d-4lfxl" Nov 22 03:07:16 crc kubenswrapper[4922]: I1122 03:07:16.743933 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-5bfbbb859d-4lfxl" Nov 22 03:07:16 crc kubenswrapper[4922]: I1122 03:07:16.771160 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ldtxd" event={"ID":"d3748cc7-fc66-4422-b941-8ac33d7a1ac0","Type":"ContainerStarted","Data":"f957cae6b1fcf26b04ab8925f69df8f7321aa1a4d833e4182644d791aae56e4f"} Nov 22 03:07:16 crc kubenswrapper[4922]: I1122 03:07:16.777886 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-5bfbbb859d-4lfxl" podStartSLOduration=9.10230927 podStartE2EDuration="22.777856453s" podCreationTimestamp="2025-11-22 03:06:54 +0000 UTC" firstStartedPulling="2025-11-22 03:06:56.629991574 +0000 UTC m=+852.668513466" lastFinishedPulling="2025-11-22 03:07:10.305538757 +0000 UTC m=+866.344060649" observedRunningTime="2025-11-22 03:07:16.764246397 +0000 UTC m=+872.802768289" watchObservedRunningTime="2025-11-22 03:07:16.777856453 +0000 UTC m=+872.816378345" Nov 22 03:07:16 crc kubenswrapper[4922]: I1122 03:07:16.799107 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/glance-operator-controller-manager-6bd966bbd4-g9fbb" podStartSLOduration=9.167761674 podStartE2EDuration="22.799076664s" podCreationTimestamp="2025-11-22 03:06:54 +0000 UTC" firstStartedPulling="2025-11-22 03:06:56.630983548 +0000 UTC m=+852.669505430" lastFinishedPulling="2025-11-22 03:07:10.262298528 +0000 UTC m=+866.300820420" observedRunningTime="2025-11-22 03:07:16.793013638 +0000 UTC m=+872.831535520" watchObservedRunningTime="2025-11-22 03:07:16.799076664 +0000 UTC m=+872.837598556" Nov 22 03:07:16 crc kubenswrapper[4922]: I1122 03:07:16.829536 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-646fd589f9-f7bz2" event={"ID":"ddb44566-f024-43bd-8bc8-7b497606baa7","Type":"ContainerStarted","Data":"c73f0df21ff2fd6704acdec2fb959a2d279ef7f13bf9afe1d6992f5095c5ea13"} Nov 22 03:07:16 crc kubenswrapper[4922]: I1122 03:07:16.830262 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-646fd589f9-f7bz2" Nov 22 03:07:16 crc kubenswrapper[4922]: I1122 03:07:16.837686 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-646fd589f9-f7bz2" Nov 22 03:07:16 crc kubenswrapper[4922]: I1122 03:07:16.843087 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-vkmzq" podStartSLOduration=8.80866847 podStartE2EDuration="21.843066141s" podCreationTimestamp="2025-11-22 03:06:55 +0000 UTC" firstStartedPulling="2025-11-22 03:06:57.227919217 +0000 UTC m=+853.266441109" lastFinishedPulling="2025-11-22 03:07:10.262316888 +0000 UTC m=+866.300838780" observedRunningTime="2025-11-22 03:07:16.822775893 +0000 UTC m=+872.861297785" watchObservedRunningTime="2025-11-22 03:07:16.843066141 +0000 UTC m=+872.881588033" Nov 22 03:07:16 crc kubenswrapper[4922]: I1122 03:07:16.864705 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-698d6fd7d6-rxzs6" event={"ID":"3e52f719-cfcb-48d8-a83f-1bcddb08e6bd","Type":"ContainerStarted","Data":"bc460e6d9cccab3e0748cf5568e9650163af8545ece6eef674127e3906db5123"} Nov 22 03:07:16 crc kubenswrapper[4922]: I1122 03:07:16.866086 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-698d6fd7d6-rxzs6" Nov 22 03:07:16 crc kubenswrapper[4922]: I1122 03:07:16.871293 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-698d6fd7d6-rxzs6" Nov 22 03:07:16 crc kubenswrapper[4922]: I1122 03:07:16.876755 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7979c68bc7-7fxg5" event={"ID":"225f6b3a-93d7-46d9-99a1-d9787b4921fb","Type":"ContainerStarted","Data":"b9e504ac3cee79a63f14a25351287225cc77394fe0568d96c3ffbbdc90241d37"} Nov 22 03:07:16 crc kubenswrapper[4922]: I1122 03:07:16.884800 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6788cc6d75-dkfth" event={"ID":"5db39a57-6021-466f-84e0-1fc2e8cf0da9","Type":"ContainerStarted","Data":"b453ecf525cf708c72cbc074a1d2c3245c79c6bc39e374b82b056844538c9208"} Nov 22 03:07:16 crc kubenswrapper[4922]: I1122 03:07:16.884877 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/ironic-operator-controller-manager-54485f899-vg7qw" Nov 22 03:07:16 crc kubenswrapper[4922]: I1122 03:07:16.884893 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6788cc6d75-dkfth" Nov 22 03:07:16 crc kubenswrapper[4922]: I1122 03:07:16.890709 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-54485f899-vg7qw" Nov 22 03:07:16 crc kubenswrapper[4922]: E1122 03:07:16.890776 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.66:5001/openstack-k8s-operators/cinder-operator:db6ca023b9893fe1ac2b2c2a06357543da4ee150\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-864d88ccf8-vc6p7" podUID="8b206938-2e76-40b1-b39c-ff333430e8f6" Nov 22 03:07:16 crc kubenswrapper[4922]: I1122 03:07:16.896174 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6788cc6d75-dkfth" Nov 22 03:07:16 crc kubenswrapper[4922]: I1122 03:07:16.983872 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-698d6fd7d6-rxzs6" podStartSLOduration=9.045358553 podStartE2EDuration="22.983816784s" podCreationTimestamp="2025-11-22 03:06:54 +0000 UTC" firstStartedPulling="2025-11-22 03:06:56.339416851 +0000 UTC m=+852.377938743" lastFinishedPulling="2025-11-22 03:07:10.277875082 +0000 UTC m=+866.316396974" observedRunningTime="2025-11-22 03:07:16.949133791 +0000 UTC m=+872.987655683" watchObservedRunningTime="2025-11-22 03:07:16.983816784 +0000 UTC m=+873.022338676" Nov 22 03:07:16 crc kubenswrapper[4922]: I1122 03:07:16.993694 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6788cc6d75-dkfth" podStartSLOduration=8.794254408 podStartE2EDuration="22.993671481s" podCreationTimestamp="2025-11-22 03:06:54 +0000 UTC" firstStartedPulling="2025-11-22 03:06:55.975033623 +0000 UTC m=+852.013555525" lastFinishedPulling="2025-11-22 03:07:10.174450666 +0000 UTC m=+866.212972598" observedRunningTime="2025-11-22 03:07:16.979932961 +0000 UTC m=+873.018454853" watchObservedRunningTime="2025-11-22 03:07:16.993671481 +0000 UTC m=+873.032193373" Nov 22 03:07:17 crc kubenswrapper[4922]: I1122 03:07:17.045178 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-646fd589f9-f7bz2" podStartSLOduration=8.400213531 podStartE2EDuration="22.045151608s" podCreationTimestamp="2025-11-22 03:06:55 +0000 UTC" firstStartedPulling="2025-11-22 03:06:56.618602441 +0000 UTC m=+852.657124333" lastFinishedPulling="2025-11-22 03:07:10.263540518 +0000 UTC m=+866.302062410" observedRunningTime="2025-11-22 03:07:17.023773004 +0000 UTC m=+873.062294896" watchObservedRunningTime="2025-11-22 03:07:17.045151608 +0000 UTC m=+873.083673500" Nov 22 03:07:17 crc kubenswrapper[4922]: I1122 03:07:17.062514 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-54485f899-vg7qw" podStartSLOduration=9.74792381 podStartE2EDuration="23.062495526s" podCreationTimestamp="2025-11-22 03:06:54 +0000 UTC" firstStartedPulling="2025-11-22 03:06:56.858542848 +0000 UTC m=+852.897064740" 
lastFinishedPulling="2025-11-22 03:07:10.173114554 +0000 UTC m=+866.211636456" observedRunningTime="2025-11-22 03:07:17.053718424 +0000 UTC m=+873.092240316" watchObservedRunningTime="2025-11-22 03:07:17.062495526 +0000 UTC m=+873.101017418" Nov 22 03:07:17 crc kubenswrapper[4922]: I1122 03:07:17.891410 4922 generic.go:334] "Generic (PLEG): container finished" podID="d3748cc7-fc66-4422-b941-8ac33d7a1ac0" containerID="f957cae6b1fcf26b04ab8925f69df8f7321aa1a4d833e4182644d791aae56e4f" exitCode=0 Nov 22 03:07:17 crc kubenswrapper[4922]: I1122 03:07:17.891497 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ldtxd" event={"ID":"d3748cc7-fc66-4422-b941-8ac33d7a1ac0","Type":"ContainerDied","Data":"f957cae6b1fcf26b04ab8925f69df8f7321aa1a4d833e4182644d791aae56e4f"} Nov 22 03:07:19 crc kubenswrapper[4922]: I1122 03:07:19.917528 4922 generic.go:334] "Generic (PLEG): container finished" podID="ecb528f1-c8bc-443c-bc6a-67444a6d4dc9" containerID="7f02b4cbf9444fce94a5d19e3ba78f73c849b34d8a7805e570c0dfcb937b32dd" exitCode=0 Nov 22 03:07:19 crc kubenswrapper[4922]: I1122 03:07:19.917697 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5ftvr" event={"ID":"ecb528f1-c8bc-443c-bc6a-67444a6d4dc9","Type":"ContainerDied","Data":"7f02b4cbf9444fce94a5d19e3ba78f73c849b34d8a7805e570c0dfcb937b32dd"} Nov 22 03:07:19 crc kubenswrapper[4922]: I1122 03:07:19.921985 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79d658b66d-2ldtk" event={"ID":"cf5db45c-dbc8-4774-ad28-dc4aaa24b9fd","Type":"ContainerStarted","Data":"457fa2edcd9b8dd339129a6bf2ce6255eafb29f165728812f73812c3d9e94bb3"} Nov 22 03:07:19 crc kubenswrapper[4922]: I1122 03:07:19.922371 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-79d658b66d-2ldtk" Nov 22 03:07:19 crc kubenswrapper[4922]: I1122 03:07:19.930285 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-79d658b66d-2ldtk" Nov 22 03:07:19 crc kubenswrapper[4922]: I1122 03:07:19.939236 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ldtxd" event={"ID":"d3748cc7-fc66-4422-b941-8ac33d7a1ac0","Type":"ContainerStarted","Data":"7bd3e83e19836025108883495f80927bed0c80e289866e3710b4a94887d4aff4"} Nov 22 03:07:19 crc kubenswrapper[4922]: I1122 03:07:19.958866 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-77db6bf9c-hn2fj" event={"ID":"b5d110f6-5ffb-46bc-b263-e7142f463974","Type":"ContainerStarted","Data":"d9d720fc9eea20674638625f545708725e9fff407622f4a6821c342b8fcd653d"} Nov 22 03:07:19 crc kubenswrapper[4922]: I1122 03:07:19.959772 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-77db6bf9c-hn2fj" Nov 22 03:07:19 crc kubenswrapper[4922]: I1122 03:07:19.961623 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-79d658b66d-2ldtk" podStartSLOduration=11.568221137 podStartE2EDuration="24.961603737s" podCreationTimestamp="2025-11-22 03:06:55 +0000 UTC" firstStartedPulling="2025-11-22 03:06:56.869812369 +0000 UTC m=+852.908334261" lastFinishedPulling="2025-11-22 03:07:10.263194969 +0000 UTC m=+866.301716861" 
observedRunningTime="2025-11-22 03:07:19.958932313 +0000 UTC m=+875.997454225" watchObservedRunningTime="2025-11-22 03:07:19.961603737 +0000 UTC m=+876.000125629" Nov 22 03:07:19 crc kubenswrapper[4922]: I1122 03:07:19.973052 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7d6f5d799-5szbr" event={"ID":"4bc61f3e-c538-4a90-84da-7cd4760621f1","Type":"ContainerStarted","Data":"7d740a52d299cc5a82257ef1bce7b5786334536261ce90535afad459e4e690fa"} Nov 22 03:07:19 crc kubenswrapper[4922]: I1122 03:07:19.973979 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7d6f5d799-5szbr" Nov 22 03:07:19 crc kubenswrapper[4922]: I1122 03:07:19.976011 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7d6f5d799-5szbr" Nov 22 03:07:19 crc kubenswrapper[4922]: I1122 03:07:19.993268 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-cc9f5bc5c-h7svq" event={"ID":"e92bf172-fec5-4847-9ba1-9e3ddc58c7c3","Type":"ContainerStarted","Data":"28d9cd3b166e7eb14b719753724ecb361fcacf1b39e7814922fd34cc8d76c5b3"} Nov 22 03:07:19 crc kubenswrapper[4922]: I1122 03:07:19.994067 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-cc9f5bc5c-h7svq" Nov 22 03:07:20 crc kubenswrapper[4922]: I1122 03:07:19.998613 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7979c68bc7-7fxg5" event={"ID":"225f6b3a-93d7-46d9-99a1-d9787b4921fb","Type":"ContainerStarted","Data":"19181c67f6c1e9c920dafd3ec2864931fb2eee16a4762c0697af585910a0445a"} Nov 22 03:07:20 crc kubenswrapper[4922]: I1122 03:07:19.999270 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-7979c68bc7-7fxg5" Nov 22 03:07:20 crc kubenswrapper[4922]: I1122 03:07:20.020110 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-77db6bf9c-hn2fj" podStartSLOduration=2.7418753799999998 podStartE2EDuration="25.020093583s" podCreationTimestamp="2025-11-22 03:06:55 +0000 UTC" firstStartedPulling="2025-11-22 03:06:57.261494794 +0000 UTC m=+853.300016686" lastFinishedPulling="2025-11-22 03:07:19.539712997 +0000 UTC m=+875.578234889" observedRunningTime="2025-11-22 03:07:20.012384149 +0000 UTC m=+876.050906041" watchObservedRunningTime="2025-11-22 03:07:20.020093583 +0000 UTC m=+876.058615475" Nov 22 03:07:20 crc kubenswrapper[4922]: I1122 03:07:20.042724 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7d6f5d799-5szbr" podStartSLOduration=12.347500412 podStartE2EDuration="26.042707887s" podCreationTimestamp="2025-11-22 03:06:54 +0000 UTC" firstStartedPulling="2025-11-22 03:06:56.610320112 +0000 UTC m=+852.648842004" lastFinishedPulling="2025-11-22 03:07:10.305527587 +0000 UTC m=+866.344049479" observedRunningTime="2025-11-22 03:07:20.042216075 +0000 UTC m=+876.080737977" watchObservedRunningTime="2025-11-22 03:07:20.042707887 +0000 UTC m=+876.081229779" Nov 22 03:07:20 crc kubenswrapper[4922]: I1122 03:07:20.144972 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/octavia-operator-controller-manager-7979c68bc7-7fxg5" podStartSLOduration=11.701916439 podStartE2EDuration="25.144950724s" podCreationTimestamp="2025-11-22 03:06:55 +0000 UTC" firstStartedPulling="2025-11-22 03:06:56.834232573 +0000 UTC m=+852.872754465" lastFinishedPulling="2025-11-22 03:07:10.277266858 +0000 UTC m=+866.315788750" observedRunningTime="2025-11-22 03:07:20.143231143 +0000 UTC m=+876.181753025" watchObservedRunningTime="2025-11-22 03:07:20.144950724 +0000 UTC m=+876.183472616" Nov 22 03:07:20 crc kubenswrapper[4922]: I1122 03:07:20.147123 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-cc9f5bc5c-h7svq" podStartSLOduration=2.883301849 podStartE2EDuration="25.147115837s" podCreationTimestamp="2025-11-22 03:06:55 +0000 UTC" firstStartedPulling="2025-11-22 03:06:57.292245762 +0000 UTC m=+853.330767644" lastFinishedPulling="2025-11-22 03:07:19.55605974 +0000 UTC m=+875.594581632" observedRunningTime="2025-11-22 03:07:20.119444941 +0000 UTC m=+876.157966853" watchObservedRunningTime="2025-11-22 03:07:20.147115837 +0000 UTC m=+876.185637729" Nov 22 03:07:21 crc kubenswrapper[4922]: I1122 03:07:21.010885 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ldtxd" event={"ID":"d3748cc7-fc66-4422-b941-8ac33d7a1ac0","Type":"ContainerDied","Data":"7bd3e83e19836025108883495f80927bed0c80e289866e3710b4a94887d4aff4"} Nov 22 03:07:21 crc kubenswrapper[4922]: I1122 03:07:21.010789 4922 generic.go:334] "Generic (PLEG): container finished" podID="d3748cc7-fc66-4422-b941-8ac33d7a1ac0" containerID="7bd3e83e19836025108883495f80927bed0c80e289866e3710b4a94887d4aff4" exitCode=0 Nov 22 03:07:21 crc kubenswrapper[4922]: I1122 03:07:21.015670 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5ftvr" event={"ID":"ecb528f1-c8bc-443c-bc6a-67444a6d4dc9","Type":"ContainerStarted","Data":"b754b670de8c84991a8168920ca0c9c3b5562a66818f69a99363e6bda0e235f0"} Nov 22 03:07:21 crc kubenswrapper[4922]: I1122 03:07:21.021375 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-7979c68bc7-7fxg5" Nov 22 03:07:22 crc kubenswrapper[4922]: I1122 03:07:22.025283 4922 generic.go:334] "Generic (PLEG): container finished" podID="ecb528f1-c8bc-443c-bc6a-67444a6d4dc9" containerID="b754b670de8c84991a8168920ca0c9c3b5562a66818f69a99363e6bda0e235f0" exitCode=0 Nov 22 03:07:22 crc kubenswrapper[4922]: I1122 03:07:22.025394 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5ftvr" event={"ID":"ecb528f1-c8bc-443c-bc6a-67444a6d4dc9","Type":"ContainerDied","Data":"b754b670de8c84991a8168920ca0c9c3b5562a66818f69a99363e6bda0e235f0"} Nov 22 03:07:26 crc kubenswrapper[4922]: I1122 03:07:26.031985 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-cc9f5bc5c-h7svq" Nov 22 03:07:26 crc kubenswrapper[4922]: I1122 03:07:26.062928 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-77db6bf9c-hn2fj" Nov 22 03:07:26 crc kubenswrapper[4922]: I1122 03:07:26.092180 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ldtxd" 
event={"ID":"d3748cc7-fc66-4422-b941-8ac33d7a1ac0","Type":"ContainerStarted","Data":"ab85f9eedd0d05d5a5ac721422cd24cf9c8df61db55a4cba6bc2cb6bfc3c2a68"} Nov 22 03:07:26 crc kubenswrapper[4922]: I1122 03:07:26.094750 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-867d87977b-5fntg" event={"ID":"562a15b4-c659-490c-88a4-1db388e0224f","Type":"ContainerStarted","Data":"95c3b1b05b28b45c3f2bc373734e7ff7e12b83bb9b5f7ee96f0969062af42b3b"} Nov 22 03:07:26 crc kubenswrapper[4922]: I1122 03:07:26.094902 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-867d87977b-5fntg" Nov 22 03:07:26 crc kubenswrapper[4922]: I1122 03:07:26.097319 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5ftvr" event={"ID":"ecb528f1-c8bc-443c-bc6a-67444a6d4dc9","Type":"ContainerStarted","Data":"c46618467d8f88cecfdcea0a037674f1ccb5a49dd6539c4eb1f001e84ae8c725"} Nov 22 03:07:26 crc kubenswrapper[4922]: I1122 03:07:26.098815 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-cd74r" event={"ID":"261901bb-e399-412c-a57f-4fefa2a3bfc0","Type":"ContainerStarted","Data":"5e6cf814b58688c5e2a89f16ede1958aba61ee941cc27c49e2ec3dc5924fb88a"} Nov 22 03:07:26 crc kubenswrapper[4922]: I1122 03:07:26.100223 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-58487d9bf4-tg4ph" event={"ID":"ef5e108a-748a-47ab-b0fc-0a3e303a09ba","Type":"ContainerStarted","Data":"250cf469edb418309a3e399c64d84dc5729f8dc874fdd3a3dc62a6d50d3d090f"} Nov 22 03:07:26 crc kubenswrapper[4922]: I1122 03:07:26.100400 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-58487d9bf4-tg4ph" Nov 22 03:07:26 crc kubenswrapper[4922]: I1122 03:07:26.102946 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-dsmmt" event={"ID":"4134b0be-83c2-452c-a09e-6a699543d2c0","Type":"ContainerStarted","Data":"8efc25c2cf849ec626b812b82078467ce5b5fa0ee1a4a8fc4191d94e03df3099"} Nov 22 03:07:26 crc kubenswrapper[4922]: I1122 03:07:26.103108 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-dsmmt" Nov 22 03:07:26 crc kubenswrapper[4922]: I1122 03:07:26.105322 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6b56b8849f-m5xsr" event={"ID":"0bcf2061-04d3-4819-b07e-0eaaf4bb6287","Type":"ContainerStarted","Data":"6fae899e3fc1ba6b2f4bf8bb13a03fb47afa73ae134ae1480bfeb310c274aff0"} Nov 22 03:07:26 crc kubenswrapper[4922]: I1122 03:07:26.105461 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6b56b8849f-m5xsr" Nov 22 03:07:26 crc kubenswrapper[4922]: I1122 03:07:26.127796 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ldtxd" podStartSLOduration=16.618436998 podStartE2EDuration="23.127770505s" podCreationTimestamp="2025-11-22 03:07:03 +0000 UTC" firstStartedPulling="2025-11-22 03:07:18.229908894 +0000 UTC m=+874.268430786" lastFinishedPulling="2025-11-22 03:07:24.739242401 +0000 UTC 
m=+880.777764293" observedRunningTime="2025-11-22 03:07:26.12711555 +0000 UTC m=+882.165637442" watchObservedRunningTime="2025-11-22 03:07:26.127770505 +0000 UTC m=+882.166292397" Nov 22 03:07:26 crc kubenswrapper[4922]: I1122 03:07:26.163676 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-58487d9bf4-tg4ph" podStartSLOduration=3.700615462 podStartE2EDuration="31.163650137s" podCreationTimestamp="2025-11-22 03:06:55 +0000 UTC" firstStartedPulling="2025-11-22 03:06:57.263997443 +0000 UTC m=+853.302519335" lastFinishedPulling="2025-11-22 03:07:24.727032118 +0000 UTC m=+880.765554010" observedRunningTime="2025-11-22 03:07:26.158625496 +0000 UTC m=+882.197147398" watchObservedRunningTime="2025-11-22 03:07:26.163650137 +0000 UTC m=+882.202172029" Nov 22 03:07:26 crc kubenswrapper[4922]: I1122 03:07:26.178231 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-867d87977b-5fntg" podStartSLOduration=3.731239629 podStartE2EDuration="31.178204067s" podCreationTimestamp="2025-11-22 03:06:55 +0000 UTC" firstStartedPulling="2025-11-22 03:06:57.264298901 +0000 UTC m=+853.302820793" lastFinishedPulling="2025-11-22 03:07:24.711263339 +0000 UTC m=+880.749785231" observedRunningTime="2025-11-22 03:07:26.177517181 +0000 UTC m=+882.216039083" watchObservedRunningTime="2025-11-22 03:07:26.178204067 +0000 UTC m=+882.216725959" Nov 22 03:07:26 crc kubenswrapper[4922]: I1122 03:07:26.246738 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5ftvr" podStartSLOduration=6.419250043 podStartE2EDuration="11.246710014s" podCreationTimestamp="2025-11-22 03:07:15 +0000 UTC" firstStartedPulling="2025-11-22 03:07:19.919411363 +0000 UTC m=+875.957933255" lastFinishedPulling="2025-11-22 03:07:24.746871334 +0000 UTC m=+880.785393226" observedRunningTime="2025-11-22 03:07:26.245706109 +0000 UTC m=+882.284228011" watchObservedRunningTime="2025-11-22 03:07:26.246710014 +0000 UTC m=+882.285231916" Nov 22 03:07:26 crc kubenswrapper[4922]: I1122 03:07:26.248237 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-cd74r" podStartSLOduration=3.750363298 podStartE2EDuration="31.24822258s" podCreationTimestamp="2025-11-22 03:06:55 +0000 UTC" firstStartedPulling="2025-11-22 03:06:57.247361863 +0000 UTC m=+853.285883745" lastFinishedPulling="2025-11-22 03:07:24.745221135 +0000 UTC m=+880.783743027" observedRunningTime="2025-11-22 03:07:26.206764324 +0000 UTC m=+882.245286226" watchObservedRunningTime="2025-11-22 03:07:26.24822258 +0000 UTC m=+882.286744472" Nov 22 03:07:26 crc kubenswrapper[4922]: I1122 03:07:26.327977 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6b56b8849f-m5xsr" podStartSLOduration=3.795947715 podStartE2EDuration="31.327950846s" podCreationTimestamp="2025-11-22 03:06:55 +0000 UTC" firstStartedPulling="2025-11-22 03:06:57.2072373 +0000 UTC m=+853.245759192" lastFinishedPulling="2025-11-22 03:07:24.739240431 +0000 UTC m=+880.777762323" observedRunningTime="2025-11-22 03:07:26.289608475 +0000 UTC m=+882.328130467" watchObservedRunningTime="2025-11-22 03:07:26.327950846 +0000 UTC m=+882.366472738" Nov 22 03:07:29 crc kubenswrapper[4922]: I1122 03:07:29.304826 4922 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Nov 22 03:07:29 crc kubenswrapper[4922]: I1122 03:07:29.328736 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-dsmmt" podStartSLOduration=6.861640972 podStartE2EDuration="34.328716032s" podCreationTimestamp="2025-11-22 03:06:55 +0000 UTC" firstStartedPulling="2025-11-22 03:06:57.2629925 +0000 UTC m=+853.301514392" lastFinishedPulling="2025-11-22 03:07:24.73006756 +0000 UTC m=+880.768589452" observedRunningTime="2025-11-22 03:07:26.333020129 +0000 UTC m=+882.371542021" watchObservedRunningTime="2025-11-22 03:07:29.328716032 +0000 UTC m=+885.367237934" Nov 22 03:07:34 crc kubenswrapper[4922]: I1122 03:07:34.191342 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ldtxd" Nov 22 03:07:34 crc kubenswrapper[4922]: I1122 03:07:34.192414 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ldtxd" Nov 22 03:07:34 crc kubenswrapper[4922]: I1122 03:07:34.277833 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ldtxd" Nov 22 03:07:35 crc kubenswrapper[4922]: I1122 03:07:35.221792 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ldtxd" Nov 22 03:07:35 crc kubenswrapper[4922]: I1122 03:07:35.528412 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ldtxd"] Nov 22 03:07:35 crc kubenswrapper[4922]: I1122 03:07:35.934193 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5ftvr" Nov 22 03:07:35 crc kubenswrapper[4922]: I1122 03:07:35.934270 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5ftvr" Nov 22 03:07:35 crc kubenswrapper[4922]: I1122 03:07:35.988389 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-867d87977b-5fntg" Nov 22 03:07:36 crc kubenswrapper[4922]: I1122 03:07:36.004287 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5ftvr" Nov 22 03:07:36 crc kubenswrapper[4922]: I1122 03:07:36.048925 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-58487d9bf4-tg4ph" Nov 22 03:07:36 crc kubenswrapper[4922]: I1122 03:07:36.126956 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6b56b8849f-m5xsr" Nov 22 03:07:36 crc kubenswrapper[4922]: I1122 03:07:36.236862 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5ftvr" Nov 22 03:07:36 crc kubenswrapper[4922]: I1122 03:07:36.418395 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-dsmmt" Nov 22 03:07:37 crc kubenswrapper[4922]: I1122 03:07:37.212304 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ldtxd" podUID="d3748cc7-fc66-4422-b941-8ac33d7a1ac0" containerName="registry-server" 
containerID="cri-o://ab85f9eedd0d05d5a5ac721422cd24cf9c8df61db55a4cba6bc2cb6bfc3c2a68" gracePeriod=2 Nov 22 03:07:37 crc kubenswrapper[4922]: I1122 03:07:37.923990 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5ftvr"] Nov 22 03:07:38 crc kubenswrapper[4922]: I1122 03:07:38.221560 4922 generic.go:334] "Generic (PLEG): container finished" podID="d3748cc7-fc66-4422-b941-8ac33d7a1ac0" containerID="ab85f9eedd0d05d5a5ac721422cd24cf9c8df61db55a4cba6bc2cb6bfc3c2a68" exitCode=0 Nov 22 03:07:38 crc kubenswrapper[4922]: I1122 03:07:38.221772 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5ftvr" podUID="ecb528f1-c8bc-443c-bc6a-67444a6d4dc9" containerName="registry-server" containerID="cri-o://c46618467d8f88cecfdcea0a037674f1ccb5a49dd6539c4eb1f001e84ae8c725" gracePeriod=2 Nov 22 03:07:38 crc kubenswrapper[4922]: I1122 03:07:38.222018 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ldtxd" event={"ID":"d3748cc7-fc66-4422-b941-8ac33d7a1ac0","Type":"ContainerDied","Data":"ab85f9eedd0d05d5a5ac721422cd24cf9c8df61db55a4cba6bc2cb6bfc3c2a68"} Nov 22 03:07:39 crc kubenswrapper[4922]: I1122 03:07:39.027442 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ldtxd" Nov 22 03:07:39 crc kubenswrapper[4922]: I1122 03:07:39.030594 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5ftvr" Nov 22 03:07:39 crc kubenswrapper[4922]: I1122 03:07:39.127408 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3748cc7-fc66-4422-b941-8ac33d7a1ac0-utilities\") pod \"d3748cc7-fc66-4422-b941-8ac33d7a1ac0\" (UID: \"d3748cc7-fc66-4422-b941-8ac33d7a1ac0\") " Nov 22 03:07:39 crc kubenswrapper[4922]: I1122 03:07:39.127733 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3748cc7-fc66-4422-b941-8ac33d7a1ac0-catalog-content\") pod \"d3748cc7-fc66-4422-b941-8ac33d7a1ac0\" (UID: \"d3748cc7-fc66-4422-b941-8ac33d7a1ac0\") " Nov 22 03:07:39 crc kubenswrapper[4922]: I1122 03:07:39.127814 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pzzq\" (UniqueName: \"kubernetes.io/projected/d3748cc7-fc66-4422-b941-8ac33d7a1ac0-kube-api-access-4pzzq\") pod \"d3748cc7-fc66-4422-b941-8ac33d7a1ac0\" (UID: \"d3748cc7-fc66-4422-b941-8ac33d7a1ac0\") " Nov 22 03:07:39 crc kubenswrapper[4922]: I1122 03:07:39.127920 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecb528f1-c8bc-443c-bc6a-67444a6d4dc9-utilities\") pod \"ecb528f1-c8bc-443c-bc6a-67444a6d4dc9\" (UID: \"ecb528f1-c8bc-443c-bc6a-67444a6d4dc9\") " Nov 22 03:07:39 crc kubenswrapper[4922]: I1122 03:07:39.127999 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmjjh\" (UniqueName: \"kubernetes.io/projected/ecb528f1-c8bc-443c-bc6a-67444a6d4dc9-kube-api-access-lmjjh\") pod \"ecb528f1-c8bc-443c-bc6a-67444a6d4dc9\" (UID: \"ecb528f1-c8bc-443c-bc6a-67444a6d4dc9\") " Nov 22 03:07:39 crc kubenswrapper[4922]: I1122 03:07:39.128135 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecb528f1-c8bc-443c-bc6a-67444a6d4dc9-catalog-content\") pod \"ecb528f1-c8bc-443c-bc6a-67444a6d4dc9\" (UID: \"ecb528f1-c8bc-443c-bc6a-67444a6d4dc9\") " Nov 22 03:07:39 crc kubenswrapper[4922]: I1122 03:07:39.128435 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3748cc7-fc66-4422-b941-8ac33d7a1ac0-utilities" (OuterVolumeSpecName: "utilities") pod "d3748cc7-fc66-4422-b941-8ac33d7a1ac0" (UID: "d3748cc7-fc66-4422-b941-8ac33d7a1ac0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:07:39 crc kubenswrapper[4922]: I1122 03:07:39.129169 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecb528f1-c8bc-443c-bc6a-67444a6d4dc9-utilities" (OuterVolumeSpecName: "utilities") pod "ecb528f1-c8bc-443c-bc6a-67444a6d4dc9" (UID: "ecb528f1-c8bc-443c-bc6a-67444a6d4dc9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:07:39 crc kubenswrapper[4922]: I1122 03:07:39.137073 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecb528f1-c8bc-443c-bc6a-67444a6d4dc9-kube-api-access-lmjjh" (OuterVolumeSpecName: "kube-api-access-lmjjh") pod "ecb528f1-c8bc-443c-bc6a-67444a6d4dc9" (UID: "ecb528f1-c8bc-443c-bc6a-67444a6d4dc9"). InnerVolumeSpecName "kube-api-access-lmjjh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:07:39 crc kubenswrapper[4922]: I1122 03:07:39.140982 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3748cc7-fc66-4422-b941-8ac33d7a1ac0-kube-api-access-4pzzq" (OuterVolumeSpecName: "kube-api-access-4pzzq") pod "d3748cc7-fc66-4422-b941-8ac33d7a1ac0" (UID: "d3748cc7-fc66-4422-b941-8ac33d7a1ac0"). InnerVolumeSpecName "kube-api-access-4pzzq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:07:39 crc kubenswrapper[4922]: I1122 03:07:39.215082 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecb528f1-c8bc-443c-bc6a-67444a6d4dc9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ecb528f1-c8bc-443c-bc6a-67444a6d4dc9" (UID: "ecb528f1-c8bc-443c-bc6a-67444a6d4dc9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:07:39 crc kubenswrapper[4922]: I1122 03:07:39.234588 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecb528f1-c8bc-443c-bc6a-67444a6d4dc9-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 03:07:39 crc kubenswrapper[4922]: I1122 03:07:39.234806 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3748cc7-fc66-4422-b941-8ac33d7a1ac0-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 03:07:39 crc kubenswrapper[4922]: I1122 03:07:39.234877 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pzzq\" (UniqueName: \"kubernetes.io/projected/d3748cc7-fc66-4422-b941-8ac33d7a1ac0-kube-api-access-4pzzq\") on node \"crc\" DevicePath \"\"" Nov 22 03:07:39 crc kubenswrapper[4922]: I1122 03:07:39.234966 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecb528f1-c8bc-443c-bc6a-67444a6d4dc9-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 03:07:39 crc kubenswrapper[4922]: I1122 03:07:39.235022 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmjjh\" (UniqueName: \"kubernetes.io/projected/ecb528f1-c8bc-443c-bc6a-67444a6d4dc9-kube-api-access-lmjjh\") on node \"crc\" DevicePath \"\"" Nov 22 03:07:39 crc kubenswrapper[4922]: I1122 03:07:39.242792 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ldtxd" event={"ID":"d3748cc7-fc66-4422-b941-8ac33d7a1ac0","Type":"ContainerDied","Data":"dc81d9981ab0261db7c43c5b789ab323897880d8ee51009c0712f010fff1440f"} Nov 22 03:07:39 crc kubenswrapper[4922]: I1122 03:07:39.242968 4922 scope.go:117] "RemoveContainer" containerID="ab85f9eedd0d05d5a5ac721422cd24cf9c8df61db55a4cba6bc2cb6bfc3c2a68" Nov 22 03:07:39 crc kubenswrapper[4922]: I1122 03:07:39.242836 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ldtxd" Nov 22 03:07:39 crc kubenswrapper[4922]: I1122 03:07:39.246704 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-864d88ccf8-vc6p7" event={"ID":"8b206938-2e76-40b1-b39c-ff333430e8f6","Type":"ContainerStarted","Data":"e88256644c4fac3674254fa01db694f3657a35a8dd56a29cdf5bbd0d9179df23"} Nov 22 03:07:39 crc kubenswrapper[4922]: I1122 03:07:39.247417 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-864d88ccf8-vc6p7" Nov 22 03:07:39 crc kubenswrapper[4922]: I1122 03:07:39.249140 4922 generic.go:334] "Generic (PLEG): container finished" podID="ecb528f1-c8bc-443c-bc6a-67444a6d4dc9" containerID="c46618467d8f88cecfdcea0a037674f1ccb5a49dd6539c4eb1f001e84ae8c725" exitCode=0 Nov 22 03:07:39 crc kubenswrapper[4922]: I1122 03:07:39.249219 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5ftvr" event={"ID":"ecb528f1-c8bc-443c-bc6a-67444a6d4dc9","Type":"ContainerDied","Data":"c46618467d8f88cecfdcea0a037674f1ccb5a49dd6539c4eb1f001e84ae8c725"} Nov 22 03:07:39 crc kubenswrapper[4922]: I1122 03:07:39.249291 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5ftvr" event={"ID":"ecb528f1-c8bc-443c-bc6a-67444a6d4dc9","Type":"ContainerDied","Data":"6bf33224b8b934211faacdae34e729ba9cd0fda37d650a6f3eb072cd580cd63a"} Nov 22 03:07:39 crc kubenswrapper[4922]: I1122 03:07:39.249385 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5ftvr" Nov 22 03:07:39 crc kubenswrapper[4922]: I1122 03:07:39.274818 4922 scope.go:117] "RemoveContainer" containerID="7bd3e83e19836025108883495f80927bed0c80e289866e3710b4a94887d4aff4" Nov 22 03:07:39 crc kubenswrapper[4922]: I1122 03:07:39.277431 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-864d88ccf8-vc6p7" podStartSLOduration=2.771979148 podStartE2EDuration="45.277374286s" podCreationTimestamp="2025-11-22 03:06:54 +0000 UTC" firstStartedPulling="2025-11-22 03:06:56.102629549 +0000 UTC m=+852.141151441" lastFinishedPulling="2025-11-22 03:07:38.608024647 +0000 UTC m=+894.646546579" observedRunningTime="2025-11-22 03:07:39.271690629 +0000 UTC m=+895.310212521" watchObservedRunningTime="2025-11-22 03:07:39.277374286 +0000 UTC m=+895.315896178" Nov 22 03:07:39 crc kubenswrapper[4922]: I1122 03:07:39.288742 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5ftvr"] Nov 22 03:07:39 crc kubenswrapper[4922]: I1122 03:07:39.292385 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3748cc7-fc66-4422-b941-8ac33d7a1ac0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d3748cc7-fc66-4422-b941-8ac33d7a1ac0" (UID: "d3748cc7-fc66-4422-b941-8ac33d7a1ac0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:07:39 crc kubenswrapper[4922]: I1122 03:07:39.293004 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5ftvr"] Nov 22 03:07:39 crc kubenswrapper[4922]: I1122 03:07:39.306699 4922 scope.go:117] "RemoveContainer" containerID="f957cae6b1fcf26b04ab8925f69df8f7321aa1a4d833e4182644d791aae56e4f" Nov 22 03:07:39 crc kubenswrapper[4922]: I1122 03:07:39.319504 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecb528f1-c8bc-443c-bc6a-67444a6d4dc9" path="/var/lib/kubelet/pods/ecb528f1-c8bc-443c-bc6a-67444a6d4dc9/volumes" Nov 22 03:07:39 crc kubenswrapper[4922]: I1122 03:07:39.327651 4922 scope.go:117] "RemoveContainer" containerID="c46618467d8f88cecfdcea0a037674f1ccb5a49dd6539c4eb1f001e84ae8c725" Nov 22 03:07:39 crc kubenswrapper[4922]: I1122 03:07:39.336050 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3748cc7-fc66-4422-b941-8ac33d7a1ac0-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 03:07:39 crc kubenswrapper[4922]: I1122 03:07:39.343523 4922 scope.go:117] "RemoveContainer" containerID="b754b670de8c84991a8168920ca0c9c3b5562a66818f69a99363e6bda0e235f0" Nov 22 03:07:39 crc kubenswrapper[4922]: I1122 03:07:39.360291 4922 scope.go:117] "RemoveContainer" containerID="7f02b4cbf9444fce94a5d19e3ba78f73c849b34d8a7805e570c0dfcb937b32dd" Nov 22 03:07:39 crc kubenswrapper[4922]: I1122 03:07:39.372716 4922 scope.go:117] "RemoveContainer" containerID="c46618467d8f88cecfdcea0a037674f1ccb5a49dd6539c4eb1f001e84ae8c725" Nov 22 03:07:39 crc kubenswrapper[4922]: E1122 03:07:39.373037 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c46618467d8f88cecfdcea0a037674f1ccb5a49dd6539c4eb1f001e84ae8c725\": container with ID starting with c46618467d8f88cecfdcea0a037674f1ccb5a49dd6539c4eb1f001e84ae8c725 not found: ID does not exist" containerID="c46618467d8f88cecfdcea0a037674f1ccb5a49dd6539c4eb1f001e84ae8c725" Nov 22 03:07:39 crc kubenswrapper[4922]: I1122 03:07:39.373067 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c46618467d8f88cecfdcea0a037674f1ccb5a49dd6539c4eb1f001e84ae8c725"} err="failed to get container status \"c46618467d8f88cecfdcea0a037674f1ccb5a49dd6539c4eb1f001e84ae8c725\": rpc error: code = NotFound desc = could not find container \"c46618467d8f88cecfdcea0a037674f1ccb5a49dd6539c4eb1f001e84ae8c725\": container with ID starting with c46618467d8f88cecfdcea0a037674f1ccb5a49dd6539c4eb1f001e84ae8c725 not found: ID does not exist" Nov 22 03:07:39 crc kubenswrapper[4922]: I1122 03:07:39.373090 4922 scope.go:117] "RemoveContainer" containerID="b754b670de8c84991a8168920ca0c9c3b5562a66818f69a99363e6bda0e235f0" Nov 22 03:07:39 crc kubenswrapper[4922]: E1122 03:07:39.373398 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b754b670de8c84991a8168920ca0c9c3b5562a66818f69a99363e6bda0e235f0\": container with ID starting with b754b670de8c84991a8168920ca0c9c3b5562a66818f69a99363e6bda0e235f0 not found: ID does not exist" containerID="b754b670de8c84991a8168920ca0c9c3b5562a66818f69a99363e6bda0e235f0" Nov 22 03:07:39 crc kubenswrapper[4922]: I1122 03:07:39.373420 4922 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b754b670de8c84991a8168920ca0c9c3b5562a66818f69a99363e6bda0e235f0"} err="failed to get container status \"b754b670de8c84991a8168920ca0c9c3b5562a66818f69a99363e6bda0e235f0\": rpc error: code = NotFound desc = could not find container \"b754b670de8c84991a8168920ca0c9c3b5562a66818f69a99363e6bda0e235f0\": container with ID starting with b754b670de8c84991a8168920ca0c9c3b5562a66818f69a99363e6bda0e235f0 not found: ID does not exist" Nov 22 03:07:39 crc kubenswrapper[4922]: I1122 03:07:39.373432 4922 scope.go:117] "RemoveContainer" containerID="7f02b4cbf9444fce94a5d19e3ba78f73c849b34d8a7805e570c0dfcb937b32dd" Nov 22 03:07:39 crc kubenswrapper[4922]: E1122 03:07:39.373670 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f02b4cbf9444fce94a5d19e3ba78f73c849b34d8a7805e570c0dfcb937b32dd\": container with ID starting with 7f02b4cbf9444fce94a5d19e3ba78f73c849b34d8a7805e570c0dfcb937b32dd not found: ID does not exist" containerID="7f02b4cbf9444fce94a5d19e3ba78f73c849b34d8a7805e570c0dfcb937b32dd" Nov 22 03:07:39 crc kubenswrapper[4922]: I1122 03:07:39.373698 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f02b4cbf9444fce94a5d19e3ba78f73c849b34d8a7805e570c0dfcb937b32dd"} err="failed to get container status \"7f02b4cbf9444fce94a5d19e3ba78f73c849b34d8a7805e570c0dfcb937b32dd\": rpc error: code = NotFound desc = could not find container \"7f02b4cbf9444fce94a5d19e3ba78f73c849b34d8a7805e570c0dfcb937b32dd\": container with ID starting with 7f02b4cbf9444fce94a5d19e3ba78f73c849b34d8a7805e570c0dfcb937b32dd not found: ID does not exist" Nov 22 03:07:39 crc kubenswrapper[4922]: I1122 03:07:39.562805 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ldtxd"] Nov 22 03:07:39 crc kubenswrapper[4922]: I1122 03:07:39.576503 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ldtxd"] Nov 22 03:07:41 crc kubenswrapper[4922]: I1122 03:07:41.110344 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 03:07:41 crc kubenswrapper[4922]: I1122 03:07:41.111766 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 03:07:41 crc kubenswrapper[4922]: I1122 03:07:41.311739 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3748cc7-fc66-4422-b941-8ac33d7a1ac0" path="/var/lib/kubelet/pods/d3748cc7-fc66-4422-b941-8ac33d7a1ac0/volumes" Nov 22 03:07:45 crc kubenswrapper[4922]: I1122 03:07:45.209515 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-864d88ccf8-vc6p7" Nov 22 03:08:04 crc kubenswrapper[4922]: I1122 03:08:04.930316 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-krhjv"] Nov 22 03:08:04 crc kubenswrapper[4922]: E1122 03:08:04.931091 4922 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ecb528f1-c8bc-443c-bc6a-67444a6d4dc9" containerName="extract-utilities" Nov 22 03:08:04 crc kubenswrapper[4922]: I1122 03:08:04.931102 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecb528f1-c8bc-443c-bc6a-67444a6d4dc9" containerName="extract-utilities" Nov 22 03:08:04 crc kubenswrapper[4922]: E1122 03:08:04.931135 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecb528f1-c8bc-443c-bc6a-67444a6d4dc9" containerName="extract-content" Nov 22 03:08:04 crc kubenswrapper[4922]: I1122 03:08:04.931141 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecb528f1-c8bc-443c-bc6a-67444a6d4dc9" containerName="extract-content" Nov 22 03:08:04 crc kubenswrapper[4922]: E1122 03:08:04.931157 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecb528f1-c8bc-443c-bc6a-67444a6d4dc9" containerName="registry-server" Nov 22 03:08:04 crc kubenswrapper[4922]: I1122 03:08:04.931163 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecb528f1-c8bc-443c-bc6a-67444a6d4dc9" containerName="registry-server" Nov 22 03:08:04 crc kubenswrapper[4922]: E1122 03:08:04.931183 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3748cc7-fc66-4422-b941-8ac33d7a1ac0" containerName="extract-utilities" Nov 22 03:08:04 crc kubenswrapper[4922]: I1122 03:08:04.931188 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3748cc7-fc66-4422-b941-8ac33d7a1ac0" containerName="extract-utilities" Nov 22 03:08:04 crc kubenswrapper[4922]: E1122 03:08:04.931196 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3748cc7-fc66-4422-b941-8ac33d7a1ac0" containerName="registry-server" Nov 22 03:08:04 crc kubenswrapper[4922]: I1122 03:08:04.931202 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3748cc7-fc66-4422-b941-8ac33d7a1ac0" containerName="registry-server" Nov 22 03:08:04 crc kubenswrapper[4922]: E1122 03:08:04.931212 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3748cc7-fc66-4422-b941-8ac33d7a1ac0" containerName="extract-content" Nov 22 03:08:04 crc kubenswrapper[4922]: I1122 03:08:04.931218 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3748cc7-fc66-4422-b941-8ac33d7a1ac0" containerName="extract-content" Nov 22 03:08:04 crc kubenswrapper[4922]: I1122 03:08:04.931351 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3748cc7-fc66-4422-b941-8ac33d7a1ac0" containerName="registry-server" Nov 22 03:08:04 crc kubenswrapper[4922]: I1122 03:08:04.931366 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecb528f1-c8bc-443c-bc6a-67444a6d4dc9" containerName="registry-server" Nov 22 03:08:04 crc kubenswrapper[4922]: I1122 03:08:04.932093 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-krhjv" Nov 22 03:08:04 crc kubenswrapper[4922]: I1122 03:08:04.936315 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Nov 22 03:08:04 crc kubenswrapper[4922]: I1122 03:08:04.936375 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-k8wdc" Nov 22 03:08:04 crc kubenswrapper[4922]: I1122 03:08:04.936566 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Nov 22 03:08:04 crc kubenswrapper[4922]: I1122 03:08:04.936733 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Nov 22 03:08:04 crc kubenswrapper[4922]: I1122 03:08:04.950732 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-krhjv"] Nov 22 03:08:05 crc kubenswrapper[4922]: I1122 03:08:05.003662 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-j2gpf"] Nov 22 03:08:05 crc kubenswrapper[4922]: I1122 03:08:05.004945 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-j2gpf" Nov 22 03:08:05 crc kubenswrapper[4922]: I1122 03:08:05.008202 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Nov 22 03:08:05 crc kubenswrapper[4922]: I1122 03:08:05.018201 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-j2gpf"] Nov 22 03:08:05 crc kubenswrapper[4922]: I1122 03:08:05.046784 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85q8v\" (UniqueName: \"kubernetes.io/projected/332aa71d-bbad-4127-982b-6416679db2a5-kube-api-access-85q8v\") pod \"dnsmasq-dns-675f4bcbfc-krhjv\" (UID: \"332aa71d-bbad-4127-982b-6416679db2a5\") " pod="openstack/dnsmasq-dns-675f4bcbfc-krhjv" Nov 22 03:08:05 crc kubenswrapper[4922]: I1122 03:08:05.046890 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rffzz\" (UniqueName: \"kubernetes.io/projected/baf1fe2f-594d-4879-8fa5-3eb4450a8ff0-kube-api-access-rffzz\") pod \"dnsmasq-dns-78dd6ddcc-j2gpf\" (UID: \"baf1fe2f-594d-4879-8fa5-3eb4450a8ff0\") " pod="openstack/dnsmasq-dns-78dd6ddcc-j2gpf" Nov 22 03:08:05 crc kubenswrapper[4922]: I1122 03:08:05.046936 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/baf1fe2f-594d-4879-8fa5-3eb4450a8ff0-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-j2gpf\" (UID: \"baf1fe2f-594d-4879-8fa5-3eb4450a8ff0\") " pod="openstack/dnsmasq-dns-78dd6ddcc-j2gpf" Nov 22 03:08:05 crc kubenswrapper[4922]: I1122 03:08:05.047121 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/332aa71d-bbad-4127-982b-6416679db2a5-config\") pod \"dnsmasq-dns-675f4bcbfc-krhjv\" (UID: \"332aa71d-bbad-4127-982b-6416679db2a5\") " pod="openstack/dnsmasq-dns-675f4bcbfc-krhjv" Nov 22 03:08:05 crc kubenswrapper[4922]: I1122 03:08:05.047173 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/baf1fe2f-594d-4879-8fa5-3eb4450a8ff0-config\") pod \"dnsmasq-dns-78dd6ddcc-j2gpf\" (UID: \"baf1fe2f-594d-4879-8fa5-3eb4450a8ff0\") " 
pod="openstack/dnsmasq-dns-78dd6ddcc-j2gpf" Nov 22 03:08:05 crc kubenswrapper[4922]: I1122 03:08:05.148436 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rffzz\" (UniqueName: \"kubernetes.io/projected/baf1fe2f-594d-4879-8fa5-3eb4450a8ff0-kube-api-access-rffzz\") pod \"dnsmasq-dns-78dd6ddcc-j2gpf\" (UID: \"baf1fe2f-594d-4879-8fa5-3eb4450a8ff0\") " pod="openstack/dnsmasq-dns-78dd6ddcc-j2gpf" Nov 22 03:08:05 crc kubenswrapper[4922]: I1122 03:08:05.148496 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/baf1fe2f-594d-4879-8fa5-3eb4450a8ff0-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-j2gpf\" (UID: \"baf1fe2f-594d-4879-8fa5-3eb4450a8ff0\") " pod="openstack/dnsmasq-dns-78dd6ddcc-j2gpf" Nov 22 03:08:05 crc kubenswrapper[4922]: I1122 03:08:05.148548 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/332aa71d-bbad-4127-982b-6416679db2a5-config\") pod \"dnsmasq-dns-675f4bcbfc-krhjv\" (UID: \"332aa71d-bbad-4127-982b-6416679db2a5\") " pod="openstack/dnsmasq-dns-675f4bcbfc-krhjv" Nov 22 03:08:05 crc kubenswrapper[4922]: I1122 03:08:05.148565 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/baf1fe2f-594d-4879-8fa5-3eb4450a8ff0-config\") pod \"dnsmasq-dns-78dd6ddcc-j2gpf\" (UID: \"baf1fe2f-594d-4879-8fa5-3eb4450a8ff0\") " pod="openstack/dnsmasq-dns-78dd6ddcc-j2gpf" Nov 22 03:08:05 crc kubenswrapper[4922]: I1122 03:08:05.148607 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85q8v\" (UniqueName: \"kubernetes.io/projected/332aa71d-bbad-4127-982b-6416679db2a5-kube-api-access-85q8v\") pod \"dnsmasq-dns-675f4bcbfc-krhjv\" (UID: \"332aa71d-bbad-4127-982b-6416679db2a5\") " pod="openstack/dnsmasq-dns-675f4bcbfc-krhjv" Nov 22 03:08:05 crc kubenswrapper[4922]: I1122 03:08:05.149457 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/baf1fe2f-594d-4879-8fa5-3eb4450a8ff0-config\") pod \"dnsmasq-dns-78dd6ddcc-j2gpf\" (UID: \"baf1fe2f-594d-4879-8fa5-3eb4450a8ff0\") " pod="openstack/dnsmasq-dns-78dd6ddcc-j2gpf" Nov 22 03:08:05 crc kubenswrapper[4922]: I1122 03:08:05.149458 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/baf1fe2f-594d-4879-8fa5-3eb4450a8ff0-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-j2gpf\" (UID: \"baf1fe2f-594d-4879-8fa5-3eb4450a8ff0\") " pod="openstack/dnsmasq-dns-78dd6ddcc-j2gpf" Nov 22 03:08:05 crc kubenswrapper[4922]: I1122 03:08:05.149659 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/332aa71d-bbad-4127-982b-6416679db2a5-config\") pod \"dnsmasq-dns-675f4bcbfc-krhjv\" (UID: \"332aa71d-bbad-4127-982b-6416679db2a5\") " pod="openstack/dnsmasq-dns-675f4bcbfc-krhjv" Nov 22 03:08:05 crc kubenswrapper[4922]: I1122 03:08:05.167642 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rffzz\" (UniqueName: \"kubernetes.io/projected/baf1fe2f-594d-4879-8fa5-3eb4450a8ff0-kube-api-access-rffzz\") pod \"dnsmasq-dns-78dd6ddcc-j2gpf\" (UID: \"baf1fe2f-594d-4879-8fa5-3eb4450a8ff0\") " pod="openstack/dnsmasq-dns-78dd6ddcc-j2gpf" Nov 22 03:08:05 crc kubenswrapper[4922]: I1122 03:08:05.168148 4922 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-85q8v\" (UniqueName: \"kubernetes.io/projected/332aa71d-bbad-4127-982b-6416679db2a5-kube-api-access-85q8v\") pod \"dnsmasq-dns-675f4bcbfc-krhjv\" (UID: \"332aa71d-bbad-4127-982b-6416679db2a5\") " pod="openstack/dnsmasq-dns-675f4bcbfc-krhjv" Nov 22 03:08:05 crc kubenswrapper[4922]: I1122 03:08:05.251912 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-krhjv" Nov 22 03:08:05 crc kubenswrapper[4922]: I1122 03:08:05.318688 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-j2gpf" Nov 22 03:08:05 crc kubenswrapper[4922]: I1122 03:08:05.734625 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-krhjv"] Nov 22 03:08:05 crc kubenswrapper[4922]: I1122 03:08:05.829501 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-j2gpf"] Nov 22 03:08:05 crc kubenswrapper[4922]: W1122 03:08:05.831173 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbaf1fe2f_594d_4879_8fa5_3eb4450a8ff0.slice/crio-45da9576e15a35b66e0dcaed880d8f5e9abe3ff72f754ffcc5edc68029d405c6 WatchSource:0}: Error finding container 45da9576e15a35b66e0dcaed880d8f5e9abe3ff72f754ffcc5edc68029d405c6: Status 404 returned error can't find the container with id 45da9576e15a35b66e0dcaed880d8f5e9abe3ff72f754ffcc5edc68029d405c6 Nov 22 03:08:06 crc kubenswrapper[4922]: I1122 03:08:06.566814 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-krhjv" event={"ID":"332aa71d-bbad-4127-982b-6416679db2a5","Type":"ContainerStarted","Data":"baaa458a6be66ca45e0b9784abe521436c9ad90926b1adba75ea5210aa154e5e"} Nov 22 03:08:06 crc kubenswrapper[4922]: I1122 03:08:06.569028 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-j2gpf" event={"ID":"baf1fe2f-594d-4879-8fa5-3eb4450a8ff0","Type":"ContainerStarted","Data":"45da9576e15a35b66e0dcaed880d8f5e9abe3ff72f754ffcc5edc68029d405c6"} Nov 22 03:08:08 crc kubenswrapper[4922]: I1122 03:08:08.165589 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-krhjv"] Nov 22 03:08:08 crc kubenswrapper[4922]: I1122 03:08:08.192563 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-94wtw"] Nov 22 03:08:08 crc kubenswrapper[4922]: I1122 03:08:08.194018 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-94wtw" Nov 22 03:08:08 crc kubenswrapper[4922]: I1122 03:08:08.206978 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-94wtw"] Nov 22 03:08:08 crc kubenswrapper[4922]: I1122 03:08:08.300656 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cef5a6d7-f3cf-459b-9e5a-a6a328d0e020-dns-svc\") pod \"dnsmasq-dns-666b6646f7-94wtw\" (UID: \"cef5a6d7-f3cf-459b-9e5a-a6a328d0e020\") " pod="openstack/dnsmasq-dns-666b6646f7-94wtw" Nov 22 03:08:08 crc kubenswrapper[4922]: I1122 03:08:08.300898 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cef5a6d7-f3cf-459b-9e5a-a6a328d0e020-config\") pod \"dnsmasq-dns-666b6646f7-94wtw\" (UID: \"cef5a6d7-f3cf-459b-9e5a-a6a328d0e020\") " pod="openstack/dnsmasq-dns-666b6646f7-94wtw" Nov 22 03:08:08 crc kubenswrapper[4922]: I1122 03:08:08.300927 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5fjc\" (UniqueName: \"kubernetes.io/projected/cef5a6d7-f3cf-459b-9e5a-a6a328d0e020-kube-api-access-k5fjc\") pod \"dnsmasq-dns-666b6646f7-94wtw\" (UID: \"cef5a6d7-f3cf-459b-9e5a-a6a328d0e020\") " pod="openstack/dnsmasq-dns-666b6646f7-94wtw" Nov 22 03:08:08 crc kubenswrapper[4922]: I1122 03:08:08.386434 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-j2gpf"] Nov 22 03:08:08 crc kubenswrapper[4922]: I1122 03:08:08.402584 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cef5a6d7-f3cf-459b-9e5a-a6a328d0e020-dns-svc\") pod \"dnsmasq-dns-666b6646f7-94wtw\" (UID: \"cef5a6d7-f3cf-459b-9e5a-a6a328d0e020\") " pod="openstack/dnsmasq-dns-666b6646f7-94wtw" Nov 22 03:08:08 crc kubenswrapper[4922]: I1122 03:08:08.402644 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cef5a6d7-f3cf-459b-9e5a-a6a328d0e020-config\") pod \"dnsmasq-dns-666b6646f7-94wtw\" (UID: \"cef5a6d7-f3cf-459b-9e5a-a6a328d0e020\") " pod="openstack/dnsmasq-dns-666b6646f7-94wtw" Nov 22 03:08:08 crc kubenswrapper[4922]: I1122 03:08:08.402661 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5fjc\" (UniqueName: \"kubernetes.io/projected/cef5a6d7-f3cf-459b-9e5a-a6a328d0e020-kube-api-access-k5fjc\") pod \"dnsmasq-dns-666b6646f7-94wtw\" (UID: \"cef5a6d7-f3cf-459b-9e5a-a6a328d0e020\") " pod="openstack/dnsmasq-dns-666b6646f7-94wtw" Nov 22 03:08:08 crc kubenswrapper[4922]: I1122 03:08:08.405033 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cef5a6d7-f3cf-459b-9e5a-a6a328d0e020-dns-svc\") pod \"dnsmasq-dns-666b6646f7-94wtw\" (UID: \"cef5a6d7-f3cf-459b-9e5a-a6a328d0e020\") " pod="openstack/dnsmasq-dns-666b6646f7-94wtw" Nov 22 03:08:08 crc kubenswrapper[4922]: I1122 03:08:08.405196 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cef5a6d7-f3cf-459b-9e5a-a6a328d0e020-config\") pod \"dnsmasq-dns-666b6646f7-94wtw\" (UID: \"cef5a6d7-f3cf-459b-9e5a-a6a328d0e020\") " pod="openstack/dnsmasq-dns-666b6646f7-94wtw" Nov 22 03:08:08 crc kubenswrapper[4922]: I1122 03:08:08.420985 
4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-wg4v5"] Nov 22 03:08:08 crc kubenswrapper[4922]: I1122 03:08:08.423483 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-wg4v5" Nov 22 03:08:08 crc kubenswrapper[4922]: I1122 03:08:08.442279 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-wg4v5"] Nov 22 03:08:08 crc kubenswrapper[4922]: I1122 03:08:08.461495 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5fjc\" (UniqueName: \"kubernetes.io/projected/cef5a6d7-f3cf-459b-9e5a-a6a328d0e020-kube-api-access-k5fjc\") pod \"dnsmasq-dns-666b6646f7-94wtw\" (UID: \"cef5a6d7-f3cf-459b-9e5a-a6a328d0e020\") " pod="openstack/dnsmasq-dns-666b6646f7-94wtw" Nov 22 03:08:08 crc kubenswrapper[4922]: I1122 03:08:08.506537 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e642b573-6e0b-4897-b1c8-1244683b0b73-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-wg4v5\" (UID: \"e642b573-6e0b-4897-b1c8-1244683b0b73\") " pod="openstack/dnsmasq-dns-57d769cc4f-wg4v5" Nov 22 03:08:08 crc kubenswrapper[4922]: I1122 03:08:08.506630 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxqwz\" (UniqueName: \"kubernetes.io/projected/e642b573-6e0b-4897-b1c8-1244683b0b73-kube-api-access-dxqwz\") pod \"dnsmasq-dns-57d769cc4f-wg4v5\" (UID: \"e642b573-6e0b-4897-b1c8-1244683b0b73\") " pod="openstack/dnsmasq-dns-57d769cc4f-wg4v5" Nov 22 03:08:08 crc kubenswrapper[4922]: I1122 03:08:08.506652 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e642b573-6e0b-4897-b1c8-1244683b0b73-config\") pod \"dnsmasq-dns-57d769cc4f-wg4v5\" (UID: \"e642b573-6e0b-4897-b1c8-1244683b0b73\") " pod="openstack/dnsmasq-dns-57d769cc4f-wg4v5" Nov 22 03:08:08 crc kubenswrapper[4922]: I1122 03:08:08.516152 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-94wtw" Nov 22 03:08:08 crc kubenswrapper[4922]: I1122 03:08:08.607563 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxqwz\" (UniqueName: \"kubernetes.io/projected/e642b573-6e0b-4897-b1c8-1244683b0b73-kube-api-access-dxqwz\") pod \"dnsmasq-dns-57d769cc4f-wg4v5\" (UID: \"e642b573-6e0b-4897-b1c8-1244683b0b73\") " pod="openstack/dnsmasq-dns-57d769cc4f-wg4v5" Nov 22 03:08:08 crc kubenswrapper[4922]: I1122 03:08:08.607616 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e642b573-6e0b-4897-b1c8-1244683b0b73-config\") pod \"dnsmasq-dns-57d769cc4f-wg4v5\" (UID: \"e642b573-6e0b-4897-b1c8-1244683b0b73\") " pod="openstack/dnsmasq-dns-57d769cc4f-wg4v5" Nov 22 03:08:08 crc kubenswrapper[4922]: I1122 03:08:08.607674 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e642b573-6e0b-4897-b1c8-1244683b0b73-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-wg4v5\" (UID: \"e642b573-6e0b-4897-b1c8-1244683b0b73\") " pod="openstack/dnsmasq-dns-57d769cc4f-wg4v5" Nov 22 03:08:08 crc kubenswrapper[4922]: I1122 03:08:08.608452 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e642b573-6e0b-4897-b1c8-1244683b0b73-config\") pod \"dnsmasq-dns-57d769cc4f-wg4v5\" (UID: \"e642b573-6e0b-4897-b1c8-1244683b0b73\") " pod="openstack/dnsmasq-dns-57d769cc4f-wg4v5" Nov 22 03:08:08 crc kubenswrapper[4922]: I1122 03:08:08.608937 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e642b573-6e0b-4897-b1c8-1244683b0b73-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-wg4v5\" (UID: \"e642b573-6e0b-4897-b1c8-1244683b0b73\") " pod="openstack/dnsmasq-dns-57d769cc4f-wg4v5" Nov 22 03:08:08 crc kubenswrapper[4922]: I1122 03:08:08.633610 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxqwz\" (UniqueName: \"kubernetes.io/projected/e642b573-6e0b-4897-b1c8-1244683b0b73-kube-api-access-dxqwz\") pod \"dnsmasq-dns-57d769cc4f-wg4v5\" (UID: \"e642b573-6e0b-4897-b1c8-1244683b0b73\") " pod="openstack/dnsmasq-dns-57d769cc4f-wg4v5" Nov 22 03:08:08 crc kubenswrapper[4922]: I1122 03:08:08.776502 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-wg4v5" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.342145 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.348109 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.351420 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.351758 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.352381 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-28fkg" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.352592 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.352719 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.352817 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.353339 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.358822 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.433503 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/161bd1ea-2276-4a95-b0ad-304cc807d13f-config-data\") pod \"rabbitmq-server-0\" (UID: \"161bd1ea-2276-4a95-b0ad-304cc807d13f\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.433556 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjtd8\" (UniqueName: \"kubernetes.io/projected/161bd1ea-2276-4a95-b0ad-304cc807d13f-kube-api-access-wjtd8\") pod \"rabbitmq-server-0\" (UID: \"161bd1ea-2276-4a95-b0ad-304cc807d13f\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.433576 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"161bd1ea-2276-4a95-b0ad-304cc807d13f\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.433594 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/161bd1ea-2276-4a95-b0ad-304cc807d13f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"161bd1ea-2276-4a95-b0ad-304cc807d13f\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.433613 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/161bd1ea-2276-4a95-b0ad-304cc807d13f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"161bd1ea-2276-4a95-b0ad-304cc807d13f\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.433631 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/161bd1ea-2276-4a95-b0ad-304cc807d13f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"161bd1ea-2276-4a95-b0ad-304cc807d13f\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.433652 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/161bd1ea-2276-4a95-b0ad-304cc807d13f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"161bd1ea-2276-4a95-b0ad-304cc807d13f\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.433691 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/161bd1ea-2276-4a95-b0ad-304cc807d13f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"161bd1ea-2276-4a95-b0ad-304cc807d13f\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.433762 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/161bd1ea-2276-4a95-b0ad-304cc807d13f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"161bd1ea-2276-4a95-b0ad-304cc807d13f\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.433807 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/161bd1ea-2276-4a95-b0ad-304cc807d13f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"161bd1ea-2276-4a95-b0ad-304cc807d13f\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.433832 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/161bd1ea-2276-4a95-b0ad-304cc807d13f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"161bd1ea-2276-4a95-b0ad-304cc807d13f\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.536570 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/161bd1ea-2276-4a95-b0ad-304cc807d13f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"161bd1ea-2276-4a95-b0ad-304cc807d13f\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.536636 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/161bd1ea-2276-4a95-b0ad-304cc807d13f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"161bd1ea-2276-4a95-b0ad-304cc807d13f\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.536660 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/161bd1ea-2276-4a95-b0ad-304cc807d13f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"161bd1ea-2276-4a95-b0ad-304cc807d13f\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.536700 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/161bd1ea-2276-4a95-b0ad-304cc807d13f-config-data\") pod \"rabbitmq-server-0\" (UID: \"161bd1ea-2276-4a95-b0ad-304cc807d13f\") " 
pod="openstack/rabbitmq-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.536720 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjtd8\" (UniqueName: \"kubernetes.io/projected/161bd1ea-2276-4a95-b0ad-304cc807d13f-kube-api-access-wjtd8\") pod \"rabbitmq-server-0\" (UID: \"161bd1ea-2276-4a95-b0ad-304cc807d13f\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.536737 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/161bd1ea-2276-4a95-b0ad-304cc807d13f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"161bd1ea-2276-4a95-b0ad-304cc807d13f\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.537056 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"161bd1ea-2276-4a95-b0ad-304cc807d13f\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.537101 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/161bd1ea-2276-4a95-b0ad-304cc807d13f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"161bd1ea-2276-4a95-b0ad-304cc807d13f\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.537126 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/161bd1ea-2276-4a95-b0ad-304cc807d13f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"161bd1ea-2276-4a95-b0ad-304cc807d13f\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.537149 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/161bd1ea-2276-4a95-b0ad-304cc807d13f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"161bd1ea-2276-4a95-b0ad-304cc807d13f\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.537178 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/161bd1ea-2276-4a95-b0ad-304cc807d13f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"161bd1ea-2276-4a95-b0ad-304cc807d13f\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.537539 4922 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"161bd1ea-2276-4a95-b0ad-304cc807d13f\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.537584 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/161bd1ea-2276-4a95-b0ad-304cc807d13f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"161bd1ea-2276-4a95-b0ad-304cc807d13f\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.537602 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/161bd1ea-2276-4a95-b0ad-304cc807d13f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"161bd1ea-2276-4a95-b0ad-304cc807d13f\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.538181 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/161bd1ea-2276-4a95-b0ad-304cc807d13f-config-data\") pod \"rabbitmq-server-0\" (UID: \"161bd1ea-2276-4a95-b0ad-304cc807d13f\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.538376 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/161bd1ea-2276-4a95-b0ad-304cc807d13f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"161bd1ea-2276-4a95-b0ad-304cc807d13f\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.538407 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/161bd1ea-2276-4a95-b0ad-304cc807d13f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"161bd1ea-2276-4a95-b0ad-304cc807d13f\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.541809 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/161bd1ea-2276-4a95-b0ad-304cc807d13f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"161bd1ea-2276-4a95-b0ad-304cc807d13f\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.542516 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/161bd1ea-2276-4a95-b0ad-304cc807d13f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"161bd1ea-2276-4a95-b0ad-304cc807d13f\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.543694 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/161bd1ea-2276-4a95-b0ad-304cc807d13f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"161bd1ea-2276-4a95-b0ad-304cc807d13f\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.547635 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/161bd1ea-2276-4a95-b0ad-304cc807d13f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"161bd1ea-2276-4a95-b0ad-304cc807d13f\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.557655 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.560762 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.565975 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.566857 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.566967 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.567089 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-lm5pq" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.567174 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.567273 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.567379 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.568002 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjtd8\" (UniqueName: \"kubernetes.io/projected/161bd1ea-2276-4a95-b0ad-304cc807d13f-kube-api-access-wjtd8\") pod \"rabbitmq-server-0\" (UID: \"161bd1ea-2276-4a95-b0ad-304cc807d13f\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.572059 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"161bd1ea-2276-4a95-b0ad-304cc807d13f\") " pod="openstack/rabbitmq-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.587825 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.642830 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg82p\" (UniqueName: \"kubernetes.io/projected/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-kube-api-access-xg82p\") pod \"rabbitmq-cell1-server-0\" (UID: \"11bc873c-bbc5-4033-ad7a-9569c2b6aa76\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.642943 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"11bc873c-bbc5-4033-ad7a-9569c2b6aa76\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.642979 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"11bc873c-bbc5-4033-ad7a-9569c2b6aa76\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.643018 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"11bc873c-bbc5-4033-ad7a-9569c2b6aa76\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.643042 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"11bc873c-bbc5-4033-ad7a-9569c2b6aa76\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.643071 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"11bc873c-bbc5-4033-ad7a-9569c2b6aa76\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.643100 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"11bc873c-bbc5-4033-ad7a-9569c2b6aa76\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.643152 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"11bc873c-bbc5-4033-ad7a-9569c2b6aa76\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.643194 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"11bc873c-bbc5-4033-ad7a-9569c2b6aa76\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.643212 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"11bc873c-bbc5-4033-ad7a-9569c2b6aa76\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.643253 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"11bc873c-bbc5-4033-ad7a-9569c2b6aa76\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.706756 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.744427 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"11bc873c-bbc5-4033-ad7a-9569c2b6aa76\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.744490 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"11bc873c-bbc5-4033-ad7a-9569c2b6aa76\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.744510 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"11bc873c-bbc5-4033-ad7a-9569c2b6aa76\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.744539 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"11bc873c-bbc5-4033-ad7a-9569c2b6aa76\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.744584 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xg82p\" (UniqueName: \"kubernetes.io/projected/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-kube-api-access-xg82p\") pod \"rabbitmq-cell1-server-0\" (UID: \"11bc873c-bbc5-4033-ad7a-9569c2b6aa76\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.744603 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"11bc873c-bbc5-4033-ad7a-9569c2b6aa76\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.744622 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"11bc873c-bbc5-4033-ad7a-9569c2b6aa76\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.744642 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"11bc873c-bbc5-4033-ad7a-9569c2b6aa76\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.744658 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"11bc873c-bbc5-4033-ad7a-9569c2b6aa76\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.744675 4922 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"11bc873c-bbc5-4033-ad7a-9569c2b6aa76\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.744695 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"11bc873c-bbc5-4033-ad7a-9569c2b6aa76\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.745029 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"11bc873c-bbc5-4033-ad7a-9569c2b6aa76\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.746224 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"11bc873c-bbc5-4033-ad7a-9569c2b6aa76\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.746563 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"11bc873c-bbc5-4033-ad7a-9569c2b6aa76\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.747776 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"11bc873c-bbc5-4033-ad7a-9569c2b6aa76\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.747809 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"11bc873c-bbc5-4033-ad7a-9569c2b6aa76\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.747962 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"11bc873c-bbc5-4033-ad7a-9569c2b6aa76\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.748044 4922 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"11bc873c-bbc5-4033-ad7a-9569c2b6aa76\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.749173 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-rabbitmq-confd\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"11bc873c-bbc5-4033-ad7a-9569c2b6aa76\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.749484 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"11bc873c-bbc5-4033-ad7a-9569c2b6aa76\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.752752 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"11bc873c-bbc5-4033-ad7a-9569c2b6aa76\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.766713 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg82p\" (UniqueName: \"kubernetes.io/projected/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-kube-api-access-xg82p\") pod \"rabbitmq-cell1-server-0\" (UID: \"11bc873c-bbc5-4033-ad7a-9569c2b6aa76\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.772871 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"11bc873c-bbc5-4033-ad7a-9569c2b6aa76\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:09 crc kubenswrapper[4922]: I1122 03:08:09.937221 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:08:10 crc kubenswrapper[4922]: I1122 03:08:10.976713 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Nov 22 03:08:10 crc kubenswrapper[4922]: I1122 03:08:10.979423 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Nov 22 03:08:10 crc kubenswrapper[4922]: I1122 03:08:10.984226 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Nov 22 03:08:10 crc kubenswrapper[4922]: I1122 03:08:10.985874 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Nov 22 03:08:10 crc kubenswrapper[4922]: I1122 03:08:10.986028 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Nov 22 03:08:10 crc kubenswrapper[4922]: I1122 03:08:10.986210 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-mj4rc" Nov 22 03:08:10 crc kubenswrapper[4922]: I1122 03:08:10.986664 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Nov 22 03:08:10 crc kubenswrapper[4922]: I1122 03:08:10.986959 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Nov 22 03:08:10 crc kubenswrapper[4922]: I1122 03:08:10.995501 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Nov 22 03:08:11 crc kubenswrapper[4922]: I1122 03:08:11.065985 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/674794c8-5fae-461e-91a1-f3f44a088e55-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"674794c8-5fae-461e-91a1-f3f44a088e55\") " pod="openstack/openstack-galera-0" Nov 22 03:08:11 crc kubenswrapper[4922]: I1122 03:08:11.066302 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/674794c8-5fae-461e-91a1-f3f44a088e55-kolla-config\") pod \"openstack-galera-0\" (UID: \"674794c8-5fae-461e-91a1-f3f44a088e55\") " pod="openstack/openstack-galera-0" Nov 22 03:08:11 crc kubenswrapper[4922]: I1122 03:08:11.066464 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/674794c8-5fae-461e-91a1-f3f44a088e55-secrets\") pod \"openstack-galera-0\" (UID: \"674794c8-5fae-461e-91a1-f3f44a088e55\") " pod="openstack/openstack-galera-0" Nov 22 03:08:11 crc kubenswrapper[4922]: I1122 03:08:11.066572 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/674794c8-5fae-461e-91a1-f3f44a088e55-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"674794c8-5fae-461e-91a1-f3f44a088e55\") " pod="openstack/openstack-galera-0" Nov 22 03:08:11 crc kubenswrapper[4922]: I1122 03:08:11.066673 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tm8mz\" (UniqueName: \"kubernetes.io/projected/674794c8-5fae-461e-91a1-f3f44a088e55-kube-api-access-tm8mz\") pod \"openstack-galera-0\" (UID: \"674794c8-5fae-461e-91a1-f3f44a088e55\") " pod="openstack/openstack-galera-0" Nov 22 03:08:11 crc kubenswrapper[4922]: I1122 03:08:11.066791 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"674794c8-5fae-461e-91a1-f3f44a088e55\") " pod="openstack/openstack-galera-0" Nov 22 03:08:11 crc 
kubenswrapper[4922]: I1122 03:08:11.066930 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/674794c8-5fae-461e-91a1-f3f44a088e55-config-data-generated\") pod \"openstack-galera-0\" (UID: \"674794c8-5fae-461e-91a1-f3f44a088e55\") " pod="openstack/openstack-galera-0" Nov 22 03:08:11 crc kubenswrapper[4922]: I1122 03:08:11.067027 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/674794c8-5fae-461e-91a1-f3f44a088e55-operator-scripts\") pod \"openstack-galera-0\" (UID: \"674794c8-5fae-461e-91a1-f3f44a088e55\") " pod="openstack/openstack-galera-0" Nov 22 03:08:11 crc kubenswrapper[4922]: I1122 03:08:11.067159 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/674794c8-5fae-461e-91a1-f3f44a088e55-config-data-default\") pod \"openstack-galera-0\" (UID: \"674794c8-5fae-461e-91a1-f3f44a088e55\") " pod="openstack/openstack-galera-0" Nov 22 03:08:11 crc kubenswrapper[4922]: I1122 03:08:11.109259 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 03:08:11 crc kubenswrapper[4922]: I1122 03:08:11.109328 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 03:08:11 crc kubenswrapper[4922]: I1122 03:08:11.109376 4922 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" Nov 22 03:08:11 crc kubenswrapper[4922]: I1122 03:08:11.110033 4922 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e544a691b867a653faf01710d7aaa2e43b953be45a75dda707c262af6da21812"} pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 03:08:11 crc kubenswrapper[4922]: I1122 03:08:11.110088 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" containerID="cri-o://e544a691b867a653faf01710d7aaa2e43b953be45a75dda707c262af6da21812" gracePeriod=600 Nov 22 03:08:11 crc kubenswrapper[4922]: I1122 03:08:11.168128 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/674794c8-5fae-461e-91a1-f3f44a088e55-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"674794c8-5fae-461e-91a1-f3f44a088e55\") " pod="openstack/openstack-galera-0" Nov 22 03:08:11 crc kubenswrapper[4922]: I1122 03:08:11.168191 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/674794c8-5fae-461e-91a1-f3f44a088e55-kolla-config\") pod \"openstack-galera-0\" (UID: \"674794c8-5fae-461e-91a1-f3f44a088e55\") " pod="openstack/openstack-galera-0" Nov 22 03:08:11 crc kubenswrapper[4922]: I1122 03:08:11.168232 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/674794c8-5fae-461e-91a1-f3f44a088e55-secrets\") pod \"openstack-galera-0\" (UID: \"674794c8-5fae-461e-91a1-f3f44a088e55\") " pod="openstack/openstack-galera-0" Nov 22 03:08:11 crc kubenswrapper[4922]: I1122 03:08:11.168252 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/674794c8-5fae-461e-91a1-f3f44a088e55-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"674794c8-5fae-461e-91a1-f3f44a088e55\") " pod="openstack/openstack-galera-0" Nov 22 03:08:11 crc kubenswrapper[4922]: I1122 03:08:11.168272 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tm8mz\" (UniqueName: \"kubernetes.io/projected/674794c8-5fae-461e-91a1-f3f44a088e55-kube-api-access-tm8mz\") pod \"openstack-galera-0\" (UID: \"674794c8-5fae-461e-91a1-f3f44a088e55\") " pod="openstack/openstack-galera-0" Nov 22 03:08:11 crc kubenswrapper[4922]: I1122 03:08:11.168295 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"674794c8-5fae-461e-91a1-f3f44a088e55\") " pod="openstack/openstack-galera-0" Nov 22 03:08:11 crc kubenswrapper[4922]: I1122 03:08:11.168310 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/674794c8-5fae-461e-91a1-f3f44a088e55-config-data-generated\") pod \"openstack-galera-0\" (UID: \"674794c8-5fae-461e-91a1-f3f44a088e55\") " pod="openstack/openstack-galera-0" Nov 22 03:08:11 crc kubenswrapper[4922]: I1122 03:08:11.168326 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/674794c8-5fae-461e-91a1-f3f44a088e55-operator-scripts\") pod \"openstack-galera-0\" (UID: \"674794c8-5fae-461e-91a1-f3f44a088e55\") " pod="openstack/openstack-galera-0" Nov 22 03:08:11 crc kubenswrapper[4922]: I1122 03:08:11.168368 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/674794c8-5fae-461e-91a1-f3f44a088e55-config-data-default\") pod \"openstack-galera-0\" (UID: \"674794c8-5fae-461e-91a1-f3f44a088e55\") " pod="openstack/openstack-galera-0" Nov 22 03:08:11 crc kubenswrapper[4922]: I1122 03:08:11.168930 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/674794c8-5fae-461e-91a1-f3f44a088e55-kolla-config\") pod \"openstack-galera-0\" (UID: \"674794c8-5fae-461e-91a1-f3f44a088e55\") " pod="openstack/openstack-galera-0" Nov 22 03:08:11 crc kubenswrapper[4922]: I1122 03:08:11.169184 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/674794c8-5fae-461e-91a1-f3f44a088e55-config-data-default\") pod \"openstack-galera-0\" (UID: \"674794c8-5fae-461e-91a1-f3f44a088e55\") " pod="openstack/openstack-galera-0" Nov 22 03:08:11 crc kubenswrapper[4922]: I1122 
03:08:11.169378 4922 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"674794c8-5fae-461e-91a1-f3f44a088e55\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/openstack-galera-0" Nov 22 03:08:11 crc kubenswrapper[4922]: I1122 03:08:11.169741 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/674794c8-5fae-461e-91a1-f3f44a088e55-config-data-generated\") pod \"openstack-galera-0\" (UID: \"674794c8-5fae-461e-91a1-f3f44a088e55\") " pod="openstack/openstack-galera-0" Nov 22 03:08:11 crc kubenswrapper[4922]: I1122 03:08:11.171241 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/674794c8-5fae-461e-91a1-f3f44a088e55-operator-scripts\") pod \"openstack-galera-0\" (UID: \"674794c8-5fae-461e-91a1-f3f44a088e55\") " pod="openstack/openstack-galera-0" Nov 22 03:08:11 crc kubenswrapper[4922]: I1122 03:08:11.174590 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/674794c8-5fae-461e-91a1-f3f44a088e55-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"674794c8-5fae-461e-91a1-f3f44a088e55\") " pod="openstack/openstack-galera-0" Nov 22 03:08:11 crc kubenswrapper[4922]: I1122 03:08:11.176574 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/674794c8-5fae-461e-91a1-f3f44a088e55-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"674794c8-5fae-461e-91a1-f3f44a088e55\") " pod="openstack/openstack-galera-0" Nov 22 03:08:11 crc kubenswrapper[4922]: I1122 03:08:11.195342 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/674794c8-5fae-461e-91a1-f3f44a088e55-secrets\") pod \"openstack-galera-0\" (UID: \"674794c8-5fae-461e-91a1-f3f44a088e55\") " pod="openstack/openstack-galera-0" Nov 22 03:08:11 crc kubenswrapper[4922]: I1122 03:08:11.195493 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tm8mz\" (UniqueName: \"kubernetes.io/projected/674794c8-5fae-461e-91a1-f3f44a088e55-kube-api-access-tm8mz\") pod \"openstack-galera-0\" (UID: \"674794c8-5fae-461e-91a1-f3f44a088e55\") " pod="openstack/openstack-galera-0" Nov 22 03:08:11 crc kubenswrapper[4922]: I1122 03:08:11.219788 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"674794c8-5fae-461e-91a1-f3f44a088e55\") " pod="openstack/openstack-galera-0" Nov 22 03:08:11 crc kubenswrapper[4922]: I1122 03:08:11.299712 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Nov 22 03:08:11 crc kubenswrapper[4922]: I1122 03:08:11.648499 4922 generic.go:334] "Generic (PLEG): container finished" podID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerID="e544a691b867a653faf01710d7aaa2e43b953be45a75dda707c262af6da21812" exitCode=0 Nov 22 03:08:11 crc kubenswrapper[4922]: I1122 03:08:11.648793 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" event={"ID":"402683b1-a29f-4a79-a36c-daf6e8068d0d","Type":"ContainerDied","Data":"e544a691b867a653faf01710d7aaa2e43b953be45a75dda707c262af6da21812"} Nov 22 03:08:11 crc kubenswrapper[4922]: I1122 03:08:11.648825 4922 scope.go:117] "RemoveContainer" containerID="3adf25f358b8b1181f3ac3e402fdb1299c491a8f833369cdc996bbafe94e841c" Nov 22 03:08:12 crc kubenswrapper[4922]: I1122 03:08:12.450811 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 22 03:08:12 crc kubenswrapper[4922]: I1122 03:08:12.452331 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Nov 22 03:08:12 crc kubenswrapper[4922]: I1122 03:08:12.455653 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Nov 22 03:08:12 crc kubenswrapper[4922]: I1122 03:08:12.456900 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Nov 22 03:08:12 crc kubenswrapper[4922]: I1122 03:08:12.457307 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-nd7qj" Nov 22 03:08:12 crc kubenswrapper[4922]: I1122 03:08:12.458283 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Nov 22 03:08:12 crc kubenswrapper[4922]: I1122 03:08:12.465931 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 22 03:08:12 crc kubenswrapper[4922]: I1122 03:08:12.592076 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"60c32179-6b0e-4e8b-a101-81ca49be2034\") " pod="openstack/openstack-cell1-galera-0" Nov 22 03:08:12 crc kubenswrapper[4922]: I1122 03:08:12.592166 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60c32179-6b0e-4e8b-a101-81ca49be2034-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"60c32179-6b0e-4e8b-a101-81ca49be2034\") " pod="openstack/openstack-cell1-galera-0" Nov 22 03:08:12 crc kubenswrapper[4922]: I1122 03:08:12.592193 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/60c32179-6b0e-4e8b-a101-81ca49be2034-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"60c32179-6b0e-4e8b-a101-81ca49be2034\") " pod="openstack/openstack-cell1-galera-0" Nov 22 03:08:12 crc kubenswrapper[4922]: I1122 03:08:12.592313 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60c32179-6b0e-4e8b-a101-81ca49be2034-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: 
\"60c32179-6b0e-4e8b-a101-81ca49be2034\") " pod="openstack/openstack-cell1-galera-0" Nov 22 03:08:12 crc kubenswrapper[4922]: I1122 03:08:12.592341 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/60c32179-6b0e-4e8b-a101-81ca49be2034-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"60c32179-6b0e-4e8b-a101-81ca49be2034\") " pod="openstack/openstack-cell1-galera-0" Nov 22 03:08:12 crc kubenswrapper[4922]: I1122 03:08:12.592396 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh52v\" (UniqueName: \"kubernetes.io/projected/60c32179-6b0e-4e8b-a101-81ca49be2034-kube-api-access-zh52v\") pod \"openstack-cell1-galera-0\" (UID: \"60c32179-6b0e-4e8b-a101-81ca49be2034\") " pod="openstack/openstack-cell1-galera-0" Nov 22 03:08:12 crc kubenswrapper[4922]: I1122 03:08:12.592418 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/60c32179-6b0e-4e8b-a101-81ca49be2034-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"60c32179-6b0e-4e8b-a101-81ca49be2034\") " pod="openstack/openstack-cell1-galera-0" Nov 22 03:08:12 crc kubenswrapper[4922]: I1122 03:08:12.592516 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/60c32179-6b0e-4e8b-a101-81ca49be2034-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"60c32179-6b0e-4e8b-a101-81ca49be2034\") " pod="openstack/openstack-cell1-galera-0" Nov 22 03:08:12 crc kubenswrapper[4922]: I1122 03:08:12.592541 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/60c32179-6b0e-4e8b-a101-81ca49be2034-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"60c32179-6b0e-4e8b-a101-81ca49be2034\") " pod="openstack/openstack-cell1-galera-0" Nov 22 03:08:12 crc kubenswrapper[4922]: I1122 03:08:12.694414 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/60c32179-6b0e-4e8b-a101-81ca49be2034-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"60c32179-6b0e-4e8b-a101-81ca49be2034\") " pod="openstack/openstack-cell1-galera-0" Nov 22 03:08:12 crc kubenswrapper[4922]: I1122 03:08:12.694503 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"60c32179-6b0e-4e8b-a101-81ca49be2034\") " pod="openstack/openstack-cell1-galera-0" Nov 22 03:08:12 crc kubenswrapper[4922]: I1122 03:08:12.694551 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60c32179-6b0e-4e8b-a101-81ca49be2034-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"60c32179-6b0e-4e8b-a101-81ca49be2034\") " pod="openstack/openstack-cell1-galera-0" Nov 22 03:08:12 crc kubenswrapper[4922]: I1122 03:08:12.694574 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/60c32179-6b0e-4e8b-a101-81ca49be2034-secrets\") pod \"openstack-cell1-galera-0\" (UID: 
\"60c32179-6b0e-4e8b-a101-81ca49be2034\") " pod="openstack/openstack-cell1-galera-0" Nov 22 03:08:12 crc kubenswrapper[4922]: I1122 03:08:12.694621 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60c32179-6b0e-4e8b-a101-81ca49be2034-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"60c32179-6b0e-4e8b-a101-81ca49be2034\") " pod="openstack/openstack-cell1-galera-0" Nov 22 03:08:12 crc kubenswrapper[4922]: I1122 03:08:12.694651 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/60c32179-6b0e-4e8b-a101-81ca49be2034-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"60c32179-6b0e-4e8b-a101-81ca49be2034\") " pod="openstack/openstack-cell1-galera-0" Nov 22 03:08:12 crc kubenswrapper[4922]: I1122 03:08:12.694674 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh52v\" (UniqueName: \"kubernetes.io/projected/60c32179-6b0e-4e8b-a101-81ca49be2034-kube-api-access-zh52v\") pod \"openstack-cell1-galera-0\" (UID: \"60c32179-6b0e-4e8b-a101-81ca49be2034\") " pod="openstack/openstack-cell1-galera-0" Nov 22 03:08:12 crc kubenswrapper[4922]: I1122 03:08:12.694705 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/60c32179-6b0e-4e8b-a101-81ca49be2034-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"60c32179-6b0e-4e8b-a101-81ca49be2034\") " pod="openstack/openstack-cell1-galera-0" Nov 22 03:08:12 crc kubenswrapper[4922]: I1122 03:08:12.694760 4922 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"60c32179-6b0e-4e8b-a101-81ca49be2034\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/openstack-cell1-galera-0" Nov 22 03:08:12 crc kubenswrapper[4922]: I1122 03:08:12.695475 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/60c32179-6b0e-4e8b-a101-81ca49be2034-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"60c32179-6b0e-4e8b-a101-81ca49be2034\") " pod="openstack/openstack-cell1-galera-0" Nov 22 03:08:12 crc kubenswrapper[4922]: I1122 03:08:12.694768 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/60c32179-6b0e-4e8b-a101-81ca49be2034-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"60c32179-6b0e-4e8b-a101-81ca49be2034\") " pod="openstack/openstack-cell1-galera-0" Nov 22 03:08:12 crc kubenswrapper[4922]: I1122 03:08:12.696240 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/60c32179-6b0e-4e8b-a101-81ca49be2034-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"60c32179-6b0e-4e8b-a101-81ca49be2034\") " pod="openstack/openstack-cell1-galera-0" Nov 22 03:08:12 crc kubenswrapper[4922]: I1122 03:08:12.696944 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60c32179-6b0e-4e8b-a101-81ca49be2034-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"60c32179-6b0e-4e8b-a101-81ca49be2034\") " 
pod="openstack/openstack-cell1-galera-0" Nov 22 03:08:12 crc kubenswrapper[4922]: I1122 03:08:12.701356 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60c32179-6b0e-4e8b-a101-81ca49be2034-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"60c32179-6b0e-4e8b-a101-81ca49be2034\") " pod="openstack/openstack-cell1-galera-0" Nov 22 03:08:12 crc kubenswrapper[4922]: I1122 03:08:12.703093 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/60c32179-6b0e-4e8b-a101-81ca49be2034-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"60c32179-6b0e-4e8b-a101-81ca49be2034\") " pod="openstack/openstack-cell1-galera-0" Nov 22 03:08:12 crc kubenswrapper[4922]: I1122 03:08:12.706391 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/60c32179-6b0e-4e8b-a101-81ca49be2034-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"60c32179-6b0e-4e8b-a101-81ca49be2034\") " pod="openstack/openstack-cell1-galera-0" Nov 22 03:08:12 crc kubenswrapper[4922]: I1122 03:08:12.707119 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/60c32179-6b0e-4e8b-a101-81ca49be2034-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"60c32179-6b0e-4e8b-a101-81ca49be2034\") " pod="openstack/openstack-cell1-galera-0" Nov 22 03:08:12 crc kubenswrapper[4922]: I1122 03:08:12.720074 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh52v\" (UniqueName: \"kubernetes.io/projected/60c32179-6b0e-4e8b-a101-81ca49be2034-kube-api-access-zh52v\") pod \"openstack-cell1-galera-0\" (UID: \"60c32179-6b0e-4e8b-a101-81ca49be2034\") " pod="openstack/openstack-cell1-galera-0" Nov 22 03:08:12 crc kubenswrapper[4922]: I1122 03:08:12.754038 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"60c32179-6b0e-4e8b-a101-81ca49be2034\") " pod="openstack/openstack-cell1-galera-0" Nov 22 03:08:12 crc kubenswrapper[4922]: I1122 03:08:12.775620 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Nov 22 03:08:12 crc kubenswrapper[4922]: I1122 03:08:12.931121 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Nov 22 03:08:12 crc kubenswrapper[4922]: I1122 03:08:12.932123 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Nov 22 03:08:12 crc kubenswrapper[4922]: I1122 03:08:12.933675 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-st5cf" Nov 22 03:08:12 crc kubenswrapper[4922]: I1122 03:08:12.934651 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Nov 22 03:08:12 crc kubenswrapper[4922]: I1122 03:08:12.949648 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Nov 22 03:08:12 crc kubenswrapper[4922]: I1122 03:08:12.982234 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Nov 22 03:08:13 crc kubenswrapper[4922]: I1122 03:08:13.104245 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c4673860-e095-4170-92b7-cbd2ffdff114-config-data\") pod \"memcached-0\" (UID: \"c4673860-e095-4170-92b7-cbd2ffdff114\") " pod="openstack/memcached-0" Nov 22 03:08:13 crc kubenswrapper[4922]: I1122 03:08:13.104297 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4673860-e095-4170-92b7-cbd2ffdff114-combined-ca-bundle\") pod \"memcached-0\" (UID: \"c4673860-e095-4170-92b7-cbd2ffdff114\") " pod="openstack/memcached-0" Nov 22 03:08:13 crc kubenswrapper[4922]: I1122 03:08:13.104444 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4673860-e095-4170-92b7-cbd2ffdff114-memcached-tls-certs\") pod \"memcached-0\" (UID: \"c4673860-e095-4170-92b7-cbd2ffdff114\") " pod="openstack/memcached-0" Nov 22 03:08:13 crc kubenswrapper[4922]: I1122 03:08:13.104515 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c4673860-e095-4170-92b7-cbd2ffdff114-kolla-config\") pod \"memcached-0\" (UID: \"c4673860-e095-4170-92b7-cbd2ffdff114\") " pod="openstack/memcached-0" Nov 22 03:08:13 crc kubenswrapper[4922]: I1122 03:08:13.104562 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9sh2\" (UniqueName: \"kubernetes.io/projected/c4673860-e095-4170-92b7-cbd2ffdff114-kube-api-access-g9sh2\") pod \"memcached-0\" (UID: \"c4673860-e095-4170-92b7-cbd2ffdff114\") " pod="openstack/memcached-0" Nov 22 03:08:13 crc kubenswrapper[4922]: I1122 03:08:13.205864 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4673860-e095-4170-92b7-cbd2ffdff114-combined-ca-bundle\") pod \"memcached-0\" (UID: \"c4673860-e095-4170-92b7-cbd2ffdff114\") " pod="openstack/memcached-0" Nov 22 03:08:13 crc kubenswrapper[4922]: I1122 03:08:13.205910 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c4673860-e095-4170-92b7-cbd2ffdff114-config-data\") pod \"memcached-0\" (UID: \"c4673860-e095-4170-92b7-cbd2ffdff114\") " pod="openstack/memcached-0" Nov 22 03:08:13 crc kubenswrapper[4922]: I1122 03:08:13.205979 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c4673860-e095-4170-92b7-cbd2ffdff114-memcached-tls-certs\") pod \"memcached-0\" (UID: \"c4673860-e095-4170-92b7-cbd2ffdff114\") " pod="openstack/memcached-0" Nov 22 03:08:13 crc kubenswrapper[4922]: I1122 03:08:13.206012 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c4673860-e095-4170-92b7-cbd2ffdff114-kolla-config\") pod \"memcached-0\" (UID: \"c4673860-e095-4170-92b7-cbd2ffdff114\") " pod="openstack/memcached-0" Nov 22 03:08:13 crc kubenswrapper[4922]: I1122 03:08:13.206042 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9sh2\" (UniqueName: \"kubernetes.io/projected/c4673860-e095-4170-92b7-cbd2ffdff114-kube-api-access-g9sh2\") pod \"memcached-0\" (UID: \"c4673860-e095-4170-92b7-cbd2ffdff114\") " pod="openstack/memcached-0" Nov 22 03:08:13 crc kubenswrapper[4922]: I1122 03:08:13.207114 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c4673860-e095-4170-92b7-cbd2ffdff114-config-data\") pod \"memcached-0\" (UID: \"c4673860-e095-4170-92b7-cbd2ffdff114\") " pod="openstack/memcached-0" Nov 22 03:08:13 crc kubenswrapper[4922]: I1122 03:08:13.207405 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c4673860-e095-4170-92b7-cbd2ffdff114-kolla-config\") pod \"memcached-0\" (UID: \"c4673860-e095-4170-92b7-cbd2ffdff114\") " pod="openstack/memcached-0" Nov 22 03:08:13 crc kubenswrapper[4922]: I1122 03:08:13.211070 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4673860-e095-4170-92b7-cbd2ffdff114-combined-ca-bundle\") pod \"memcached-0\" (UID: \"c4673860-e095-4170-92b7-cbd2ffdff114\") " pod="openstack/memcached-0" Nov 22 03:08:13 crc kubenswrapper[4922]: I1122 03:08:13.216974 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4673860-e095-4170-92b7-cbd2ffdff114-memcached-tls-certs\") pod \"memcached-0\" (UID: \"c4673860-e095-4170-92b7-cbd2ffdff114\") " pod="openstack/memcached-0" Nov 22 03:08:13 crc kubenswrapper[4922]: I1122 03:08:13.229830 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9sh2\" (UniqueName: \"kubernetes.io/projected/c4673860-e095-4170-92b7-cbd2ffdff114-kube-api-access-g9sh2\") pod \"memcached-0\" (UID: \"c4673860-e095-4170-92b7-cbd2ffdff114\") " pod="openstack/memcached-0" Nov 22 03:08:13 crc kubenswrapper[4922]: I1122 03:08:13.292067 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Nov 22 03:08:14 crc kubenswrapper[4922]: I1122 03:08:14.750603 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Nov 22 03:08:14 crc kubenswrapper[4922]: I1122 03:08:14.751739 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 22 03:08:14 crc kubenswrapper[4922]: I1122 03:08:14.753913 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-l5rpr" Nov 22 03:08:14 crc kubenswrapper[4922]: I1122 03:08:14.776806 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 22 03:08:14 crc kubenswrapper[4922]: I1122 03:08:14.933228 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwrr6\" (UniqueName: \"kubernetes.io/projected/1ee5b2bc-9aaf-4bb7-b5b2-f4f14c5e004b-kube-api-access-nwrr6\") pod \"kube-state-metrics-0\" (UID: \"1ee5b2bc-9aaf-4bb7-b5b2-f4f14c5e004b\") " pod="openstack/kube-state-metrics-0" Nov 22 03:08:15 crc kubenswrapper[4922]: I1122 03:08:15.035683 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwrr6\" (UniqueName: \"kubernetes.io/projected/1ee5b2bc-9aaf-4bb7-b5b2-f4f14c5e004b-kube-api-access-nwrr6\") pod \"kube-state-metrics-0\" (UID: \"1ee5b2bc-9aaf-4bb7-b5b2-f4f14c5e004b\") " pod="openstack/kube-state-metrics-0" Nov 22 03:08:15 crc kubenswrapper[4922]: I1122 03:08:15.057911 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwrr6\" (UniqueName: \"kubernetes.io/projected/1ee5b2bc-9aaf-4bb7-b5b2-f4f14c5e004b-kube-api-access-nwrr6\") pod \"kube-state-metrics-0\" (UID: \"1ee5b2bc-9aaf-4bb7-b5b2-f4f14c5e004b\") " pod="openstack/kube-state-metrics-0" Nov 22 03:08:15 crc kubenswrapper[4922]: I1122 03:08:15.074236 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 22 03:08:18 crc kubenswrapper[4922]: I1122 03:08:18.355739 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-nlzww"] Nov 22 03:08:18 crc kubenswrapper[4922]: I1122 03:08:18.357722 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nlzww" Nov 22 03:08:18 crc kubenswrapper[4922]: I1122 03:08:18.361810 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-r8ms6" Nov 22 03:08:18 crc kubenswrapper[4922]: I1122 03:08:18.362356 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Nov 22 03:08:18 crc kubenswrapper[4922]: I1122 03:08:18.362582 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Nov 22 03:08:18 crc kubenswrapper[4922]: I1122 03:08:18.368975 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-4cpkr"] Nov 22 03:08:18 crc kubenswrapper[4922]: I1122 03:08:18.372209 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-4cpkr" Nov 22 03:08:18 crc kubenswrapper[4922]: I1122 03:08:18.380916 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nlzww"] Nov 22 03:08:18 crc kubenswrapper[4922]: I1122 03:08:18.384912 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-4cpkr"] Nov 22 03:08:18 crc kubenswrapper[4922]: I1122 03:08:18.491320 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5c038335-42ee-4618-a14b-b32bc0f1d53a-var-run\") pod \"ovn-controller-nlzww\" (UID: \"5c038335-42ee-4618-a14b-b32bc0f1d53a\") " pod="openstack/ovn-controller-nlzww" Nov 22 03:08:18 crc kubenswrapper[4922]: I1122 03:08:18.491391 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcqgd\" (UniqueName: \"kubernetes.io/projected/3882abf6-0110-46fd-b498-de1d56838fc8-kube-api-access-qcqgd\") pod \"ovn-controller-ovs-4cpkr\" (UID: \"3882abf6-0110-46fd-b498-de1d56838fc8\") " pod="openstack/ovn-controller-ovs-4cpkr" Nov 22 03:08:18 crc kubenswrapper[4922]: I1122 03:08:18.491467 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3882abf6-0110-46fd-b498-de1d56838fc8-scripts\") pod \"ovn-controller-ovs-4cpkr\" (UID: \"3882abf6-0110-46fd-b498-de1d56838fc8\") " pod="openstack/ovn-controller-ovs-4cpkr" Nov 22 03:08:18 crc kubenswrapper[4922]: I1122 03:08:18.491668 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c038335-42ee-4618-a14b-b32bc0f1d53a-combined-ca-bundle\") pod \"ovn-controller-nlzww\" (UID: \"5c038335-42ee-4618-a14b-b32bc0f1d53a\") " pod="openstack/ovn-controller-nlzww" Nov 22 03:08:18 crc kubenswrapper[4922]: I1122 03:08:18.491751 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3882abf6-0110-46fd-b498-de1d56838fc8-var-run\") pod \"ovn-controller-ovs-4cpkr\" (UID: \"3882abf6-0110-46fd-b498-de1d56838fc8\") " pod="openstack/ovn-controller-ovs-4cpkr" Nov 22 03:08:18 crc kubenswrapper[4922]: I1122 03:08:18.491835 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5c038335-42ee-4618-a14b-b32bc0f1d53a-var-log-ovn\") pod \"ovn-controller-nlzww\" (UID: \"5c038335-42ee-4618-a14b-b32bc0f1d53a\") " pod="openstack/ovn-controller-nlzww" Nov 22 03:08:18 crc kubenswrapper[4922]: I1122 03:08:18.491908 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c038335-42ee-4618-a14b-b32bc0f1d53a-ovn-controller-tls-certs\") pod \"ovn-controller-nlzww\" (UID: \"5c038335-42ee-4618-a14b-b32bc0f1d53a\") " pod="openstack/ovn-controller-nlzww" Nov 22 03:08:18 crc kubenswrapper[4922]: I1122 03:08:18.491937 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/3882abf6-0110-46fd-b498-de1d56838fc8-etc-ovs\") pod \"ovn-controller-ovs-4cpkr\" (UID: \"3882abf6-0110-46fd-b498-de1d56838fc8\") " pod="openstack/ovn-controller-ovs-4cpkr" Nov 22 03:08:18 crc 
kubenswrapper[4922]: I1122 03:08:18.491960 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xltrf\" (UniqueName: \"kubernetes.io/projected/5c038335-42ee-4618-a14b-b32bc0f1d53a-kube-api-access-xltrf\") pod \"ovn-controller-nlzww\" (UID: \"5c038335-42ee-4618-a14b-b32bc0f1d53a\") " pod="openstack/ovn-controller-nlzww" Nov 22 03:08:18 crc kubenswrapper[4922]: I1122 03:08:18.491995 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5c038335-42ee-4618-a14b-b32bc0f1d53a-scripts\") pod \"ovn-controller-nlzww\" (UID: \"5c038335-42ee-4618-a14b-b32bc0f1d53a\") " pod="openstack/ovn-controller-nlzww" Nov 22 03:08:18 crc kubenswrapper[4922]: I1122 03:08:18.492126 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/3882abf6-0110-46fd-b498-de1d56838fc8-var-log\") pod \"ovn-controller-ovs-4cpkr\" (UID: \"3882abf6-0110-46fd-b498-de1d56838fc8\") " pod="openstack/ovn-controller-ovs-4cpkr" Nov 22 03:08:18 crc kubenswrapper[4922]: I1122 03:08:18.492167 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/3882abf6-0110-46fd-b498-de1d56838fc8-var-lib\") pod \"ovn-controller-ovs-4cpkr\" (UID: \"3882abf6-0110-46fd-b498-de1d56838fc8\") " pod="openstack/ovn-controller-ovs-4cpkr" Nov 22 03:08:18 crc kubenswrapper[4922]: I1122 03:08:18.492328 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5c038335-42ee-4618-a14b-b32bc0f1d53a-var-run-ovn\") pod \"ovn-controller-nlzww\" (UID: \"5c038335-42ee-4618-a14b-b32bc0f1d53a\") " pod="openstack/ovn-controller-nlzww" Nov 22 03:08:18 crc kubenswrapper[4922]: I1122 03:08:18.593941 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5c038335-42ee-4618-a14b-b32bc0f1d53a-var-run-ovn\") pod \"ovn-controller-nlzww\" (UID: \"5c038335-42ee-4618-a14b-b32bc0f1d53a\") " pod="openstack/ovn-controller-nlzww" Nov 22 03:08:18 crc kubenswrapper[4922]: I1122 03:08:18.594047 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5c038335-42ee-4618-a14b-b32bc0f1d53a-var-run\") pod \"ovn-controller-nlzww\" (UID: \"5c038335-42ee-4618-a14b-b32bc0f1d53a\") " pod="openstack/ovn-controller-nlzww" Nov 22 03:08:18 crc kubenswrapper[4922]: I1122 03:08:18.594082 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcqgd\" (UniqueName: \"kubernetes.io/projected/3882abf6-0110-46fd-b498-de1d56838fc8-kube-api-access-qcqgd\") pod \"ovn-controller-ovs-4cpkr\" (UID: \"3882abf6-0110-46fd-b498-de1d56838fc8\") " pod="openstack/ovn-controller-ovs-4cpkr" Nov 22 03:08:18 crc kubenswrapper[4922]: I1122 03:08:18.594110 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3882abf6-0110-46fd-b498-de1d56838fc8-scripts\") pod \"ovn-controller-ovs-4cpkr\" (UID: \"3882abf6-0110-46fd-b498-de1d56838fc8\") " pod="openstack/ovn-controller-ovs-4cpkr" Nov 22 03:08:18 crc kubenswrapper[4922]: I1122 03:08:18.594156 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c038335-42ee-4618-a14b-b32bc0f1d53a-combined-ca-bundle\") pod \"ovn-controller-nlzww\" (UID: \"5c038335-42ee-4618-a14b-b32bc0f1d53a\") " pod="openstack/ovn-controller-nlzww" Nov 22 03:08:18 crc kubenswrapper[4922]: I1122 03:08:18.594179 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3882abf6-0110-46fd-b498-de1d56838fc8-var-run\") pod \"ovn-controller-ovs-4cpkr\" (UID: \"3882abf6-0110-46fd-b498-de1d56838fc8\") " pod="openstack/ovn-controller-ovs-4cpkr" Nov 22 03:08:18 crc kubenswrapper[4922]: I1122 03:08:18.594212 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5c038335-42ee-4618-a14b-b32bc0f1d53a-var-log-ovn\") pod \"ovn-controller-nlzww\" (UID: \"5c038335-42ee-4618-a14b-b32bc0f1d53a\") " pod="openstack/ovn-controller-nlzww" Nov 22 03:08:18 crc kubenswrapper[4922]: I1122 03:08:18.594236 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c038335-42ee-4618-a14b-b32bc0f1d53a-ovn-controller-tls-certs\") pod \"ovn-controller-nlzww\" (UID: \"5c038335-42ee-4618-a14b-b32bc0f1d53a\") " pod="openstack/ovn-controller-nlzww" Nov 22 03:08:18 crc kubenswrapper[4922]: I1122 03:08:18.594261 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/3882abf6-0110-46fd-b498-de1d56838fc8-etc-ovs\") pod \"ovn-controller-ovs-4cpkr\" (UID: \"3882abf6-0110-46fd-b498-de1d56838fc8\") " pod="openstack/ovn-controller-ovs-4cpkr" Nov 22 03:08:18 crc kubenswrapper[4922]: I1122 03:08:18.594302 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xltrf\" (UniqueName: \"kubernetes.io/projected/5c038335-42ee-4618-a14b-b32bc0f1d53a-kube-api-access-xltrf\") pod \"ovn-controller-nlzww\" (UID: \"5c038335-42ee-4618-a14b-b32bc0f1d53a\") " pod="openstack/ovn-controller-nlzww" Nov 22 03:08:18 crc kubenswrapper[4922]: I1122 03:08:18.594330 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5c038335-42ee-4618-a14b-b32bc0f1d53a-scripts\") pod \"ovn-controller-nlzww\" (UID: \"5c038335-42ee-4618-a14b-b32bc0f1d53a\") " pod="openstack/ovn-controller-nlzww" Nov 22 03:08:18 crc kubenswrapper[4922]: I1122 03:08:18.594353 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/3882abf6-0110-46fd-b498-de1d56838fc8-var-log\") pod \"ovn-controller-ovs-4cpkr\" (UID: \"3882abf6-0110-46fd-b498-de1d56838fc8\") " pod="openstack/ovn-controller-ovs-4cpkr" Nov 22 03:08:18 crc kubenswrapper[4922]: I1122 03:08:18.594378 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/3882abf6-0110-46fd-b498-de1d56838fc8-var-lib\") pod \"ovn-controller-ovs-4cpkr\" (UID: \"3882abf6-0110-46fd-b498-de1d56838fc8\") " pod="openstack/ovn-controller-ovs-4cpkr" Nov 22 03:08:18 crc kubenswrapper[4922]: I1122 03:08:18.594812 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5c038335-42ee-4618-a14b-b32bc0f1d53a-var-run-ovn\") pod \"ovn-controller-nlzww\" (UID: \"5c038335-42ee-4618-a14b-b32bc0f1d53a\") " 
pod="openstack/ovn-controller-nlzww" Nov 22 03:08:18 crc kubenswrapper[4922]: I1122 03:08:18.594976 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/3882abf6-0110-46fd-b498-de1d56838fc8-var-lib\") pod \"ovn-controller-ovs-4cpkr\" (UID: \"3882abf6-0110-46fd-b498-de1d56838fc8\") " pod="openstack/ovn-controller-ovs-4cpkr" Nov 22 03:08:18 crc kubenswrapper[4922]: I1122 03:08:18.596707 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3882abf6-0110-46fd-b498-de1d56838fc8-var-run\") pod \"ovn-controller-ovs-4cpkr\" (UID: \"3882abf6-0110-46fd-b498-de1d56838fc8\") " pod="openstack/ovn-controller-ovs-4cpkr" Nov 22 03:08:18 crc kubenswrapper[4922]: I1122 03:08:18.596981 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5c038335-42ee-4618-a14b-b32bc0f1d53a-var-log-ovn\") pod \"ovn-controller-nlzww\" (UID: \"5c038335-42ee-4618-a14b-b32bc0f1d53a\") " pod="openstack/ovn-controller-nlzww" Nov 22 03:08:18 crc kubenswrapper[4922]: I1122 03:08:18.600741 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5c038335-42ee-4618-a14b-b32bc0f1d53a-var-run\") pod \"ovn-controller-nlzww\" (UID: \"5c038335-42ee-4618-a14b-b32bc0f1d53a\") " pod="openstack/ovn-controller-nlzww" Nov 22 03:08:18 crc kubenswrapper[4922]: I1122 03:08:18.600771 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/3882abf6-0110-46fd-b498-de1d56838fc8-etc-ovs\") pod \"ovn-controller-ovs-4cpkr\" (UID: \"3882abf6-0110-46fd-b498-de1d56838fc8\") " pod="openstack/ovn-controller-ovs-4cpkr" Nov 22 03:08:18 crc kubenswrapper[4922]: I1122 03:08:18.601063 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/3882abf6-0110-46fd-b498-de1d56838fc8-var-log\") pod \"ovn-controller-ovs-4cpkr\" (UID: \"3882abf6-0110-46fd-b498-de1d56838fc8\") " pod="openstack/ovn-controller-ovs-4cpkr" Nov 22 03:08:18 crc kubenswrapper[4922]: I1122 03:08:18.603081 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5c038335-42ee-4618-a14b-b32bc0f1d53a-scripts\") pod \"ovn-controller-nlzww\" (UID: \"5c038335-42ee-4618-a14b-b32bc0f1d53a\") " pod="openstack/ovn-controller-nlzww" Nov 22 03:08:18 crc kubenswrapper[4922]: I1122 03:08:18.605137 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c038335-42ee-4618-a14b-b32bc0f1d53a-ovn-controller-tls-certs\") pod \"ovn-controller-nlzww\" (UID: \"5c038335-42ee-4618-a14b-b32bc0f1d53a\") " pod="openstack/ovn-controller-nlzww" Nov 22 03:08:18 crc kubenswrapper[4922]: I1122 03:08:18.605573 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c038335-42ee-4618-a14b-b32bc0f1d53a-combined-ca-bundle\") pod \"ovn-controller-nlzww\" (UID: \"5c038335-42ee-4618-a14b-b32bc0f1d53a\") " pod="openstack/ovn-controller-nlzww" Nov 22 03:08:18 crc kubenswrapper[4922]: I1122 03:08:18.629388 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xltrf\" (UniqueName: \"kubernetes.io/projected/5c038335-42ee-4618-a14b-b32bc0f1d53a-kube-api-access-xltrf\") pod 
\"ovn-controller-nlzww\" (UID: \"5c038335-42ee-4618-a14b-b32bc0f1d53a\") " pod="openstack/ovn-controller-nlzww" Nov 22 03:08:18 crc kubenswrapper[4922]: I1122 03:08:18.630644 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcqgd\" (UniqueName: \"kubernetes.io/projected/3882abf6-0110-46fd-b498-de1d56838fc8-kube-api-access-qcqgd\") pod \"ovn-controller-ovs-4cpkr\" (UID: \"3882abf6-0110-46fd-b498-de1d56838fc8\") " pod="openstack/ovn-controller-ovs-4cpkr" Nov 22 03:08:18 crc kubenswrapper[4922]: I1122 03:08:18.683505 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3882abf6-0110-46fd-b498-de1d56838fc8-scripts\") pod \"ovn-controller-ovs-4cpkr\" (UID: \"3882abf6-0110-46fd-b498-de1d56838fc8\") " pod="openstack/ovn-controller-ovs-4cpkr" Nov 22 03:08:18 crc kubenswrapper[4922]: I1122 03:08:18.697430 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nlzww" Nov 22 03:08:18 crc kubenswrapper[4922]: I1122 03:08:18.711668 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-4cpkr" Nov 22 03:08:20 crc kubenswrapper[4922]: I1122 03:08:20.781812 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 22 03:08:20 crc kubenswrapper[4922]: I1122 03:08:20.794168 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Nov 22 03:08:20 crc kubenswrapper[4922]: I1122 03:08:20.819644 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Nov 22 03:08:20 crc kubenswrapper[4922]: I1122 03:08:20.820564 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Nov 22 03:08:20 crc kubenswrapper[4922]: I1122 03:08:20.821266 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Nov 22 03:08:20 crc kubenswrapper[4922]: I1122 03:08:20.821405 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Nov 22 03:08:20 crc kubenswrapper[4922]: I1122 03:08:20.821659 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-q4l8x" Nov 22 03:08:20 crc kubenswrapper[4922]: I1122 03:08:20.849090 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 22 03:08:20 crc kubenswrapper[4922]: I1122 03:08:20.939330 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cb9d221-16e8-421b-b044-a416405d01c1-config\") pod \"ovsdbserver-nb-0\" (UID: \"0cb9d221-16e8-421b-b044-a416405d01c1\") " pod="openstack/ovsdbserver-nb-0" Nov 22 03:08:20 crc kubenswrapper[4922]: I1122 03:08:20.939470 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cb9d221-16e8-421b-b044-a416405d01c1-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0cb9d221-16e8-421b-b044-a416405d01c1\") " pod="openstack/ovsdbserver-nb-0" Nov 22 03:08:20 crc kubenswrapper[4922]: I1122 03:08:20.939532 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/0cb9d221-16e8-421b-b044-a416405d01c1-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"0cb9d221-16e8-421b-b044-a416405d01c1\") " pod="openstack/ovsdbserver-nb-0" Nov 22 03:08:20 crc kubenswrapper[4922]: I1122 03:08:20.939557 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cb9d221-16e8-421b-b044-a416405d01c1-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"0cb9d221-16e8-421b-b044-a416405d01c1\") " pod="openstack/ovsdbserver-nb-0" Nov 22 03:08:20 crc kubenswrapper[4922]: I1122 03:08:20.939637 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"0cb9d221-16e8-421b-b044-a416405d01c1\") " pod="openstack/ovsdbserver-nb-0" Nov 22 03:08:20 crc kubenswrapper[4922]: I1122 03:08:20.939717 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cb9d221-16e8-421b-b044-a416405d01c1-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0cb9d221-16e8-421b-b044-a416405d01c1\") " pod="openstack/ovsdbserver-nb-0" Nov 22 03:08:20 crc kubenswrapper[4922]: I1122 03:08:20.939789 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0cb9d221-16e8-421b-b044-a416405d01c1-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"0cb9d221-16e8-421b-b044-a416405d01c1\") " pod="openstack/ovsdbserver-nb-0" Nov 22 03:08:20 crc kubenswrapper[4922]: I1122 03:08:20.939942 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bk58n\" (UniqueName: \"kubernetes.io/projected/0cb9d221-16e8-421b-b044-a416405d01c1-kube-api-access-bk58n\") pod \"ovsdbserver-nb-0\" (UID: \"0cb9d221-16e8-421b-b044-a416405d01c1\") " pod="openstack/ovsdbserver-nb-0" Nov 22 03:08:21 crc kubenswrapper[4922]: I1122 03:08:21.043902 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0cb9d221-16e8-421b-b044-a416405d01c1-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"0cb9d221-16e8-421b-b044-a416405d01c1\") " pod="openstack/ovsdbserver-nb-0" Nov 22 03:08:21 crc kubenswrapper[4922]: I1122 03:08:21.044080 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bk58n\" (UniqueName: \"kubernetes.io/projected/0cb9d221-16e8-421b-b044-a416405d01c1-kube-api-access-bk58n\") pod \"ovsdbserver-nb-0\" (UID: \"0cb9d221-16e8-421b-b044-a416405d01c1\") " pod="openstack/ovsdbserver-nb-0" Nov 22 03:08:21 crc kubenswrapper[4922]: I1122 03:08:21.044126 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cb9d221-16e8-421b-b044-a416405d01c1-config\") pod \"ovsdbserver-nb-0\" (UID: \"0cb9d221-16e8-421b-b044-a416405d01c1\") " pod="openstack/ovsdbserver-nb-0" Nov 22 03:08:21 crc kubenswrapper[4922]: I1122 03:08:21.044268 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cb9d221-16e8-421b-b044-a416405d01c1-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0cb9d221-16e8-421b-b044-a416405d01c1\") " 
pod="openstack/ovsdbserver-nb-0" Nov 22 03:08:21 crc kubenswrapper[4922]: I1122 03:08:21.044301 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0cb9d221-16e8-421b-b044-a416405d01c1-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"0cb9d221-16e8-421b-b044-a416405d01c1\") " pod="openstack/ovsdbserver-nb-0" Nov 22 03:08:21 crc kubenswrapper[4922]: I1122 03:08:21.044322 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cb9d221-16e8-421b-b044-a416405d01c1-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"0cb9d221-16e8-421b-b044-a416405d01c1\") " pod="openstack/ovsdbserver-nb-0" Nov 22 03:08:21 crc kubenswrapper[4922]: I1122 03:08:21.044385 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"0cb9d221-16e8-421b-b044-a416405d01c1\") " pod="openstack/ovsdbserver-nb-0" Nov 22 03:08:21 crc kubenswrapper[4922]: I1122 03:08:21.044447 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cb9d221-16e8-421b-b044-a416405d01c1-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0cb9d221-16e8-421b-b044-a416405d01c1\") " pod="openstack/ovsdbserver-nb-0" Nov 22 03:08:21 crc kubenswrapper[4922]: I1122 03:08:21.044438 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0cb9d221-16e8-421b-b044-a416405d01c1-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"0cb9d221-16e8-421b-b044-a416405d01c1\") " pod="openstack/ovsdbserver-nb-0" Nov 22 03:08:21 crc kubenswrapper[4922]: I1122 03:08:21.045481 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0cb9d221-16e8-421b-b044-a416405d01c1-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"0cb9d221-16e8-421b-b044-a416405d01c1\") " pod="openstack/ovsdbserver-nb-0" Nov 22 03:08:21 crc kubenswrapper[4922]: I1122 03:08:21.045602 4922 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"0cb9d221-16e8-421b-b044-a416405d01c1\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/ovsdbserver-nb-0" Nov 22 03:08:21 crc kubenswrapper[4922]: I1122 03:08:21.045772 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cb9d221-16e8-421b-b044-a416405d01c1-config\") pod \"ovsdbserver-nb-0\" (UID: \"0cb9d221-16e8-421b-b044-a416405d01c1\") " pod="openstack/ovsdbserver-nb-0" Nov 22 03:08:21 crc kubenswrapper[4922]: I1122 03:08:21.053016 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cb9d221-16e8-421b-b044-a416405d01c1-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"0cb9d221-16e8-421b-b044-a416405d01c1\") " pod="openstack/ovsdbserver-nb-0" Nov 22 03:08:21 crc kubenswrapper[4922]: I1122 03:08:21.053066 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cb9d221-16e8-421b-b044-a416405d01c1-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: 
\"0cb9d221-16e8-421b-b044-a416405d01c1\") " pod="openstack/ovsdbserver-nb-0" Nov 22 03:08:21 crc kubenswrapper[4922]: I1122 03:08:21.059557 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cb9d221-16e8-421b-b044-a416405d01c1-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0cb9d221-16e8-421b-b044-a416405d01c1\") " pod="openstack/ovsdbserver-nb-0" Nov 22 03:08:21 crc kubenswrapper[4922]: I1122 03:08:21.063353 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bk58n\" (UniqueName: \"kubernetes.io/projected/0cb9d221-16e8-421b-b044-a416405d01c1-kube-api-access-bk58n\") pod \"ovsdbserver-nb-0\" (UID: \"0cb9d221-16e8-421b-b044-a416405d01c1\") " pod="openstack/ovsdbserver-nb-0" Nov 22 03:08:21 crc kubenswrapper[4922]: I1122 03:08:21.069796 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"0cb9d221-16e8-421b-b044-a416405d01c1\") " pod="openstack/ovsdbserver-nb-0" Nov 22 03:08:21 crc kubenswrapper[4922]: I1122 03:08:21.158374 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Nov 22 03:08:22 crc kubenswrapper[4922]: I1122 03:08:22.279559 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 22 03:08:22 crc kubenswrapper[4922]: I1122 03:08:22.450358 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 22 03:08:22 crc kubenswrapper[4922]: I1122 03:08:22.460013 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Nov 22 03:08:22 crc kubenswrapper[4922]: I1122 03:08:22.460522 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 22 03:08:22 crc kubenswrapper[4922]: I1122 03:08:22.462406 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Nov 22 03:08:22 crc kubenswrapper[4922]: I1122 03:08:22.462648 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-ds8pd" Nov 22 03:08:22 crc kubenswrapper[4922]: I1122 03:08:22.462994 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Nov 22 03:08:22 crc kubenswrapper[4922]: I1122 03:08:22.463130 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Nov 22 03:08:22 crc kubenswrapper[4922]: I1122 03:08:22.567642 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/67a38fd3-5643-4795-8a88-f21d3ff7b43a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"67a38fd3-5643-4795-8a88-f21d3ff7b43a\") " pod="openstack/ovsdbserver-sb-0" Nov 22 03:08:22 crc kubenswrapper[4922]: I1122 03:08:22.567693 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67a38fd3-5643-4795-8a88-f21d3ff7b43a-config\") pod \"ovsdbserver-sb-0\" (UID: \"67a38fd3-5643-4795-8a88-f21d3ff7b43a\") " pod="openstack/ovsdbserver-sb-0" Nov 22 03:08:22 crc kubenswrapper[4922]: I1122 03:08:22.567725 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/67a38fd3-5643-4795-8a88-f21d3ff7b43a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"67a38fd3-5643-4795-8a88-f21d3ff7b43a\") " pod="openstack/ovsdbserver-sb-0" Nov 22 03:08:22 crc kubenswrapper[4922]: I1122 03:08:22.567885 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/67a38fd3-5643-4795-8a88-f21d3ff7b43a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"67a38fd3-5643-4795-8a88-f21d3ff7b43a\") " pod="openstack/ovsdbserver-sb-0" Nov 22 03:08:22 crc kubenswrapper[4922]: I1122 03:08:22.567968 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4ldh\" (UniqueName: \"kubernetes.io/projected/67a38fd3-5643-4795-8a88-f21d3ff7b43a-kube-api-access-d4ldh\") pod \"ovsdbserver-sb-0\" (UID: \"67a38fd3-5643-4795-8a88-f21d3ff7b43a\") " pod="openstack/ovsdbserver-sb-0" Nov 22 03:08:22 crc kubenswrapper[4922]: I1122 03:08:22.568041 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"67a38fd3-5643-4795-8a88-f21d3ff7b43a\") " pod="openstack/ovsdbserver-sb-0" Nov 22 03:08:22 crc kubenswrapper[4922]: I1122 03:08:22.568182 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/67a38fd3-5643-4795-8a88-f21d3ff7b43a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"67a38fd3-5643-4795-8a88-f21d3ff7b43a\") " pod="openstack/ovsdbserver-sb-0" Nov 22 03:08:22 crc kubenswrapper[4922]: I1122 03:08:22.568238 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67a38fd3-5643-4795-8a88-f21d3ff7b43a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"67a38fd3-5643-4795-8a88-f21d3ff7b43a\") " pod="openstack/ovsdbserver-sb-0" Nov 22 03:08:22 crc kubenswrapper[4922]: I1122 03:08:22.669566 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/67a38fd3-5643-4795-8a88-f21d3ff7b43a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"67a38fd3-5643-4795-8a88-f21d3ff7b43a\") " pod="openstack/ovsdbserver-sb-0" Nov 22 03:08:22 crc kubenswrapper[4922]: I1122 03:08:22.669884 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/67a38fd3-5643-4795-8a88-f21d3ff7b43a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"67a38fd3-5643-4795-8a88-f21d3ff7b43a\") " pod="openstack/ovsdbserver-sb-0" Nov 22 03:08:22 crc kubenswrapper[4922]: I1122 03:08:22.670011 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4ldh\" (UniqueName: \"kubernetes.io/projected/67a38fd3-5643-4795-8a88-f21d3ff7b43a-kube-api-access-d4ldh\") pod \"ovsdbserver-sb-0\" (UID: \"67a38fd3-5643-4795-8a88-f21d3ff7b43a\") " pod="openstack/ovsdbserver-sb-0" Nov 22 03:08:22 crc kubenswrapper[4922]: I1122 03:08:22.670097 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod 
\"ovsdbserver-sb-0\" (UID: \"67a38fd3-5643-4795-8a88-f21d3ff7b43a\") " pod="openstack/ovsdbserver-sb-0" Nov 22 03:08:22 crc kubenswrapper[4922]: I1122 03:08:22.670195 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/67a38fd3-5643-4795-8a88-f21d3ff7b43a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"67a38fd3-5643-4795-8a88-f21d3ff7b43a\") " pod="openstack/ovsdbserver-sb-0" Nov 22 03:08:22 crc kubenswrapper[4922]: I1122 03:08:22.670281 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67a38fd3-5643-4795-8a88-f21d3ff7b43a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"67a38fd3-5643-4795-8a88-f21d3ff7b43a\") " pod="openstack/ovsdbserver-sb-0" Nov 22 03:08:22 crc kubenswrapper[4922]: I1122 03:08:22.670382 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/67a38fd3-5643-4795-8a88-f21d3ff7b43a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"67a38fd3-5643-4795-8a88-f21d3ff7b43a\") " pod="openstack/ovsdbserver-sb-0" Nov 22 03:08:22 crc kubenswrapper[4922]: I1122 03:08:22.670472 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67a38fd3-5643-4795-8a88-f21d3ff7b43a-config\") pod \"ovsdbserver-sb-0\" (UID: \"67a38fd3-5643-4795-8a88-f21d3ff7b43a\") " pod="openstack/ovsdbserver-sb-0" Nov 22 03:08:22 crc kubenswrapper[4922]: I1122 03:08:22.670528 4922 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"67a38fd3-5643-4795-8a88-f21d3ff7b43a\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-sb-0" Nov 22 03:08:22 crc kubenswrapper[4922]: I1122 03:08:22.671166 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/67a38fd3-5643-4795-8a88-f21d3ff7b43a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"67a38fd3-5643-4795-8a88-f21d3ff7b43a\") " pod="openstack/ovsdbserver-sb-0" Nov 22 03:08:22 crc kubenswrapper[4922]: I1122 03:08:22.671662 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67a38fd3-5643-4795-8a88-f21d3ff7b43a-config\") pod \"ovsdbserver-sb-0\" (UID: \"67a38fd3-5643-4795-8a88-f21d3ff7b43a\") " pod="openstack/ovsdbserver-sb-0" Nov 22 03:08:22 crc kubenswrapper[4922]: I1122 03:08:22.671878 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/67a38fd3-5643-4795-8a88-f21d3ff7b43a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"67a38fd3-5643-4795-8a88-f21d3ff7b43a\") " pod="openstack/ovsdbserver-sb-0" Nov 22 03:08:22 crc kubenswrapper[4922]: I1122 03:08:22.678431 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/67a38fd3-5643-4795-8a88-f21d3ff7b43a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"67a38fd3-5643-4795-8a88-f21d3ff7b43a\") " pod="openstack/ovsdbserver-sb-0" Nov 22 03:08:22 crc kubenswrapper[4922]: I1122 03:08:22.679512 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/67a38fd3-5643-4795-8a88-f21d3ff7b43a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"67a38fd3-5643-4795-8a88-f21d3ff7b43a\") " pod="openstack/ovsdbserver-sb-0" Nov 22 03:08:22 crc kubenswrapper[4922]: I1122 03:08:22.685386 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67a38fd3-5643-4795-8a88-f21d3ff7b43a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"67a38fd3-5643-4795-8a88-f21d3ff7b43a\") " pod="openstack/ovsdbserver-sb-0" Nov 22 03:08:22 crc kubenswrapper[4922]: I1122 03:08:22.690062 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4ldh\" (UniqueName: \"kubernetes.io/projected/67a38fd3-5643-4795-8a88-f21d3ff7b43a-kube-api-access-d4ldh\") pod \"ovsdbserver-sb-0\" (UID: \"67a38fd3-5643-4795-8a88-f21d3ff7b43a\") " pod="openstack/ovsdbserver-sb-0" Nov 22 03:08:22 crc kubenswrapper[4922]: I1122 03:08:22.714577 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"67a38fd3-5643-4795-8a88-f21d3ff7b43a\") " pod="openstack/ovsdbserver-sb-0" Nov 22 03:08:22 crc kubenswrapper[4922]: I1122 03:08:22.784709 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Nov 22 03:08:23 crc kubenswrapper[4922]: E1122 03:08:23.105822 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Nov 22 03:08:23 crc kubenswrapper[4922]: E1122 03:08:23.106025 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rffzz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-j2gpf_openstack(baf1fe2f-594d-4879-8fa5-3eb4450a8ff0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 22 03:08:23 crc kubenswrapper[4922]: E1122 03:08:23.107317 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-j2gpf" podUID="baf1fe2f-594d-4879-8fa5-3eb4450a8ff0" Nov 22 03:08:23 crc kubenswrapper[4922]: I1122 03:08:23.534149 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 22 03:08:23 crc kubenswrapper[4922]: W1122 03:08:23.581547 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ee5b2bc_9aaf_4bb7_b5b2_f4f14c5e004b.slice/crio-2b88195d9184ae974eeed55c465caf1f27c2b46473d626a13a5dbd1c1b28e66d WatchSource:0}: Error finding container 2b88195d9184ae974eeed55c465caf1f27c2b46473d626a13a5dbd1c1b28e66d: Status 404 returned error can't find the container with id 2b88195d9184ae974eeed55c465caf1f27c2b46473d626a13a5dbd1c1b28e66d Nov 22 03:08:23 crc kubenswrapper[4922]: I1122 03:08:23.792712 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1ee5b2bc-9aaf-4bb7-b5b2-f4f14c5e004b","Type":"ContainerStarted","Data":"2b88195d9184ae974eeed55c465caf1f27c2b46473d626a13a5dbd1c1b28e66d"} Nov 22 03:08:23 crc kubenswrapper[4922]: I1122 03:08:23.800054 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"161bd1ea-2276-4a95-b0ad-304cc807d13f","Type":"ContainerStarted","Data":"921d46bffde48af9b5e317203caf31711b2506c93ad522372d7afd13c5369d38"} 
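[Editor's note] For readability, the init-container spec that kubelet dumped in the "Unhandled Error" entry above (pod dnsmasq-dns-78dd6ddcc-j2gpf) is reconstructed below as a short Go sketch against the upstream k8s.io/api/core/v1 types, instead of kubelet's one-line struct dump. Field values are copied from the log entry; the full CONFIG_HASH string is elided, and the ptr helper is mine, not part of the log.

// A readable reconstruction of the init container from the ErrImagePull
// event above. Illustrative sketch only; values copied from the log dump.
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

// ptr is a small helper for the pointer-typed fields in the spec.
func ptr[T any](v T) *T { return &v }

func main() {
	initContainer := corev1.Container{
		Name:    "init",
		Image:   "quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified",
		Command: []string{"/bin/bash"},
		Args: []string{"-c", "dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d" +
			" --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug" +
			" --bind-interfaces --listen-address=$(POD_IP) --port 5353" +
			" --log-facility=- --no-hosts --domain-needed --no-resolv" +
			" --bogus-priv --log-queries --test"},
		Env: []corev1.EnvVar{
			// Full hash elided here; see the log entry for the complete value.
			{Name: "CONFIG_HASH", Value: "ndfhb5h667h..."},
			{Name: "POD_IP", ValueFrom: &corev1.EnvVarSource{
				FieldRef: &corev1.ObjectFieldSelector{APIVersion: "v1", FieldPath: "status.podIP"},
			}},
		},
		VolumeMounts: []corev1.VolumeMount{
			{Name: "config", ReadOnly: true, MountPath: "/etc/dnsmasq.d/config.cfg", SubPath: "dns"},
			{Name: "dns-svc", ReadOnly: true, MountPath: "/etc/dnsmasq.d/hosts/dns-svc", SubPath: "dns-svc"},
			{Name: "kube-api-access-rffzz", ReadOnly: true, MountPath: "/var/run/secrets/kubernetes.io/serviceaccount"},
		},
		TerminationMessagePath:   "/dev/termination-log",
		TerminationMessagePolicy: corev1.TerminationMessageReadFile,
		ImagePullPolicy:          corev1.PullIfNotPresent,
		SecurityContext: &corev1.SecurityContext{
			Capabilities:             &corev1.Capabilities{Drop: []corev1.Capability{"ALL"}},
			RunAsUser:                ptr(int64(1000650000)),
			RunAsNonRoot:             ptr(true),
			AllowPrivilegeEscalation: ptr(false),
			SeccompProfile:           &corev1.SeccompProfile{Type: corev1.SeccompProfileTypeRuntimeDefault},
		},
	}
	fmt.Printf("%+v\n", initContainer)
}

Since dnsmasq's --test flag only validates the configuration and exits, this init container is a config self-check. In this log it never ran: the image copy was canceled (ErrImagePull at 03:08:23), and the UnmountVolume.TearDown entries for pod UID baf1fe2f-594d-4879-8fa5-3eb4450a8ff0 just after 03:08:24 below show the pod's volumes being torn down.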
Nov 22 03:08:23 crc kubenswrapper[4922]: I1122 03:08:23.899548 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Nov 22 03:08:23 crc kubenswrapper[4922]: W1122 03:08:23.906970 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod674794c8_5fae_461e_91a1_f3f44a088e55.slice/crio-bd95ccf81c4b0e1cf10beddd96271c1a024bd6176bcc0a797c15e2fbdb29ecaf WatchSource:0}: Error finding container bd95ccf81c4b0e1cf10beddd96271c1a024bd6176bcc0a797c15e2fbdb29ecaf: Status 404 returned error can't find the container with id bd95ccf81c4b0e1cf10beddd96271c1a024bd6176bcc0a797c15e2fbdb29ecaf Nov 22 03:08:24 crc kubenswrapper[4922]: I1122 03:08:24.256378 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-j2gpf" Nov 22 03:08:24 crc kubenswrapper[4922]: I1122 03:08:24.275743 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-94wtw"] Nov 22 03:08:24 crc kubenswrapper[4922]: I1122 03:08:24.284047 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 22 03:08:24 crc kubenswrapper[4922]: I1122 03:08:24.305217 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 22 03:08:24 crc kubenswrapper[4922]: I1122 03:08:24.315991 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-wg4v5"] Nov 22 03:08:24 crc kubenswrapper[4922]: I1122 03:08:24.319716 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Nov 22 03:08:24 crc kubenswrapper[4922]: I1122 03:08:24.339031 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nlzww"] Nov 22 03:08:24 crc kubenswrapper[4922]: W1122 03:08:24.350352 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4673860_e095_4170_92b7_cbd2ffdff114.slice/crio-32c5647a4a73f9db9c268e659db17277bc2b99be4841749428c0e884e3725e34 WatchSource:0}: Error finding container 32c5647a4a73f9db9c268e659db17277bc2b99be4841749428c0e884e3725e34: Status 404 returned error can't find the container with id 32c5647a4a73f9db9c268e659db17277bc2b99be4841749428c0e884e3725e34 Nov 22 03:08:24 crc kubenswrapper[4922]: I1122 03:08:24.410628 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/baf1fe2f-594d-4879-8fa5-3eb4450a8ff0-config\") pod \"baf1fe2f-594d-4879-8fa5-3eb4450a8ff0\" (UID: \"baf1fe2f-594d-4879-8fa5-3eb4450a8ff0\") " Nov 22 03:08:24 crc kubenswrapper[4922]: I1122 03:08:24.410749 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/baf1fe2f-594d-4879-8fa5-3eb4450a8ff0-dns-svc\") pod \"baf1fe2f-594d-4879-8fa5-3eb4450a8ff0\" (UID: \"baf1fe2f-594d-4879-8fa5-3eb4450a8ff0\") " Nov 22 03:08:24 crc kubenswrapper[4922]: I1122 03:08:24.410818 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rffzz\" (UniqueName: \"kubernetes.io/projected/baf1fe2f-594d-4879-8fa5-3eb4450a8ff0-kube-api-access-rffzz\") pod \"baf1fe2f-594d-4879-8fa5-3eb4450a8ff0\" (UID: \"baf1fe2f-594d-4879-8fa5-3eb4450a8ff0\") " Nov 22 03:08:24 crc kubenswrapper[4922]: I1122 03:08:24.411227 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ovsdbserver-nb-0"] Nov 22 03:08:24 crc kubenswrapper[4922]: I1122 03:08:24.411410 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/baf1fe2f-594d-4879-8fa5-3eb4450a8ff0-config" (OuterVolumeSpecName: "config") pod "baf1fe2f-594d-4879-8fa5-3eb4450a8ff0" (UID: "baf1fe2f-594d-4879-8fa5-3eb4450a8ff0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:08:24 crc kubenswrapper[4922]: I1122 03:08:24.411427 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/baf1fe2f-594d-4879-8fa5-3eb4450a8ff0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "baf1fe2f-594d-4879-8fa5-3eb4450a8ff0" (UID: "baf1fe2f-594d-4879-8fa5-3eb4450a8ff0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:08:24 crc kubenswrapper[4922]: I1122 03:08:24.418013 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/baf1fe2f-594d-4879-8fa5-3eb4450a8ff0-kube-api-access-rffzz" (OuterVolumeSpecName: "kube-api-access-rffzz") pod "baf1fe2f-594d-4879-8fa5-3eb4450a8ff0" (UID: "baf1fe2f-594d-4879-8fa5-3eb4450a8ff0"). InnerVolumeSpecName "kube-api-access-rffzz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:08:24 crc kubenswrapper[4922]: I1122 03:08:24.513780 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rffzz\" (UniqueName: \"kubernetes.io/projected/baf1fe2f-594d-4879-8fa5-3eb4450a8ff0-kube-api-access-rffzz\") on node \"crc\" DevicePath \"\"" Nov 22 03:08:24 crc kubenswrapper[4922]: I1122 03:08:24.513819 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/baf1fe2f-594d-4879-8fa5-3eb4450a8ff0-config\") on node \"crc\" DevicePath \"\"" Nov 22 03:08:24 crc kubenswrapper[4922]: I1122 03:08:24.513831 4922 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/baf1fe2f-594d-4879-8fa5-3eb4450a8ff0-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 03:08:24 crc kubenswrapper[4922]: I1122 03:08:24.520236 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-4cpkr"] Nov 22 03:08:24 crc kubenswrapper[4922]: W1122 03:08:24.524162 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3882abf6_0110_46fd_b498_de1d56838fc8.slice/crio-3ae9cb73eea222047897e9ab0bc54197ad151cf4cfd71b1838a7daca0eb093da WatchSource:0}: Error finding container 3ae9cb73eea222047897e9ab0bc54197ad151cf4cfd71b1838a7daca0eb093da: Status 404 returned error can't find the container with id 3ae9cb73eea222047897e9ab0bc54197ad151cf4cfd71b1838a7daca0eb093da Nov 22 03:08:24 crc kubenswrapper[4922]: I1122 03:08:24.809626 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"11bc873c-bbc5-4033-ad7a-9569c2b6aa76","Type":"ContainerStarted","Data":"a878f53d0730a22e8676b2e0cecbd3e22ff2485d0786027ab83da96b9d2a037b"} Nov 22 03:08:24 crc kubenswrapper[4922]: I1122 03:08:24.811630 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"674794c8-5fae-461e-91a1-f3f44a088e55","Type":"ContainerStarted","Data":"bd95ccf81c4b0e1cf10beddd96271c1a024bd6176bcc0a797c15e2fbdb29ecaf"} Nov 22 03:08:24 crc kubenswrapper[4922]: I1122 03:08:24.813481 4922 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/memcached-0" event={"ID":"c4673860-e095-4170-92b7-cbd2ffdff114","Type":"ContainerStarted","Data":"32c5647a4a73f9db9c268e659db17277bc2b99be4841749428c0e884e3725e34"} Nov 22 03:08:24 crc kubenswrapper[4922]: I1122 03:08:24.815026 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-j2gpf" event={"ID":"baf1fe2f-594d-4879-8fa5-3eb4450a8ff0","Type":"ContainerDied","Data":"45da9576e15a35b66e0dcaed880d8f5e9abe3ff72f754ffcc5edc68029d405c6"} Nov 22 03:08:24 crc kubenswrapper[4922]: I1122 03:08:24.815181 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-j2gpf" Nov 22 03:08:24 crc kubenswrapper[4922]: I1122 03:08:24.816977 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4cpkr" event={"ID":"3882abf6-0110-46fd-b498-de1d56838fc8","Type":"ContainerStarted","Data":"3ae9cb73eea222047897e9ab0bc54197ad151cf4cfd71b1838a7daca0eb093da"} Nov 22 03:08:24 crc kubenswrapper[4922]: I1122 03:08:24.818789 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-wg4v5" event={"ID":"e642b573-6e0b-4897-b1c8-1244683b0b73","Type":"ContainerStarted","Data":"992c93ba37b37f1496910a749de78af1070c12477072152eea6c606617cd5948"} Nov 22 03:08:24 crc kubenswrapper[4922]: I1122 03:08:24.820332 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"0cb9d221-16e8-421b-b044-a416405d01c1","Type":"ContainerStarted","Data":"5606426ab7536eb01a44b3c99ef27c7b34ccf45bba6b177848d389e81a0ae5e3"} Nov 22 03:08:24 crc kubenswrapper[4922]: I1122 03:08:24.822101 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"60c32179-6b0e-4e8b-a101-81ca49be2034","Type":"ContainerStarted","Data":"bc57500b839cc916ad0ba4a196e879d8aa415de9e194a29b2b7328d4d8429cfd"} Nov 22 03:08:24 crc kubenswrapper[4922]: I1122 03:08:24.823353 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nlzww" event={"ID":"5c038335-42ee-4618-a14b-b32bc0f1d53a","Type":"ContainerStarted","Data":"19c116bafa7dd6b1c80678a3a96af2022c0e061de4432f61d98a13f176caf14c"} Nov 22 03:08:24 crc kubenswrapper[4922]: I1122 03:08:24.824423 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-94wtw" event={"ID":"cef5a6d7-f3cf-459b-9e5a-a6a328d0e020","Type":"ContainerStarted","Data":"acd9339a81d7efe4b1f7892fb6737bf7d2e9e58bf7e14d5184f6c4099e085f7e"} Nov 22 03:08:24 crc kubenswrapper[4922]: I1122 03:08:24.895036 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-j2gpf"] Nov 22 03:08:24 crc kubenswrapper[4922]: I1122 03:08:24.906255 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-j2gpf"] Nov 22 03:08:25 crc kubenswrapper[4922]: I1122 03:08:25.069518 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 22 03:08:25 crc kubenswrapper[4922]: E1122 03:08:25.132345 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Nov 22 03:08:25 crc kubenswrapper[4922]: E1122 03:08:25.132909 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-85q8v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-krhjv_openstack(332aa71d-bbad-4127-982b-6416679db2a5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 22 03:08:25 crc kubenswrapper[4922]: E1122 03:08:25.134323 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-krhjv" podUID="332aa71d-bbad-4127-982b-6416679db2a5" Nov 22 03:08:25 crc kubenswrapper[4922]: I1122 03:08:25.321876 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="baf1fe2f-594d-4879-8fa5-3eb4450a8ff0" path="/var/lib/kubelet/pods/baf1fe2f-594d-4879-8fa5-3eb4450a8ff0/volumes" Nov 22 03:08:25 crc kubenswrapper[4922]: I1122 03:08:25.852376 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" event={"ID":"402683b1-a29f-4a79-a36c-daf6e8068d0d","Type":"ContainerStarted","Data":"a719d1a6509057cf085f1c2768ec28cb47fdbdd817caffc2f3d5d452e6b5e16a"} Nov 22 03:08:25 crc kubenswrapper[4922]: I1122 03:08:25.855977 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"67a38fd3-5643-4795-8a88-f21d3ff7b43a","Type":"ContainerStarted","Data":"d35de7b6d4d69c101443054aeca43312cdb0347f2142bd862bd66cc9a6a20e59"} Nov 22 03:08:26 crc kubenswrapper[4922]: I1122 03:08:26.130967 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-krhjv" Nov 22 03:08:26 crc kubenswrapper[4922]: I1122 03:08:26.252067 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85q8v\" (UniqueName: \"kubernetes.io/projected/332aa71d-bbad-4127-982b-6416679db2a5-kube-api-access-85q8v\") pod \"332aa71d-bbad-4127-982b-6416679db2a5\" (UID: \"332aa71d-bbad-4127-982b-6416679db2a5\") " Nov 22 03:08:26 crc kubenswrapper[4922]: I1122 03:08:26.252292 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/332aa71d-bbad-4127-982b-6416679db2a5-config\") pod \"332aa71d-bbad-4127-982b-6416679db2a5\" (UID: \"332aa71d-bbad-4127-982b-6416679db2a5\") " Nov 22 03:08:26 crc kubenswrapper[4922]: I1122 03:08:26.252728 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/332aa71d-bbad-4127-982b-6416679db2a5-config" (OuterVolumeSpecName: "config") pod "332aa71d-bbad-4127-982b-6416679db2a5" (UID: "332aa71d-bbad-4127-982b-6416679db2a5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:08:26 crc kubenswrapper[4922]: I1122 03:08:26.253777 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/332aa71d-bbad-4127-982b-6416679db2a5-config\") on node \"crc\" DevicePath \"\"" Nov 22 03:08:26 crc kubenswrapper[4922]: I1122 03:08:26.256324 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/332aa71d-bbad-4127-982b-6416679db2a5-kube-api-access-85q8v" (OuterVolumeSpecName: "kube-api-access-85q8v") pod "332aa71d-bbad-4127-982b-6416679db2a5" (UID: "332aa71d-bbad-4127-982b-6416679db2a5"). InnerVolumeSpecName "kube-api-access-85q8v". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:08:26 crc kubenswrapper[4922]: I1122 03:08:26.355068 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85q8v\" (UniqueName: \"kubernetes.io/projected/332aa71d-bbad-4127-982b-6416679db2a5-kube-api-access-85q8v\") on node \"crc\" DevicePath \"\"" Nov 22 03:08:26 crc kubenswrapper[4922]: I1122 03:08:26.865551 4922 generic.go:334] "Generic (PLEG): container finished" podID="e642b573-6e0b-4897-b1c8-1244683b0b73" containerID="c7c62355fa846c7211fd3ffcf0b4662ea6c663553afecf9966eaf96b696bcac4" exitCode=0 Nov 22 03:08:26 crc kubenswrapper[4922]: I1122 03:08:26.865654 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-wg4v5" event={"ID":"e642b573-6e0b-4897-b1c8-1244683b0b73","Type":"ContainerDied","Data":"c7c62355fa846c7211fd3ffcf0b4662ea6c663553afecf9966eaf96b696bcac4"} Nov 22 03:08:26 crc kubenswrapper[4922]: I1122 03:08:26.868277 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-krhjv" event={"ID":"332aa71d-bbad-4127-982b-6416679db2a5","Type":"ContainerDied","Data":"baaa458a6be66ca45e0b9784abe521436c9ad90926b1adba75ea5210aa154e5e"} Nov 22 03:08:26 crc kubenswrapper[4922]: I1122 03:08:26.868314 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-krhjv" Nov 22 03:08:26 crc kubenswrapper[4922]: I1122 03:08:26.869789 4922 generic.go:334] "Generic (PLEG): container finished" podID="cef5a6d7-f3cf-459b-9e5a-a6a328d0e020" containerID="3118a65327d78da8bd65c0dcc72e91853521a329e858453fdf9f1daf4b7523af" exitCode=0 Nov 22 03:08:26 crc kubenswrapper[4922]: I1122 03:08:26.869973 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-94wtw" event={"ID":"cef5a6d7-f3cf-459b-9e5a-a6a328d0e020","Type":"ContainerDied","Data":"3118a65327d78da8bd65c0dcc72e91853521a329e858453fdf9f1daf4b7523af"} Nov 22 03:08:26 crc kubenswrapper[4922]: I1122 03:08:26.970745 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-krhjv"] Nov 22 03:08:26 crc kubenswrapper[4922]: I1122 03:08:26.980313 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-krhjv"] Nov 22 03:08:27 crc kubenswrapper[4922]: I1122 03:08:27.312090 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="332aa71d-bbad-4127-982b-6416679db2a5" path="/var/lib/kubelet/pods/332aa71d-bbad-4127-982b-6416679db2a5/volumes" Nov 22 03:08:34 crc kubenswrapper[4922]: I1122 03:08:34.948280 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-wg4v5" event={"ID":"e642b573-6e0b-4897-b1c8-1244683b0b73","Type":"ContainerStarted","Data":"e3af5a5c1c7d69ebe01a1af28bd49e10fab281aa8f4c13a435ccf5d018a1e5f2"} Nov 22 03:08:34 crc kubenswrapper[4922]: I1122 03:08:34.948860 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-wg4v5" Nov 22 03:08:34 crc kubenswrapper[4922]: I1122 03:08:34.950961 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"674794c8-5fae-461e-91a1-f3f44a088e55","Type":"ContainerStarted","Data":"eead7b71b430c178fec85fdb47f627aadfd67b63794e578806434881e5804991"} Nov 22 03:08:34 crc kubenswrapper[4922]: I1122 03:08:34.966475 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-wg4v5" podStartSLOduration=25.582317266 podStartE2EDuration="26.966457421s" podCreationTimestamp="2025-11-22 03:08:08 +0000 UTC" firstStartedPulling="2025-11-22 03:08:24.327641401 +0000 UTC m=+940.366163303" lastFinishedPulling="2025-11-22 03:08:25.711781566 +0000 UTC m=+941.750303458" observedRunningTime="2025-11-22 03:08:34.965698433 +0000 UTC m=+951.004220325" watchObservedRunningTime="2025-11-22 03:08:34.966457421 +0000 UTC m=+951.004979313" Nov 22 03:08:35 crc kubenswrapper[4922]: I1122 03:08:35.961172 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"67a38fd3-5643-4795-8a88-f21d3ff7b43a","Type":"ContainerStarted","Data":"b76a562908a388523672514c1ae3d314904ba66012bbf4915c21cc302a07a2af"} Nov 22 03:08:35 crc kubenswrapper[4922]: I1122 03:08:35.963091 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"0cb9d221-16e8-421b-b044-a416405d01c1","Type":"ContainerStarted","Data":"7dabee2d790787306640b0e0abc648f907936d0d5948283f82395f01bf01126d"} Nov 22 03:08:35 crc kubenswrapper[4922]: I1122 03:08:35.964617 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"60c32179-6b0e-4e8b-a101-81ca49be2034","Type":"ContainerStarted","Data":"93b276ce9db9c2e84ff352042cd199616dc6c427fa24c3c6403edb5b54a2b0c0"} 
Nov 22 03:08:35 crc kubenswrapper[4922]: I1122 03:08:35.971284 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-94wtw" event={"ID":"cef5a6d7-f3cf-459b-9e5a-a6a328d0e020","Type":"ContainerStarted","Data":"5ee6c1256e34c4706b359e48ba3d7acd1a7c21b3019e68947071dd942f8152a5"} Nov 22 03:08:35 crc kubenswrapper[4922]: I1122 03:08:35.972021 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-94wtw" Nov 22 03:08:35 crc kubenswrapper[4922]: I1122 03:08:35.974077 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"c4673860-e095-4170-92b7-cbd2ffdff114","Type":"ContainerStarted","Data":"94e434788efe5007aa1c54687608e8c8fa7c0dca679b08bcb8ab636614bea897"} Nov 22 03:08:35 crc kubenswrapper[4922]: I1122 03:08:35.974544 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Nov 22 03:08:35 crc kubenswrapper[4922]: I1122 03:08:35.977178 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nlzww" event={"ID":"5c038335-42ee-4618-a14b-b32bc0f1d53a","Type":"ContainerStarted","Data":"a0cea9b7f83d3d1521a0ee9972b570bfea35531b9f28e679405ed33947a95aa9"} Nov 22 03:08:35 crc kubenswrapper[4922]: I1122 03:08:35.977957 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-nlzww" Nov 22 03:08:35 crc kubenswrapper[4922]: I1122 03:08:35.981072 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1ee5b2bc-9aaf-4bb7-b5b2-f4f14c5e004b","Type":"ContainerStarted","Data":"5cd10f8f314ce908c68e99a0f82748187d57575bebad57e630ae6b1b18ff8c4e"} Nov 22 03:08:35 crc kubenswrapper[4922]: I1122 03:08:35.981149 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Nov 22 03:08:35 crc kubenswrapper[4922]: I1122 03:08:35.983560 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4cpkr" event={"ID":"3882abf6-0110-46fd-b498-de1d56838fc8","Type":"ContainerStarted","Data":"12ffab05e4fc9a96a6997bd4c31626d7c2149a3ca03af71e75e11bde31f1ea23"} Nov 22 03:08:36 crc kubenswrapper[4922]: I1122 03:08:36.011012 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=15.111855402 podStartE2EDuration="24.010990676s" podCreationTimestamp="2025-11-22 03:08:12 +0000 UTC" firstStartedPulling="2025-11-22 03:08:24.361195537 +0000 UTC m=+940.399717429" lastFinishedPulling="2025-11-22 03:08:33.260330801 +0000 UTC m=+949.298852703" observedRunningTime="2025-11-22 03:08:36.007675386 +0000 UTC m=+952.046197288" watchObservedRunningTime="2025-11-22 03:08:36.010990676 +0000 UTC m=+952.049512568" Nov 22 03:08:36 crc kubenswrapper[4922]: I1122 03:08:36.031269 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=10.851321592 podStartE2EDuration="22.031250372s" podCreationTimestamp="2025-11-22 03:08:14 +0000 UTC" firstStartedPulling="2025-11-22 03:08:23.586155847 +0000 UTC m=+939.624677729" lastFinishedPulling="2025-11-22 03:08:34.766084607 +0000 UTC m=+950.804606509" observedRunningTime="2025-11-22 03:08:36.026028818 +0000 UTC m=+952.064550730" watchObservedRunningTime="2025-11-22 03:08:36.031250372 +0000 UTC m=+952.069772264" Nov 22 03:08:36 crc kubenswrapper[4922]: I1122 03:08:36.047963 4922 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/dnsmasq-dns-666b6646f7-94wtw" podStartSLOduration=26.589029073 podStartE2EDuration="28.047947854s" podCreationTimestamp="2025-11-22 03:08:08 +0000 UTC" firstStartedPulling="2025-11-22 03:08:24.302983799 +0000 UTC m=+940.341505691" lastFinishedPulling="2025-11-22 03:08:25.76190258 +0000 UTC m=+941.800424472" observedRunningTime="2025-11-22 03:08:36.044315917 +0000 UTC m=+952.082837819" watchObservedRunningTime="2025-11-22 03:08:36.047947854 +0000 UTC m=+952.086469746" Nov 22 03:08:36 crc kubenswrapper[4922]: I1122 03:08:36.086146 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-nlzww" podStartSLOduration=8.638186782 podStartE2EDuration="18.086120071s" podCreationTimestamp="2025-11-22 03:08:18 +0000 UTC" firstStartedPulling="2025-11-22 03:08:24.337125249 +0000 UTC m=+940.375647141" lastFinishedPulling="2025-11-22 03:08:33.785058528 +0000 UTC m=+949.823580430" observedRunningTime="2025-11-22 03:08:36.066750466 +0000 UTC m=+952.105272358" watchObservedRunningTime="2025-11-22 03:08:36.086120071 +0000 UTC m=+952.124641963" Nov 22 03:08:36 crc kubenswrapper[4922]: I1122 03:08:36.995635 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"161bd1ea-2276-4a95-b0ad-304cc807d13f","Type":"ContainerStarted","Data":"4fcb44b5034d155200f58d1b2c38a0ff75e5cc566eceb71c9cb60aa7ee6249cd"} Nov 22 03:08:36 crc kubenswrapper[4922]: I1122 03:08:36.998860 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"11bc873c-bbc5-4033-ad7a-9569c2b6aa76","Type":"ContainerStarted","Data":"f7a1c30d9aae609ad3015b8af50f4e93aa51fc1a0aeee4139b3ab8b557827c6a"} Nov 22 03:08:37 crc kubenswrapper[4922]: I1122 03:08:37.001314 4922 generic.go:334] "Generic (PLEG): container finished" podID="3882abf6-0110-46fd-b498-de1d56838fc8" containerID="12ffab05e4fc9a96a6997bd4c31626d7c2149a3ca03af71e75e11bde31f1ea23" exitCode=0 Nov 22 03:08:37 crc kubenswrapper[4922]: I1122 03:08:37.001401 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4cpkr" event={"ID":"3882abf6-0110-46fd-b498-de1d56838fc8","Type":"ContainerDied","Data":"12ffab05e4fc9a96a6997bd4c31626d7c2149a3ca03af71e75e11bde31f1ea23"} Nov 22 03:08:40 crc kubenswrapper[4922]: I1122 03:08:40.037308 4922 generic.go:334] "Generic (PLEG): container finished" podID="674794c8-5fae-461e-91a1-f3f44a088e55" containerID="eead7b71b430c178fec85fdb47f627aadfd67b63794e578806434881e5804991" exitCode=0 Nov 22 03:08:40 crc kubenswrapper[4922]: I1122 03:08:40.037419 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"674794c8-5fae-461e-91a1-f3f44a088e55","Type":"ContainerDied","Data":"eead7b71b430c178fec85fdb47f627aadfd67b63794e578806434881e5804991"} Nov 22 03:08:40 crc kubenswrapper[4922]: I1122 03:08:40.044157 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4cpkr" event={"ID":"3882abf6-0110-46fd-b498-de1d56838fc8","Type":"ContainerStarted","Data":"1ced51e3f78ea02e01cec1407192dd942f79a9123e8b33248140c78e906b68f5"} Nov 22 03:08:40 crc kubenswrapper[4922]: I1122 03:08:40.044239 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-4cpkr" event={"ID":"3882abf6-0110-46fd-b498-de1d56838fc8","Type":"ContainerStarted","Data":"ca7023f92f5aabff01e62d5b1ecaad535ca55e2f48b10750868db1950d0284e3"} Nov 22 03:08:40 crc kubenswrapper[4922]: I1122 03:08:40.044433 4922 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-4cpkr" Nov 22 03:08:40 crc kubenswrapper[4922]: I1122 03:08:40.044677 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-4cpkr" Nov 22 03:08:40 crc kubenswrapper[4922]: I1122 03:08:40.048322 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"67a38fd3-5643-4795-8a88-f21d3ff7b43a","Type":"ContainerStarted","Data":"4d2c3f9b8a0b2f2d726c4741a258e27b26bb18e9fefb5c53a4a25255dbc7b527"} Nov 22 03:08:40 crc kubenswrapper[4922]: I1122 03:08:40.052440 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"0cb9d221-16e8-421b-b044-a416405d01c1","Type":"ContainerStarted","Data":"4022c26396730a39dcc3b3d9a7261caa45814a1b682fe5c26acca57527a5fd16"} Nov 22 03:08:40 crc kubenswrapper[4922]: I1122 03:08:40.054630 4922 generic.go:334] "Generic (PLEG): container finished" podID="60c32179-6b0e-4e8b-a101-81ca49be2034" containerID="93b276ce9db9c2e84ff352042cd199616dc6c427fa24c3c6403edb5b54a2b0c0" exitCode=0 Nov 22 03:08:40 crc kubenswrapper[4922]: I1122 03:08:40.054685 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"60c32179-6b0e-4e8b-a101-81ca49be2034","Type":"ContainerDied","Data":"93b276ce9db9c2e84ff352042cd199616dc6c427fa24c3c6403edb5b54a2b0c0"} Nov 22 03:08:40 crc kubenswrapper[4922]: I1122 03:08:40.146594 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=6.678504076 podStartE2EDuration="21.146569724s" podCreationTimestamp="2025-11-22 03:08:19 +0000 UTC" firstStartedPulling="2025-11-22 03:08:24.413401612 +0000 UTC m=+940.451923504" lastFinishedPulling="2025-11-22 03:08:38.88146722 +0000 UTC m=+954.919989152" observedRunningTime="2025-11-22 03:08:40.146277847 +0000 UTC m=+956.184799749" watchObservedRunningTime="2025-11-22 03:08:40.146569724 +0000 UTC m=+956.185091626" Nov 22 03:08:40 crc kubenswrapper[4922]: I1122 03:08:40.180647 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-4cpkr" podStartSLOduration=13.087854368 podStartE2EDuration="22.180623673s" podCreationTimestamp="2025-11-22 03:08:18 +0000 UTC" firstStartedPulling="2025-11-22 03:08:24.526298744 +0000 UTC m=+940.564820636" lastFinishedPulling="2025-11-22 03:08:33.619068019 +0000 UTC m=+949.657589941" observedRunningTime="2025-11-22 03:08:40.174528365 +0000 UTC m=+956.213050267" watchObservedRunningTime="2025-11-22 03:08:40.180623673 +0000 UTC m=+956.219145575" Nov 22 03:08:40 crc kubenswrapper[4922]: I1122 03:08:40.203062 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=5.436936748 podStartE2EDuration="19.203042201s" podCreationTimestamp="2025-11-22 03:08:21 +0000 UTC" firstStartedPulling="2025-11-22 03:08:25.074148627 +0000 UTC m=+941.112670549" lastFinishedPulling="2025-11-22 03:08:38.84025411 +0000 UTC m=+954.878776002" observedRunningTime="2025-11-22 03:08:40.197729813 +0000 UTC m=+956.236251705" watchObservedRunningTime="2025-11-22 03:08:40.203042201 +0000 UTC m=+956.241564093" Nov 22 03:08:40 crc kubenswrapper[4922]: I1122 03:08:40.785443 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Nov 22 03:08:40 crc kubenswrapper[4922]: I1122 03:08:40.857755 4922 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.068384 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"60c32179-6b0e-4e8b-a101-81ca49be2034","Type":"ContainerStarted","Data":"abfc94f5b58d00cbd9e6b1246be9c8d474489d7cd27e5ecc458acebeb9ea0e5e"} Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.071607 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"674794c8-5fae-461e-91a1-f3f44a088e55","Type":"ContainerStarted","Data":"a1420e72a65758ca7b0ffecbbffcc752c33d0e18e72e2b7c28494cdd974d9f2f"} Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.073512 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.113408 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=20.920800198 podStartE2EDuration="30.113384292s" podCreationTimestamp="2025-11-22 03:08:11 +0000 UTC" firstStartedPulling="2025-11-22 03:08:24.321726129 +0000 UTC m=+940.360248021" lastFinishedPulling="2025-11-22 03:08:33.514310213 +0000 UTC m=+949.552832115" observedRunningTime="2025-11-22 03:08:41.103357281 +0000 UTC m=+957.141879213" watchObservedRunningTime="2025-11-22 03:08:41.113384292 +0000 UTC m=+957.151906224" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.129216 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.138357 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=22.791480563 podStartE2EDuration="32.138336852s" podCreationTimestamp="2025-11-22 03:08:09 +0000 UTC" firstStartedPulling="2025-11-22 03:08:23.913489072 +0000 UTC m=+939.952010964" lastFinishedPulling="2025-11-22 03:08:33.260345311 +0000 UTC m=+949.298867253" observedRunningTime="2025-11-22 03:08:41.135988235 +0000 UTC m=+957.174510207" watchObservedRunningTime="2025-11-22 03:08:41.138336852 +0000 UTC m=+957.176858784" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.159701 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.333635 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.333685 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.442724 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-wg4v5"] Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.443010 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-wg4v5" podUID="e642b573-6e0b-4897-b1c8-1244683b0b73" containerName="dnsmasq-dns" containerID="cri-o://e3af5a5c1c7d69ebe01a1af28bd49e10fab281aa8f4c13a435ccf5d018a1e5f2" gracePeriod=10 Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.444075 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-wg4v5" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.511921 4922 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-997qf"] Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.513334 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-997qf" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.515250 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.525000 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-hdkg5"] Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.526050 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-hdkg5" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.529415 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.534084 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-997qf"] Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.542256 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-hdkg5"] Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.631745 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5043af03-d8a1-4437-9ab8-78907d742588-combined-ca-bundle\") pod \"ovn-controller-metrics-hdkg5\" (UID: \"5043af03-d8a1-4437-9ab8-78907d742588\") " pod="openstack/ovn-controller-metrics-hdkg5" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.632049 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/5043af03-d8a1-4437-9ab8-78907d742588-ovn-rundir\") pod \"ovn-controller-metrics-hdkg5\" (UID: \"5043af03-d8a1-4437-9ab8-78907d742588\") " pod="openstack/ovn-controller-metrics-hdkg5" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.632072 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/5043af03-d8a1-4437-9ab8-78907d742588-ovs-rundir\") pod \"ovn-controller-metrics-hdkg5\" (UID: \"5043af03-d8a1-4437-9ab8-78907d742588\") " pod="openstack/ovn-controller-metrics-hdkg5" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.632171 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3c54954-22ce-4674-857c-72d2dce55910-config\") pod \"dnsmasq-dns-7f896c8c65-997qf\" (UID: \"c3c54954-22ce-4674-857c-72d2dce55910\") " pod="openstack/dnsmasq-dns-7f896c8c65-997qf" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.632227 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3c54954-22ce-4674-857c-72d2dce55910-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-997qf\" (UID: \"c3c54954-22ce-4674-857c-72d2dce55910\") " pod="openstack/dnsmasq-dns-7f896c8c65-997qf" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.632257 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5043af03-d8a1-4437-9ab8-78907d742588-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-hdkg5\" (UID: \"5043af03-d8a1-4437-9ab8-78907d742588\") " pod="openstack/ovn-controller-metrics-hdkg5" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.632287 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sk2qp\" (UniqueName: \"kubernetes.io/projected/c3c54954-22ce-4674-857c-72d2dce55910-kube-api-access-sk2qp\") pod \"dnsmasq-dns-7f896c8c65-997qf\" (UID: \"c3c54954-22ce-4674-857c-72d2dce55910\") " pod="openstack/dnsmasq-dns-7f896c8c65-997qf" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.632308 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6ch8\" (UniqueName: \"kubernetes.io/projected/5043af03-d8a1-4437-9ab8-78907d742588-kube-api-access-g6ch8\") pod \"ovn-controller-metrics-hdkg5\" (UID: \"5043af03-d8a1-4437-9ab8-78907d742588\") " pod="openstack/ovn-controller-metrics-hdkg5" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.632397 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5043af03-d8a1-4437-9ab8-78907d742588-config\") pod \"ovn-controller-metrics-hdkg5\" (UID: \"5043af03-d8a1-4437-9ab8-78907d742588\") " pod="openstack/ovn-controller-metrics-hdkg5" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.632438 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c3c54954-22ce-4674-857c-72d2dce55910-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-997qf\" (UID: \"c3c54954-22ce-4674-857c-72d2dce55910\") " pod="openstack/dnsmasq-dns-7f896c8c65-997qf" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.712001 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-94wtw"] Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.712218 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-94wtw" podUID="cef5a6d7-f3cf-459b-9e5a-a6a328d0e020" containerName="dnsmasq-dns" containerID="cri-o://5ee6c1256e34c4706b359e48ba3d7acd1a7c21b3019e68947071dd942f8152a5" gracePeriod=10 Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.716045 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-666b6646f7-94wtw" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.728556 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-l6qxq"] Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.730263 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-l6qxq" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.732385 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.733381 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3c54954-22ce-4674-857c-72d2dce55910-config\") pod \"dnsmasq-dns-7f896c8c65-997qf\" (UID: \"c3c54954-22ce-4674-857c-72d2dce55910\") " pod="openstack/dnsmasq-dns-7f896c8c65-997qf" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.733435 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3c54954-22ce-4674-857c-72d2dce55910-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-997qf\" (UID: \"c3c54954-22ce-4674-857c-72d2dce55910\") " pod="openstack/dnsmasq-dns-7f896c8c65-997qf" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.733458 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5043af03-d8a1-4437-9ab8-78907d742588-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-hdkg5\" (UID: \"5043af03-d8a1-4437-9ab8-78907d742588\") " pod="openstack/ovn-controller-metrics-hdkg5" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.733475 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sk2qp\" (UniqueName: \"kubernetes.io/projected/c3c54954-22ce-4674-857c-72d2dce55910-kube-api-access-sk2qp\") pod \"dnsmasq-dns-7f896c8c65-997qf\" (UID: \"c3c54954-22ce-4674-857c-72d2dce55910\") " pod="openstack/dnsmasq-dns-7f896c8c65-997qf" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.733491 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6ch8\" (UniqueName: \"kubernetes.io/projected/5043af03-d8a1-4437-9ab8-78907d742588-kube-api-access-g6ch8\") pod \"ovn-controller-metrics-hdkg5\" (UID: \"5043af03-d8a1-4437-9ab8-78907d742588\") " pod="openstack/ovn-controller-metrics-hdkg5" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.733528 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5043af03-d8a1-4437-9ab8-78907d742588-config\") pod \"ovn-controller-metrics-hdkg5\" (UID: \"5043af03-d8a1-4437-9ab8-78907d742588\") " pod="openstack/ovn-controller-metrics-hdkg5" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.733552 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c3c54954-22ce-4674-857c-72d2dce55910-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-997qf\" (UID: \"c3c54954-22ce-4674-857c-72d2dce55910\") " pod="openstack/dnsmasq-dns-7f896c8c65-997qf" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.733572 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5043af03-d8a1-4437-9ab8-78907d742588-combined-ca-bundle\") pod \"ovn-controller-metrics-hdkg5\" (UID: \"5043af03-d8a1-4437-9ab8-78907d742588\") " pod="openstack/ovn-controller-metrics-hdkg5" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.733608 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/host-path/5043af03-d8a1-4437-9ab8-78907d742588-ovn-rundir\") pod \"ovn-controller-metrics-hdkg5\" (UID: \"5043af03-d8a1-4437-9ab8-78907d742588\") " pod="openstack/ovn-controller-metrics-hdkg5" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.733624 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/5043af03-d8a1-4437-9ab8-78907d742588-ovs-rundir\") pod \"ovn-controller-metrics-hdkg5\" (UID: \"5043af03-d8a1-4437-9ab8-78907d742588\") " pod="openstack/ovn-controller-metrics-hdkg5" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.734038 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/5043af03-d8a1-4437-9ab8-78907d742588-ovs-rundir\") pod \"ovn-controller-metrics-hdkg5\" (UID: \"5043af03-d8a1-4437-9ab8-78907d742588\") " pod="openstack/ovn-controller-metrics-hdkg5" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.734932 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3c54954-22ce-4674-857c-72d2dce55910-config\") pod \"dnsmasq-dns-7f896c8c65-997qf\" (UID: \"c3c54954-22ce-4674-857c-72d2dce55910\") " pod="openstack/dnsmasq-dns-7f896c8c65-997qf" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.735549 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3c54954-22ce-4674-857c-72d2dce55910-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-997qf\" (UID: \"c3c54954-22ce-4674-857c-72d2dce55910\") " pod="openstack/dnsmasq-dns-7f896c8c65-997qf" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.736636 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/5043af03-d8a1-4437-9ab8-78907d742588-ovn-rundir\") pod \"ovn-controller-metrics-hdkg5\" (UID: \"5043af03-d8a1-4437-9ab8-78907d742588\") " pod="openstack/ovn-controller-metrics-hdkg5" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.737003 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5043af03-d8a1-4437-9ab8-78907d742588-config\") pod \"ovn-controller-metrics-hdkg5\" (UID: \"5043af03-d8a1-4437-9ab8-78907d742588\") " pod="openstack/ovn-controller-metrics-hdkg5" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.737912 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c3c54954-22ce-4674-857c-72d2dce55910-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-997qf\" (UID: \"c3c54954-22ce-4674-857c-72d2dce55910\") " pod="openstack/dnsmasq-dns-7f896c8c65-997qf" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.754237 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5043af03-d8a1-4437-9ab8-78907d742588-combined-ca-bundle\") pod \"ovn-controller-metrics-hdkg5\" (UID: \"5043af03-d8a1-4437-9ab8-78907d742588\") " pod="openstack/ovn-controller-metrics-hdkg5" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.754266 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5043af03-d8a1-4437-9ab8-78907d742588-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-hdkg5\" (UID: \"5043af03-d8a1-4437-9ab8-78907d742588\") " 
pod="openstack/ovn-controller-metrics-hdkg5" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.755468 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-l6qxq"] Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.775655 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sk2qp\" (UniqueName: \"kubernetes.io/projected/c3c54954-22ce-4674-857c-72d2dce55910-kube-api-access-sk2qp\") pod \"dnsmasq-dns-7f896c8c65-997qf\" (UID: \"c3c54954-22ce-4674-857c-72d2dce55910\") " pod="openstack/dnsmasq-dns-7f896c8c65-997qf" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.778442 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6ch8\" (UniqueName: \"kubernetes.io/projected/5043af03-d8a1-4437-9ab8-78907d742588-kube-api-access-g6ch8\") pod \"ovn-controller-metrics-hdkg5\" (UID: \"5043af03-d8a1-4437-9ab8-78907d742588\") " pod="openstack/ovn-controller-metrics-hdkg5" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.835011 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/121927b8-f52a-4b01-89c1-85b1e694906a-config\") pod \"dnsmasq-dns-86db49b7ff-l6qxq\" (UID: \"121927b8-f52a-4b01-89c1-85b1e694906a\") " pod="openstack/dnsmasq-dns-86db49b7ff-l6qxq" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.835093 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/121927b8-f52a-4b01-89c1-85b1e694906a-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-l6qxq\" (UID: \"121927b8-f52a-4b01-89c1-85b1e694906a\") " pod="openstack/dnsmasq-dns-86db49b7ff-l6qxq" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.835174 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/121927b8-f52a-4b01-89c1-85b1e694906a-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-l6qxq\" (UID: \"121927b8-f52a-4b01-89c1-85b1e694906a\") " pod="openstack/dnsmasq-dns-86db49b7ff-l6qxq" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.835211 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/121927b8-f52a-4b01-89c1-85b1e694906a-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-l6qxq\" (UID: \"121927b8-f52a-4b01-89c1-85b1e694906a\") " pod="openstack/dnsmasq-dns-86db49b7ff-l6qxq" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.835270 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r658v\" (UniqueName: \"kubernetes.io/projected/121927b8-f52a-4b01-89c1-85b1e694906a-kube-api-access-r658v\") pod \"dnsmasq-dns-86db49b7ff-l6qxq\" (UID: \"121927b8-f52a-4b01-89c1-85b1e694906a\") " pod="openstack/dnsmasq-dns-86db49b7ff-l6qxq" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.921052 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-997qf" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.938811 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-hdkg5" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.947628 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/121927b8-f52a-4b01-89c1-85b1e694906a-config\") pod \"dnsmasq-dns-86db49b7ff-l6qxq\" (UID: \"121927b8-f52a-4b01-89c1-85b1e694906a\") " pod="openstack/dnsmasq-dns-86db49b7ff-l6qxq" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.947716 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/121927b8-f52a-4b01-89c1-85b1e694906a-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-l6qxq\" (UID: \"121927b8-f52a-4b01-89c1-85b1e694906a\") " pod="openstack/dnsmasq-dns-86db49b7ff-l6qxq" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.947979 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/121927b8-f52a-4b01-89c1-85b1e694906a-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-l6qxq\" (UID: \"121927b8-f52a-4b01-89c1-85b1e694906a\") " pod="openstack/dnsmasq-dns-86db49b7ff-l6qxq" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.948015 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/121927b8-f52a-4b01-89c1-85b1e694906a-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-l6qxq\" (UID: \"121927b8-f52a-4b01-89c1-85b1e694906a\") " pod="openstack/dnsmasq-dns-86db49b7ff-l6qxq" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.948101 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r658v\" (UniqueName: \"kubernetes.io/projected/121927b8-f52a-4b01-89c1-85b1e694906a-kube-api-access-r658v\") pod \"dnsmasq-dns-86db49b7ff-l6qxq\" (UID: \"121927b8-f52a-4b01-89c1-85b1e694906a\") " pod="openstack/dnsmasq-dns-86db49b7ff-l6qxq" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.951449 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/121927b8-f52a-4b01-89c1-85b1e694906a-config\") pod \"dnsmasq-dns-86db49b7ff-l6qxq\" (UID: \"121927b8-f52a-4b01-89c1-85b1e694906a\") " pod="openstack/dnsmasq-dns-86db49b7ff-l6qxq" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.952082 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/121927b8-f52a-4b01-89c1-85b1e694906a-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-l6qxq\" (UID: \"121927b8-f52a-4b01-89c1-85b1e694906a\") " pod="openstack/dnsmasq-dns-86db49b7ff-l6qxq" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.952923 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/121927b8-f52a-4b01-89c1-85b1e694906a-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-l6qxq\" (UID: \"121927b8-f52a-4b01-89c1-85b1e694906a\") " pod="openstack/dnsmasq-dns-86db49b7ff-l6qxq" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 03:08:41.953012 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/121927b8-f52a-4b01-89c1-85b1e694906a-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-l6qxq\" (UID: \"121927b8-f52a-4b01-89c1-85b1e694906a\") " pod="openstack/dnsmasq-dns-86db49b7ff-l6qxq" Nov 22 03:08:41 crc kubenswrapper[4922]: I1122 
03:08:41.975589 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r658v\" (UniqueName: \"kubernetes.io/projected/121927b8-f52a-4b01-89c1-85b1e694906a-kube-api-access-r658v\") pod \"dnsmasq-dns-86db49b7ff-l6qxq\" (UID: \"121927b8-f52a-4b01-89c1-85b1e694906a\") " pod="openstack/dnsmasq-dns-86db49b7ff-l6qxq" Nov 22 03:08:42 crc kubenswrapper[4922]: I1122 03:08:42.025063 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-wg4v5" Nov 22 03:08:42 crc kubenswrapper[4922]: I1122 03:08:42.088877 4922 generic.go:334] "Generic (PLEG): container finished" podID="cef5a6d7-f3cf-459b-9e5a-a6a328d0e020" containerID="5ee6c1256e34c4706b359e48ba3d7acd1a7c21b3019e68947071dd942f8152a5" exitCode=0 Nov 22 03:08:42 crc kubenswrapper[4922]: I1122 03:08:42.089009 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-94wtw" event={"ID":"cef5a6d7-f3cf-459b-9e5a-a6a328d0e020","Type":"ContainerDied","Data":"5ee6c1256e34c4706b359e48ba3d7acd1a7c21b3019e68947071dd942f8152a5"} Nov 22 03:08:42 crc kubenswrapper[4922]: I1122 03:08:42.121206 4922 generic.go:334] "Generic (PLEG): container finished" podID="e642b573-6e0b-4897-b1c8-1244683b0b73" containerID="e3af5a5c1c7d69ebe01a1af28bd49e10fab281aa8f4c13a435ccf5d018a1e5f2" exitCode=0 Nov 22 03:08:42 crc kubenswrapper[4922]: I1122 03:08:42.123428 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-wg4v5" Nov 22 03:08:42 crc kubenswrapper[4922]: I1122 03:08:42.124106 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-wg4v5" event={"ID":"e642b573-6e0b-4897-b1c8-1244683b0b73","Type":"ContainerDied","Data":"e3af5a5c1c7d69ebe01a1af28bd49e10fab281aa8f4c13a435ccf5d018a1e5f2"} Nov 22 03:08:42 crc kubenswrapper[4922]: I1122 03:08:42.124184 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-wg4v5" event={"ID":"e642b573-6e0b-4897-b1c8-1244683b0b73","Type":"ContainerDied","Data":"992c93ba37b37f1496910a749de78af1070c12477072152eea6c606617cd5948"} Nov 22 03:08:42 crc kubenswrapper[4922]: I1122 03:08:42.124214 4922 scope.go:117] "RemoveContainer" containerID="e3af5a5c1c7d69ebe01a1af28bd49e10fab281aa8f4c13a435ccf5d018a1e5f2" Nov 22 03:08:42 crc kubenswrapper[4922]: I1122 03:08:42.142891 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-94wtw" Nov 22 03:08:42 crc kubenswrapper[4922]: I1122 03:08:42.151460 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxqwz\" (UniqueName: \"kubernetes.io/projected/e642b573-6e0b-4897-b1c8-1244683b0b73-kube-api-access-dxqwz\") pod \"e642b573-6e0b-4897-b1c8-1244683b0b73\" (UID: \"e642b573-6e0b-4897-b1c8-1244683b0b73\") " Nov 22 03:08:42 crc kubenswrapper[4922]: I1122 03:08:42.151560 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e642b573-6e0b-4897-b1c8-1244683b0b73-config\") pod \"e642b573-6e0b-4897-b1c8-1244683b0b73\" (UID: \"e642b573-6e0b-4897-b1c8-1244683b0b73\") " Nov 22 03:08:42 crc kubenswrapper[4922]: I1122 03:08:42.151750 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e642b573-6e0b-4897-b1c8-1244683b0b73-dns-svc\") pod \"e642b573-6e0b-4897-b1c8-1244683b0b73\" (UID: \"e642b573-6e0b-4897-b1c8-1244683b0b73\") " Nov 22 03:08:42 crc kubenswrapper[4922]: I1122 03:08:42.159241 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Nov 22 03:08:42 crc kubenswrapper[4922]: I1122 03:08:42.159543 4922 scope.go:117] "RemoveContainer" containerID="c7c62355fa846c7211fd3ffcf0b4662ea6c663553afecf9966eaf96b696bcac4" Nov 22 03:08:42 crc kubenswrapper[4922]: I1122 03:08:42.162203 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e642b573-6e0b-4897-b1c8-1244683b0b73-kube-api-access-dxqwz" (OuterVolumeSpecName: "kube-api-access-dxqwz") pod "e642b573-6e0b-4897-b1c8-1244683b0b73" (UID: "e642b573-6e0b-4897-b1c8-1244683b0b73"). InnerVolumeSpecName "kube-api-access-dxqwz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:08:42 crc kubenswrapper[4922]: I1122 03:08:42.173299 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-l6qxq" Nov 22 03:08:42 crc kubenswrapper[4922]: I1122 03:08:42.197301 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e642b573-6e0b-4897-b1c8-1244683b0b73-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e642b573-6e0b-4897-b1c8-1244683b0b73" (UID: "e642b573-6e0b-4897-b1c8-1244683b0b73"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:08:42 crc kubenswrapper[4922]: I1122 03:08:42.210158 4922 scope.go:117] "RemoveContainer" containerID="e3af5a5c1c7d69ebe01a1af28bd49e10fab281aa8f4c13a435ccf5d018a1e5f2" Nov 22 03:08:42 crc kubenswrapper[4922]: E1122 03:08:42.211285 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3af5a5c1c7d69ebe01a1af28bd49e10fab281aa8f4c13a435ccf5d018a1e5f2\": container with ID starting with e3af5a5c1c7d69ebe01a1af28bd49e10fab281aa8f4c13a435ccf5d018a1e5f2 not found: ID does not exist" containerID="e3af5a5c1c7d69ebe01a1af28bd49e10fab281aa8f4c13a435ccf5d018a1e5f2" Nov 22 03:08:42 crc kubenswrapper[4922]: I1122 03:08:42.211332 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3af5a5c1c7d69ebe01a1af28bd49e10fab281aa8f4c13a435ccf5d018a1e5f2"} err="failed to get container status \"e3af5a5c1c7d69ebe01a1af28bd49e10fab281aa8f4c13a435ccf5d018a1e5f2\": rpc error: code = NotFound desc = could not find container \"e3af5a5c1c7d69ebe01a1af28bd49e10fab281aa8f4c13a435ccf5d018a1e5f2\": container with ID starting with e3af5a5c1c7d69ebe01a1af28bd49e10fab281aa8f4c13a435ccf5d018a1e5f2 not found: ID does not exist" Nov 22 03:08:42 crc kubenswrapper[4922]: I1122 03:08:42.211361 4922 scope.go:117] "RemoveContainer" containerID="c7c62355fa846c7211fd3ffcf0b4662ea6c663553afecf9966eaf96b696bcac4" Nov 22 03:08:42 crc kubenswrapper[4922]: E1122 03:08:42.211981 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7c62355fa846c7211fd3ffcf0b4662ea6c663553afecf9966eaf96b696bcac4\": container with ID starting with c7c62355fa846c7211fd3ffcf0b4662ea6c663553afecf9966eaf96b696bcac4 not found: ID does not exist" containerID="c7c62355fa846c7211fd3ffcf0b4662ea6c663553afecf9966eaf96b696bcac4" Nov 22 03:08:42 crc kubenswrapper[4922]: I1122 03:08:42.212010 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7c62355fa846c7211fd3ffcf0b4662ea6c663553afecf9966eaf96b696bcac4"} err="failed to get container status \"c7c62355fa846c7211fd3ffcf0b4662ea6c663553afecf9966eaf96b696bcac4\": rpc error: code = NotFound desc = could not find container \"c7c62355fa846c7211fd3ffcf0b4662ea6c663553afecf9966eaf96b696bcac4\": container with ID starting with c7c62355fa846c7211fd3ffcf0b4662ea6c663553afecf9966eaf96b696bcac4 not found: ID does not exist" Nov 22 03:08:42 crc kubenswrapper[4922]: I1122 03:08:42.229614 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e642b573-6e0b-4897-b1c8-1244683b0b73-config" (OuterVolumeSpecName: "config") pod "e642b573-6e0b-4897-b1c8-1244683b0b73" (UID: "e642b573-6e0b-4897-b1c8-1244683b0b73"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:08:42 crc kubenswrapper[4922]: I1122 03:08:42.234643 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Nov 22 03:08:42 crc kubenswrapper[4922]: I1122 03:08:42.255634 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cef5a6d7-f3cf-459b-9e5a-a6a328d0e020-dns-svc\") pod \"cef5a6d7-f3cf-459b-9e5a-a6a328d0e020\" (UID: \"cef5a6d7-f3cf-459b-9e5a-a6a328d0e020\") " Nov 22 03:08:42 crc kubenswrapper[4922]: I1122 03:08:42.255696 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cef5a6d7-f3cf-459b-9e5a-a6a328d0e020-config\") pod \"cef5a6d7-f3cf-459b-9e5a-a6a328d0e020\" (UID: \"cef5a6d7-f3cf-459b-9e5a-a6a328d0e020\") " Nov 22 03:08:42 crc kubenswrapper[4922]: I1122 03:08:42.255803 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5fjc\" (UniqueName: \"kubernetes.io/projected/cef5a6d7-f3cf-459b-9e5a-a6a328d0e020-kube-api-access-k5fjc\") pod \"cef5a6d7-f3cf-459b-9e5a-a6a328d0e020\" (UID: \"cef5a6d7-f3cf-459b-9e5a-a6a328d0e020\") " Nov 22 03:08:42 crc kubenswrapper[4922]: I1122 03:08:42.256563 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxqwz\" (UniqueName: \"kubernetes.io/projected/e642b573-6e0b-4897-b1c8-1244683b0b73-kube-api-access-dxqwz\") on node \"crc\" DevicePath \"\"" Nov 22 03:08:42 crc kubenswrapper[4922]: I1122 03:08:42.256583 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e642b573-6e0b-4897-b1c8-1244683b0b73-config\") on node \"crc\" DevicePath \"\"" Nov 22 03:08:42 crc kubenswrapper[4922]: I1122 03:08:42.256596 4922 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e642b573-6e0b-4897-b1c8-1244683b0b73-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 03:08:42 crc kubenswrapper[4922]: I1122 03:08:42.260861 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cef5a6d7-f3cf-459b-9e5a-a6a328d0e020-kube-api-access-k5fjc" (OuterVolumeSpecName: "kube-api-access-k5fjc") pod "cef5a6d7-f3cf-459b-9e5a-a6a328d0e020" (UID: "cef5a6d7-f3cf-459b-9e5a-a6a328d0e020"). InnerVolumeSpecName "kube-api-access-k5fjc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:08:42 crc kubenswrapper[4922]: I1122 03:08:42.297715 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cef5a6d7-f3cf-459b-9e5a-a6a328d0e020-config" (OuterVolumeSpecName: "config") pod "cef5a6d7-f3cf-459b-9e5a-a6a328d0e020" (UID: "cef5a6d7-f3cf-459b-9e5a-a6a328d0e020"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:08:42 crc kubenswrapper[4922]: I1122 03:08:42.305245 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cef5a6d7-f3cf-459b-9e5a-a6a328d0e020-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cef5a6d7-f3cf-459b-9e5a-a6a328d0e020" (UID: "cef5a6d7-f3cf-459b-9e5a-a6a328d0e020"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:08:42 crc kubenswrapper[4922]: I1122 03:08:42.359460 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cef5a6d7-f3cf-459b-9e5a-a6a328d0e020-config\") on node \"crc\" DevicePath \"\"" Nov 22 03:08:42 crc kubenswrapper[4922]: I1122 03:08:42.359494 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5fjc\" (UniqueName: \"kubernetes.io/projected/cef5a6d7-f3cf-459b-9e5a-a6a328d0e020-kube-api-access-k5fjc\") on node \"crc\" DevicePath \"\"" Nov 22 03:08:42 crc kubenswrapper[4922]: I1122 03:08:42.359505 4922 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cef5a6d7-f3cf-459b-9e5a-a6a328d0e020-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 03:08:42 crc kubenswrapper[4922]: I1122 03:08:42.451902 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-wg4v5"] Nov 22 03:08:42 crc kubenswrapper[4922]: I1122 03:08:42.456672 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-wg4v5"] Nov 22 03:08:42 crc kubenswrapper[4922]: I1122 03:08:42.480324 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-997qf"] Nov 22 03:08:42 crc kubenswrapper[4922]: W1122 03:08:42.484232 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3c54954_22ce_4674_857c_72d2dce55910.slice/crio-a2571d056a2575944a88edaea0416c9790059791e9f2889da807e945bcf39a32 WatchSource:0}: Error finding container a2571d056a2575944a88edaea0416c9790059791e9f2889da807e945bcf39a32: Status 404 returned error can't find the container with id a2571d056a2575944a88edaea0416c9790059791e9f2889da807e945bcf39a32 Nov 22 03:08:42 crc kubenswrapper[4922]: I1122 03:08:42.546209 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-hdkg5"] Nov 22 03:08:42 crc kubenswrapper[4922]: W1122 03:08:42.549308 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5043af03_d8a1_4437_9ab8_78907d742588.slice/crio-63e0718cf3658823ebfd191c1450bd374bd11f933f83361455321e57146eec78 WatchSource:0}: Error finding container 63e0718cf3658823ebfd191c1450bd374bd11f933f83361455321e57146eec78: Status 404 returned error can't find the container with id 63e0718cf3658823ebfd191c1450bd374bd11f933f83361455321e57146eec78 Nov 22 03:08:42 crc kubenswrapper[4922]: I1122 03:08:42.628164 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-l6qxq"] Nov 22 03:08:42 crc kubenswrapper[4922]: I1122 03:08:42.776959 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Nov 22 03:08:42 crc kubenswrapper[4922]: I1122 03:08:42.777761 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Nov 22 03:08:43 crc kubenswrapper[4922]: I1122 03:08:43.135391 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-94wtw" event={"ID":"cef5a6d7-f3cf-459b-9e5a-a6a328d0e020","Type":"ContainerDied","Data":"acd9339a81d7efe4b1f7892fb6737bf7d2e9e58bf7e14d5184f6c4099e085f7e"} Nov 22 03:08:43 crc kubenswrapper[4922]: I1122 03:08:43.135713 4922 scope.go:117] "RemoveContainer" 
containerID="5ee6c1256e34c4706b359e48ba3d7acd1a7c21b3019e68947071dd942f8152a5" Nov 22 03:08:43 crc kubenswrapper[4922]: I1122 03:08:43.135791 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-94wtw" Nov 22 03:08:43 crc kubenswrapper[4922]: I1122 03:08:43.141186 4922 generic.go:334] "Generic (PLEG): container finished" podID="121927b8-f52a-4b01-89c1-85b1e694906a" containerID="00e35e396a23745e5d2859db0f3fcc59a9c900a40c0c46edb4847e18519c29c5" exitCode=0 Nov 22 03:08:43 crc kubenswrapper[4922]: I1122 03:08:43.141307 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-l6qxq" event={"ID":"121927b8-f52a-4b01-89c1-85b1e694906a","Type":"ContainerDied","Data":"00e35e396a23745e5d2859db0f3fcc59a9c900a40c0c46edb4847e18519c29c5"} Nov 22 03:08:43 crc kubenswrapper[4922]: I1122 03:08:43.141373 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-l6qxq" event={"ID":"121927b8-f52a-4b01-89c1-85b1e694906a","Type":"ContainerStarted","Data":"f6b79a5b818e825ea321f33f4d4273f9a88cdaf6cf47902035654aa7e7570d23"} Nov 22 03:08:43 crc kubenswrapper[4922]: I1122 03:08:43.150159 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-hdkg5" event={"ID":"5043af03-d8a1-4437-9ab8-78907d742588","Type":"ContainerStarted","Data":"48e60ab597f9bc3b5a31ab6a68ac0101c2eb09604c1a62fb9aced2f8d18719c2"} Nov 22 03:08:43 crc kubenswrapper[4922]: I1122 03:08:43.150201 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-hdkg5" event={"ID":"5043af03-d8a1-4437-9ab8-78907d742588","Type":"ContainerStarted","Data":"63e0718cf3658823ebfd191c1450bd374bd11f933f83361455321e57146eec78"} Nov 22 03:08:43 crc kubenswrapper[4922]: I1122 03:08:43.153713 4922 generic.go:334] "Generic (PLEG): container finished" podID="c3c54954-22ce-4674-857c-72d2dce55910" containerID="74c791f0c27f0f6c251f54a96fef1c7a90cf8c8d566e575f4f5c5deeedd77935" exitCode=0 Nov 22 03:08:43 crc kubenswrapper[4922]: I1122 03:08:43.154099 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-997qf" event={"ID":"c3c54954-22ce-4674-857c-72d2dce55910","Type":"ContainerDied","Data":"74c791f0c27f0f6c251f54a96fef1c7a90cf8c8d566e575f4f5c5deeedd77935"} Nov 22 03:08:43 crc kubenswrapper[4922]: I1122 03:08:43.154187 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-997qf" event={"ID":"c3c54954-22ce-4674-857c-72d2dce55910","Type":"ContainerStarted","Data":"a2571d056a2575944a88edaea0416c9790059791e9f2889da807e945bcf39a32"} Nov 22 03:08:43 crc kubenswrapper[4922]: I1122 03:08:43.157947 4922 scope.go:117] "RemoveContainer" containerID="3118a65327d78da8bd65c0dcc72e91853521a329e858453fdf9f1daf4b7523af" Nov 22 03:08:43 crc kubenswrapper[4922]: I1122 03:08:43.194727 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-hdkg5" podStartSLOduration=2.194704657 podStartE2EDuration="2.194704657s" podCreationTimestamp="2025-11-22 03:08:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:08:43.188330283 +0000 UTC m=+959.226852175" watchObservedRunningTime="2025-11-22 03:08:43.194704657 +0000 UTC m=+959.233226559" Nov 22 03:08:43 crc kubenswrapper[4922]: I1122 03:08:43.236188 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/ovsdbserver-nb-0" Nov 22 03:08:43 crc kubenswrapper[4922]: I1122 03:08:43.294014 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Nov 22 03:08:43 crc kubenswrapper[4922]: I1122 03:08:43.319383 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e642b573-6e0b-4897-b1c8-1244683b0b73" path="/var/lib/kubelet/pods/e642b573-6e0b-4897-b1c8-1244683b0b73/volumes" Nov 22 03:08:43 crc kubenswrapper[4922]: I1122 03:08:43.346808 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-94wtw"] Nov 22 03:08:43 crc kubenswrapper[4922]: I1122 03:08:43.355629 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-94wtw"] Nov 22 03:08:43 crc kubenswrapper[4922]: I1122 03:08:43.479910 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Nov 22 03:08:43 crc kubenswrapper[4922]: E1122 03:08:43.480341 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e642b573-6e0b-4897-b1c8-1244683b0b73" containerName="init" Nov 22 03:08:43 crc kubenswrapper[4922]: I1122 03:08:43.480362 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="e642b573-6e0b-4897-b1c8-1244683b0b73" containerName="init" Nov 22 03:08:43 crc kubenswrapper[4922]: E1122 03:08:43.480372 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cef5a6d7-f3cf-459b-9e5a-a6a328d0e020" containerName="dnsmasq-dns" Nov 22 03:08:43 crc kubenswrapper[4922]: I1122 03:08:43.480379 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="cef5a6d7-f3cf-459b-9e5a-a6a328d0e020" containerName="dnsmasq-dns" Nov 22 03:08:43 crc kubenswrapper[4922]: E1122 03:08:43.480389 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cef5a6d7-f3cf-459b-9e5a-a6a328d0e020" containerName="init" Nov 22 03:08:43 crc kubenswrapper[4922]: I1122 03:08:43.480396 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="cef5a6d7-f3cf-459b-9e5a-a6a328d0e020" containerName="init" Nov 22 03:08:43 crc kubenswrapper[4922]: E1122 03:08:43.480427 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e642b573-6e0b-4897-b1c8-1244683b0b73" containerName="dnsmasq-dns" Nov 22 03:08:43 crc kubenswrapper[4922]: I1122 03:08:43.480433 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="e642b573-6e0b-4897-b1c8-1244683b0b73" containerName="dnsmasq-dns" Nov 22 03:08:43 crc kubenswrapper[4922]: I1122 03:08:43.480632 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="cef5a6d7-f3cf-459b-9e5a-a6a328d0e020" containerName="dnsmasq-dns" Nov 22 03:08:43 crc kubenswrapper[4922]: I1122 03:08:43.480649 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="e642b573-6e0b-4897-b1c8-1244683b0b73" containerName="dnsmasq-dns" Nov 22 03:08:43 crc kubenswrapper[4922]: I1122 03:08:43.481698 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Nov 22 03:08:43 crc kubenswrapper[4922]: I1122 03:08:43.487337 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Nov 22 03:08:43 crc kubenswrapper[4922]: I1122 03:08:43.487455 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Nov 22 03:08:43 crc kubenswrapper[4922]: I1122 03:08:43.487623 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-nh49b" Nov 22 03:08:43 crc kubenswrapper[4922]: I1122 03:08:43.488516 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Nov 22 03:08:43 crc kubenswrapper[4922]: I1122 03:08:43.501730 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Nov 22 03:08:43 crc kubenswrapper[4922]: I1122 03:08:43.591819 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/59de3215-8a02-4c17-9ccc-395c94f69512-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"59de3215-8a02-4c17-9ccc-395c94f69512\") " pod="openstack/ovn-northd-0" Nov 22 03:08:43 crc kubenswrapper[4922]: I1122 03:08:43.591908 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/59de3215-8a02-4c17-9ccc-395c94f69512-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"59de3215-8a02-4c17-9ccc-395c94f69512\") " pod="openstack/ovn-northd-0" Nov 22 03:08:43 crc kubenswrapper[4922]: I1122 03:08:43.591966 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59de3215-8a02-4c17-9ccc-395c94f69512-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"59de3215-8a02-4c17-9ccc-395c94f69512\") " pod="openstack/ovn-northd-0" Nov 22 03:08:43 crc kubenswrapper[4922]: I1122 03:08:43.592009 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/59de3215-8a02-4c17-9ccc-395c94f69512-scripts\") pod \"ovn-northd-0\" (UID: \"59de3215-8a02-4c17-9ccc-395c94f69512\") " pod="openstack/ovn-northd-0" Nov 22 03:08:43 crc kubenswrapper[4922]: I1122 03:08:43.592046 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r42f5\" (UniqueName: \"kubernetes.io/projected/59de3215-8a02-4c17-9ccc-395c94f69512-kube-api-access-r42f5\") pod \"ovn-northd-0\" (UID: \"59de3215-8a02-4c17-9ccc-395c94f69512\") " pod="openstack/ovn-northd-0" Nov 22 03:08:43 crc kubenswrapper[4922]: I1122 03:08:43.592082 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/59de3215-8a02-4c17-9ccc-395c94f69512-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"59de3215-8a02-4c17-9ccc-395c94f69512\") " pod="openstack/ovn-northd-0" Nov 22 03:08:43 crc kubenswrapper[4922]: I1122 03:08:43.592148 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59de3215-8a02-4c17-9ccc-395c94f69512-config\") pod \"ovn-northd-0\" (UID: \"59de3215-8a02-4c17-9ccc-395c94f69512\") " pod="openstack/ovn-northd-0" Nov 22 03:08:43 crc kubenswrapper[4922]: 
I1122 03:08:43.694910 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59de3215-8a02-4c17-9ccc-395c94f69512-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"59de3215-8a02-4c17-9ccc-395c94f69512\") " pod="openstack/ovn-northd-0" Nov 22 03:08:43 crc kubenswrapper[4922]: I1122 03:08:43.695014 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/59de3215-8a02-4c17-9ccc-395c94f69512-scripts\") pod \"ovn-northd-0\" (UID: \"59de3215-8a02-4c17-9ccc-395c94f69512\") " pod="openstack/ovn-northd-0" Nov 22 03:08:43 crc kubenswrapper[4922]: I1122 03:08:43.695068 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r42f5\" (UniqueName: \"kubernetes.io/projected/59de3215-8a02-4c17-9ccc-395c94f69512-kube-api-access-r42f5\") pod \"ovn-northd-0\" (UID: \"59de3215-8a02-4c17-9ccc-395c94f69512\") " pod="openstack/ovn-northd-0" Nov 22 03:08:43 crc kubenswrapper[4922]: I1122 03:08:43.695121 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/59de3215-8a02-4c17-9ccc-395c94f69512-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"59de3215-8a02-4c17-9ccc-395c94f69512\") " pod="openstack/ovn-northd-0" Nov 22 03:08:43 crc kubenswrapper[4922]: I1122 03:08:43.695183 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59de3215-8a02-4c17-9ccc-395c94f69512-config\") pod \"ovn-northd-0\" (UID: \"59de3215-8a02-4c17-9ccc-395c94f69512\") " pod="openstack/ovn-northd-0" Nov 22 03:08:43 crc kubenswrapper[4922]: I1122 03:08:43.695250 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/59de3215-8a02-4c17-9ccc-395c94f69512-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"59de3215-8a02-4c17-9ccc-395c94f69512\") " pod="openstack/ovn-northd-0" Nov 22 03:08:43 crc kubenswrapper[4922]: I1122 03:08:43.695296 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/59de3215-8a02-4c17-9ccc-395c94f69512-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"59de3215-8a02-4c17-9ccc-395c94f69512\") " pod="openstack/ovn-northd-0" Nov 22 03:08:43 crc kubenswrapper[4922]: I1122 03:08:43.696160 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59de3215-8a02-4c17-9ccc-395c94f69512-config\") pod \"ovn-northd-0\" (UID: \"59de3215-8a02-4c17-9ccc-395c94f69512\") " pod="openstack/ovn-northd-0" Nov 22 03:08:43 crc kubenswrapper[4922]: I1122 03:08:43.696416 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/59de3215-8a02-4c17-9ccc-395c94f69512-scripts\") pod \"ovn-northd-0\" (UID: \"59de3215-8a02-4c17-9ccc-395c94f69512\") " pod="openstack/ovn-northd-0" Nov 22 03:08:43 crc kubenswrapper[4922]: I1122 03:08:43.696610 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/59de3215-8a02-4c17-9ccc-395c94f69512-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"59de3215-8a02-4c17-9ccc-395c94f69512\") " pod="openstack/ovn-northd-0" Nov 22 03:08:43 crc kubenswrapper[4922]: I1122 03:08:43.703003 4922 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59de3215-8a02-4c17-9ccc-395c94f69512-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"59de3215-8a02-4c17-9ccc-395c94f69512\") " pod="openstack/ovn-northd-0" Nov 22 03:08:43 crc kubenswrapper[4922]: I1122 03:08:43.703089 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/59de3215-8a02-4c17-9ccc-395c94f69512-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"59de3215-8a02-4c17-9ccc-395c94f69512\") " pod="openstack/ovn-northd-0" Nov 22 03:08:43 crc kubenswrapper[4922]: I1122 03:08:43.704588 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/59de3215-8a02-4c17-9ccc-395c94f69512-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"59de3215-8a02-4c17-9ccc-395c94f69512\") " pod="openstack/ovn-northd-0" Nov 22 03:08:43 crc kubenswrapper[4922]: I1122 03:08:43.712234 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r42f5\" (UniqueName: \"kubernetes.io/projected/59de3215-8a02-4c17-9ccc-395c94f69512-kube-api-access-r42f5\") pod \"ovn-northd-0\" (UID: \"59de3215-8a02-4c17-9ccc-395c94f69512\") " pod="openstack/ovn-northd-0" Nov 22 03:08:43 crc kubenswrapper[4922]: I1122 03:08:43.805352 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Nov 22 03:08:44 crc kubenswrapper[4922]: I1122 03:08:44.166541 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-l6qxq" event={"ID":"121927b8-f52a-4b01-89c1-85b1e694906a","Type":"ContainerStarted","Data":"38ae327a6653dd7e3f571e10328cae3893315b5aa08c4541c1d98eca93ff60d3"} Nov 22 03:08:44 crc kubenswrapper[4922]: I1122 03:08:44.167083 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-l6qxq" Nov 22 03:08:44 crc kubenswrapper[4922]: I1122 03:08:44.169279 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-997qf" event={"ID":"c3c54954-22ce-4674-857c-72d2dce55910","Type":"ContainerStarted","Data":"f2132549d2f271f23dcd0651695be5dc9f87a2c1059a8e1a4eef0a8e739eef51"} Nov 22 03:08:44 crc kubenswrapper[4922]: I1122 03:08:44.169539 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7f896c8c65-997qf" Nov 22 03:08:44 crc kubenswrapper[4922]: I1122 03:08:44.192455 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-l6qxq" podStartSLOduration=3.192432417 podStartE2EDuration="3.192432417s" podCreationTimestamp="2025-11-22 03:08:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:08:44.184708232 +0000 UTC m=+960.223230144" watchObservedRunningTime="2025-11-22 03:08:44.192432417 +0000 UTC m=+960.230954309" Nov 22 03:08:44 crc kubenswrapper[4922]: I1122 03:08:44.212787 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7f896c8c65-997qf" podStartSLOduration=3.212736635 podStartE2EDuration="3.212736635s" podCreationTimestamp="2025-11-22 03:08:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:08:44.210128942 +0000 
UTC m=+960.248650844" watchObservedRunningTime="2025-11-22 03:08:44.212736635 +0000 UTC m=+960.251258527" Nov 22 03:08:44 crc kubenswrapper[4922]: I1122 03:08:44.353038 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Nov 22 03:08:44 crc kubenswrapper[4922]: W1122 03:08:44.357921 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59de3215_8a02_4c17_9ccc_395c94f69512.slice/crio-c1bfa3ba632af058ffcb1bb704c5c21039975ddd5051931132294a60dd2c336b WatchSource:0}: Error finding container c1bfa3ba632af058ffcb1bb704c5c21039975ddd5051931132294a60dd2c336b: Status 404 returned error can't find the container with id c1bfa3ba632af058ffcb1bb704c5c21039975ddd5051931132294a60dd2c336b Nov 22 03:08:45 crc kubenswrapper[4922]: I1122 03:08:45.079131 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Nov 22 03:08:45 crc kubenswrapper[4922]: I1122 03:08:45.182394 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"59de3215-8a02-4c17-9ccc-395c94f69512","Type":"ContainerStarted","Data":"c1bfa3ba632af058ffcb1bb704c5c21039975ddd5051931132294a60dd2c336b"} Nov 22 03:08:45 crc kubenswrapper[4922]: I1122 03:08:45.313189 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cef5a6d7-f3cf-459b-9e5a-a6a328d0e020" path="/var/lib/kubelet/pods/cef5a6d7-f3cf-459b-9e5a-a6a328d0e020/volumes" Nov 22 03:08:45 crc kubenswrapper[4922]: I1122 03:08:45.359005 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Nov 22 03:08:45 crc kubenswrapper[4922]: I1122 03:08:45.407257 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Nov 22 03:08:46 crc kubenswrapper[4922]: I1122 03:08:46.194275 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"59de3215-8a02-4c17-9ccc-395c94f69512","Type":"ContainerStarted","Data":"851d93a09b22041e35cb2a6f337ca583d4a205a3cedda601d9339056d4147f05"} Nov 22 03:08:46 crc kubenswrapper[4922]: I1122 03:08:46.194976 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Nov 22 03:08:46 crc kubenswrapper[4922]: I1122 03:08:46.194990 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"59de3215-8a02-4c17-9ccc-395c94f69512","Type":"ContainerStarted","Data":"648fad439fc2e2b0200730917f5c119a9cb3d36aa22c12ee7eb528b0f46cca01"} Nov 22 03:08:46 crc kubenswrapper[4922]: I1122 03:08:46.216370 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.171876648 podStartE2EDuration="3.216352412s" podCreationTimestamp="2025-11-22 03:08:43 +0000 UTC" firstStartedPulling="2025-11-22 03:08:44.362239556 +0000 UTC m=+960.400761438" lastFinishedPulling="2025-11-22 03:08:45.40671531 +0000 UTC m=+961.445237202" observedRunningTime="2025-11-22 03:08:46.213805481 +0000 UTC m=+962.252327373" watchObservedRunningTime="2025-11-22 03:08:46.216352412 +0000 UTC m=+962.254874304" Nov 22 03:08:47 crc kubenswrapper[4922]: I1122 03:08:47.411553 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Nov 22 03:08:47 crc kubenswrapper[4922]: I1122 03:08:47.461890 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/openstack-galera-0" Nov 22 03:08:51 crc kubenswrapper[4922]: I1122 03:08:51.923287 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7f896c8c65-997qf" Nov 22 03:08:52 crc kubenswrapper[4922]: I1122 03:08:52.176063 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-l6qxq" Nov 22 03:08:52 crc kubenswrapper[4922]: I1122 03:08:52.229405 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-997qf"] Nov 22 03:08:52 crc kubenswrapper[4922]: I1122 03:08:52.260243 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7f896c8c65-997qf" podUID="c3c54954-22ce-4674-857c-72d2dce55910" containerName="dnsmasq-dns" containerID="cri-o://f2132549d2f271f23dcd0651695be5dc9f87a2c1059a8e1a4eef0a8e739eef51" gracePeriod=10 Nov 22 03:08:52 crc kubenswrapper[4922]: I1122 03:08:52.990818 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-vd5np"] Nov 22 03:08:52 crc kubenswrapper[4922]: I1122 03:08:52.991989 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-vd5np" Nov 22 03:08:53 crc kubenswrapper[4922]: I1122 03:08:53.001615 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-vd5np"] Nov 22 03:08:53 crc kubenswrapper[4922]: I1122 03:08:53.096122 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnq2b\" (UniqueName: \"kubernetes.io/projected/95e38312-9328-45d0-80de-9d4a3e94b309-kube-api-access-vnq2b\") pod \"keystone-db-create-vd5np\" (UID: \"95e38312-9328-45d0-80de-9d4a3e94b309\") " pod="openstack/keystone-db-create-vd5np" Nov 22 03:08:53 crc kubenswrapper[4922]: I1122 03:08:53.198113 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnq2b\" (UniqueName: \"kubernetes.io/projected/95e38312-9328-45d0-80de-9d4a3e94b309-kube-api-access-vnq2b\") pod \"keystone-db-create-vd5np\" (UID: \"95e38312-9328-45d0-80de-9d4a3e94b309\") " pod="openstack/keystone-db-create-vd5np" Nov 22 03:08:53 crc kubenswrapper[4922]: I1122 03:08:53.219102 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnq2b\" (UniqueName: \"kubernetes.io/projected/95e38312-9328-45d0-80de-9d4a3e94b309-kube-api-access-vnq2b\") pod \"keystone-db-create-vd5np\" (UID: \"95e38312-9328-45d0-80de-9d4a3e94b309\") " pod="openstack/keystone-db-create-vd5np" Nov 22 03:08:53 crc kubenswrapper[4922]: I1122 03:08:53.280604 4922 generic.go:334] "Generic (PLEG): container finished" podID="c3c54954-22ce-4674-857c-72d2dce55910" containerID="f2132549d2f271f23dcd0651695be5dc9f87a2c1059a8e1a4eef0a8e739eef51" exitCode=0 Nov 22 03:08:53 crc kubenswrapper[4922]: I1122 03:08:53.280677 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-997qf" event={"ID":"c3c54954-22ce-4674-857c-72d2dce55910","Type":"ContainerDied","Data":"f2132549d2f271f23dcd0651695be5dc9f87a2c1059a8e1a4eef0a8e739eef51"} Nov 22 03:08:53 crc kubenswrapper[4922]: I1122 03:08:53.280709 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-997qf" event={"ID":"c3c54954-22ce-4674-857c-72d2dce55910","Type":"ContainerDied","Data":"a2571d056a2575944a88edaea0416c9790059791e9f2889da807e945bcf39a32"} Nov 22 03:08:53 crc kubenswrapper[4922]: I1122 
03:08:53.280722 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2571d056a2575944a88edaea0416c9790059791e9f2889da807e945bcf39a32" Nov 22 03:08:53 crc kubenswrapper[4922]: I1122 03:08:53.299325 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-997qf" Nov 22 03:08:53 crc kubenswrapper[4922]: I1122 03:08:53.321716 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-vd5np" Nov 22 03:08:53 crc kubenswrapper[4922]: I1122 03:08:53.369250 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-7pzg4"] Nov 22 03:08:53 crc kubenswrapper[4922]: E1122 03:08:53.370694 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3c54954-22ce-4674-857c-72d2dce55910" containerName="dnsmasq-dns" Nov 22 03:08:53 crc kubenswrapper[4922]: I1122 03:08:53.370876 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3c54954-22ce-4674-857c-72d2dce55910" containerName="dnsmasq-dns" Nov 22 03:08:53 crc kubenswrapper[4922]: E1122 03:08:53.371000 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3c54954-22ce-4674-857c-72d2dce55910" containerName="init" Nov 22 03:08:53 crc kubenswrapper[4922]: I1122 03:08:53.371082 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3c54954-22ce-4674-857c-72d2dce55910" containerName="init" Nov 22 03:08:53 crc kubenswrapper[4922]: I1122 03:08:53.371384 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3c54954-22ce-4674-857c-72d2dce55910" containerName="dnsmasq-dns" Nov 22 03:08:53 crc kubenswrapper[4922]: I1122 03:08:53.372181 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-7pzg4"] Nov 22 03:08:53 crc kubenswrapper[4922]: I1122 03:08:53.372380 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-7pzg4" Nov 22 03:08:53 crc kubenswrapper[4922]: I1122 03:08:53.402026 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c3c54954-22ce-4674-857c-72d2dce55910-ovsdbserver-sb\") pod \"c3c54954-22ce-4674-857c-72d2dce55910\" (UID: \"c3c54954-22ce-4674-857c-72d2dce55910\") " Nov 22 03:08:53 crc kubenswrapper[4922]: I1122 03:08:53.402132 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3c54954-22ce-4674-857c-72d2dce55910-dns-svc\") pod \"c3c54954-22ce-4674-857c-72d2dce55910\" (UID: \"c3c54954-22ce-4674-857c-72d2dce55910\") " Nov 22 03:08:53 crc kubenswrapper[4922]: I1122 03:08:53.402273 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3c54954-22ce-4674-857c-72d2dce55910-config\") pod \"c3c54954-22ce-4674-857c-72d2dce55910\" (UID: \"c3c54954-22ce-4674-857c-72d2dce55910\") " Nov 22 03:08:53 crc kubenswrapper[4922]: I1122 03:08:53.402314 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sk2qp\" (UniqueName: \"kubernetes.io/projected/c3c54954-22ce-4674-857c-72d2dce55910-kube-api-access-sk2qp\") pod \"c3c54954-22ce-4674-857c-72d2dce55910\" (UID: \"c3c54954-22ce-4674-857c-72d2dce55910\") " Nov 22 03:08:53 crc kubenswrapper[4922]: I1122 03:08:53.402749 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pr4zk\" (UniqueName: \"kubernetes.io/projected/f21024fc-9d0a-42e7-932f-ecd2d648d975-kube-api-access-pr4zk\") pod \"placement-db-create-7pzg4\" (UID: \"f21024fc-9d0a-42e7-932f-ecd2d648d975\") " pod="openstack/placement-db-create-7pzg4" Nov 22 03:08:53 crc kubenswrapper[4922]: I1122 03:08:53.407098 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3c54954-22ce-4674-857c-72d2dce55910-kube-api-access-sk2qp" (OuterVolumeSpecName: "kube-api-access-sk2qp") pod "c3c54954-22ce-4674-857c-72d2dce55910" (UID: "c3c54954-22ce-4674-857c-72d2dce55910"). InnerVolumeSpecName "kube-api-access-sk2qp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:08:53 crc kubenswrapper[4922]: I1122 03:08:53.437692 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-56hzv"] Nov 22 03:08:53 crc kubenswrapper[4922]: I1122 03:08:53.440669 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-56hzv" Nov 22 03:08:53 crc kubenswrapper[4922]: I1122 03:08:53.454959 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-56hzv"] Nov 22 03:08:53 crc kubenswrapper[4922]: I1122 03:08:53.472240 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3c54954-22ce-4674-857c-72d2dce55910-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c3c54954-22ce-4674-857c-72d2dce55910" (UID: "c3c54954-22ce-4674-857c-72d2dce55910"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:08:53 crc kubenswrapper[4922]: I1122 03:08:53.477539 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3c54954-22ce-4674-857c-72d2dce55910-config" (OuterVolumeSpecName: "config") pod "c3c54954-22ce-4674-857c-72d2dce55910" (UID: "c3c54954-22ce-4674-857c-72d2dce55910"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:08:53 crc kubenswrapper[4922]: I1122 03:08:53.498510 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3c54954-22ce-4674-857c-72d2dce55910-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c3c54954-22ce-4674-857c-72d2dce55910" (UID: "c3c54954-22ce-4674-857c-72d2dce55910"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:08:53 crc kubenswrapper[4922]: I1122 03:08:53.504218 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgm66\" (UniqueName: \"kubernetes.io/projected/7ca3bda2-7048-4e36-99af-fa68021dffee-kube-api-access-tgm66\") pod \"glance-db-create-56hzv\" (UID: \"7ca3bda2-7048-4e36-99af-fa68021dffee\") " pod="openstack/glance-db-create-56hzv" Nov 22 03:08:53 crc kubenswrapper[4922]: I1122 03:08:53.504257 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pr4zk\" (UniqueName: \"kubernetes.io/projected/f21024fc-9d0a-42e7-932f-ecd2d648d975-kube-api-access-pr4zk\") pod \"placement-db-create-7pzg4\" (UID: \"f21024fc-9d0a-42e7-932f-ecd2d648d975\") " pod="openstack/placement-db-create-7pzg4" Nov 22 03:08:53 crc kubenswrapper[4922]: I1122 03:08:53.504313 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3c54954-22ce-4674-857c-72d2dce55910-config\") on node \"crc\" DevicePath \"\"" Nov 22 03:08:53 crc kubenswrapper[4922]: I1122 03:08:53.504325 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sk2qp\" (UniqueName: \"kubernetes.io/projected/c3c54954-22ce-4674-857c-72d2dce55910-kube-api-access-sk2qp\") on node \"crc\" DevicePath \"\"" Nov 22 03:08:53 crc kubenswrapper[4922]: I1122 03:08:53.504335 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c3c54954-22ce-4674-857c-72d2dce55910-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 03:08:53 crc kubenswrapper[4922]: I1122 03:08:53.504344 4922 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3c54954-22ce-4674-857c-72d2dce55910-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 03:08:53 crc kubenswrapper[4922]: I1122 03:08:53.520799 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pr4zk\" (UniqueName: \"kubernetes.io/projected/f21024fc-9d0a-42e7-932f-ecd2d648d975-kube-api-access-pr4zk\") pod \"placement-db-create-7pzg4\" (UID: \"f21024fc-9d0a-42e7-932f-ecd2d648d975\") " pod="openstack/placement-db-create-7pzg4" Nov 22 03:08:53 crc kubenswrapper[4922]: I1122 03:08:53.528880 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-7pzg4" Nov 22 03:08:53 crc kubenswrapper[4922]: I1122 03:08:53.605800 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgm66\" (UniqueName: \"kubernetes.io/projected/7ca3bda2-7048-4e36-99af-fa68021dffee-kube-api-access-tgm66\") pod \"glance-db-create-56hzv\" (UID: \"7ca3bda2-7048-4e36-99af-fa68021dffee\") " pod="openstack/glance-db-create-56hzv" Nov 22 03:08:53 crc kubenswrapper[4922]: I1122 03:08:53.628295 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgm66\" (UniqueName: \"kubernetes.io/projected/7ca3bda2-7048-4e36-99af-fa68021dffee-kube-api-access-tgm66\") pod \"glance-db-create-56hzv\" (UID: \"7ca3bda2-7048-4e36-99af-fa68021dffee\") " pod="openstack/glance-db-create-56hzv" Nov 22 03:08:53 crc kubenswrapper[4922]: I1122 03:08:53.829073 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-vd5np"] Nov 22 03:08:53 crc kubenswrapper[4922]: W1122 03:08:53.840367 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95e38312_9328_45d0_80de_9d4a3e94b309.slice/crio-2407af6a7d78cd145519c04a5bd0b87f167f8ae036383240e1e46ed3e43253b3 WatchSource:0}: Error finding container 2407af6a7d78cd145519c04a5bd0b87f167f8ae036383240e1e46ed3e43253b3: Status 404 returned error can't find the container with id 2407af6a7d78cd145519c04a5bd0b87f167f8ae036383240e1e46ed3e43253b3 Nov 22 03:08:53 crc kubenswrapper[4922]: I1122 03:08:53.843373 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-56hzv" Nov 22 03:08:53 crc kubenswrapper[4922]: I1122 03:08:53.967546 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-7pzg4"] Nov 22 03:08:53 crc kubenswrapper[4922]: W1122 03:08:53.975099 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf21024fc_9d0a_42e7_932f_ecd2d648d975.slice/crio-24959257d68e4b6d92a46d1af4f4e34a81a0c9e28c60233a00c8fe785afc01ce WatchSource:0}: Error finding container 24959257d68e4b6d92a46d1af4f4e34a81a0c9e28c60233a00c8fe785afc01ce: Status 404 returned error can't find the container with id 24959257d68e4b6d92a46d1af4f4e34a81a0c9e28c60233a00c8fe785afc01ce Nov 22 03:08:54 crc kubenswrapper[4922]: I1122 03:08:54.290168 4922 generic.go:334] "Generic (PLEG): container finished" podID="f21024fc-9d0a-42e7-932f-ecd2d648d975" containerID="28c9cac5320bb3f2a7604588d4cd72ee5f68a9ac3641bfc994efc50207765f38" exitCode=0 Nov 22 03:08:54 crc kubenswrapper[4922]: I1122 03:08:54.290206 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-7pzg4" event={"ID":"f21024fc-9d0a-42e7-932f-ecd2d648d975","Type":"ContainerDied","Data":"28c9cac5320bb3f2a7604588d4cd72ee5f68a9ac3641bfc994efc50207765f38"} Nov 22 03:08:54 crc kubenswrapper[4922]: I1122 03:08:54.290244 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-7pzg4" event={"ID":"f21024fc-9d0a-42e7-932f-ecd2d648d975","Type":"ContainerStarted","Data":"24959257d68e4b6d92a46d1af4f4e34a81a0c9e28c60233a00c8fe785afc01ce"} Nov 22 03:08:54 crc kubenswrapper[4922]: I1122 03:08:54.291939 4922 generic.go:334] "Generic (PLEG): container finished" podID="95e38312-9328-45d0-80de-9d4a3e94b309" containerID="49af4be13e64c37e69ecbd10922c5af17213be191c4fe4282c7036a69fc06404" exitCode=0 
Nov 22 03:08:54 crc kubenswrapper[4922]: I1122 03:08:54.292006 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-997qf" Nov 22 03:08:54 crc kubenswrapper[4922]: I1122 03:08:54.292612 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-vd5np" event={"ID":"95e38312-9328-45d0-80de-9d4a3e94b309","Type":"ContainerDied","Data":"49af4be13e64c37e69ecbd10922c5af17213be191c4fe4282c7036a69fc06404"} Nov 22 03:08:54 crc kubenswrapper[4922]: I1122 03:08:54.292642 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-vd5np" event={"ID":"95e38312-9328-45d0-80de-9d4a3e94b309","Type":"ContainerStarted","Data":"2407af6a7d78cd145519c04a5bd0b87f167f8ae036383240e1e46ed3e43253b3"} Nov 22 03:08:54 crc kubenswrapper[4922]: I1122 03:08:54.307511 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-56hzv"] Nov 22 03:08:54 crc kubenswrapper[4922]: W1122 03:08:54.320107 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ca3bda2_7048_4e36_99af_fa68021dffee.slice/crio-8adae7e6b5e29e64ac87ca6a6b26989d1485f5b02cc4e3d0992f6fe2ae1430f5 WatchSource:0}: Error finding container 8adae7e6b5e29e64ac87ca6a6b26989d1485f5b02cc4e3d0992f6fe2ae1430f5: Status 404 returned error can't find the container with id 8adae7e6b5e29e64ac87ca6a6b26989d1485f5b02cc4e3d0992f6fe2ae1430f5 Nov 22 03:08:54 crc kubenswrapper[4922]: I1122 03:08:54.350207 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-997qf"] Nov 22 03:08:54 crc kubenswrapper[4922]: I1122 03:08:54.357760 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-997qf"] Nov 22 03:08:55 crc kubenswrapper[4922]: I1122 03:08:55.308511 4922 generic.go:334] "Generic (PLEG): container finished" podID="7ca3bda2-7048-4e36-99af-fa68021dffee" containerID="4c7f81b5ea191cceb9b242326dd29bc72cc51fde6a7951d5413fd6ca3adf1014" exitCode=0 Nov 22 03:08:55 crc kubenswrapper[4922]: I1122 03:08:55.320336 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3c54954-22ce-4674-857c-72d2dce55910" path="/var/lib/kubelet/pods/c3c54954-22ce-4674-857c-72d2dce55910/volumes" Nov 22 03:08:55 crc kubenswrapper[4922]: I1122 03:08:55.320969 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-56hzv" event={"ID":"7ca3bda2-7048-4e36-99af-fa68021dffee","Type":"ContainerDied","Data":"4c7f81b5ea191cceb9b242326dd29bc72cc51fde6a7951d5413fd6ca3adf1014"} Nov 22 03:08:55 crc kubenswrapper[4922]: I1122 03:08:55.320995 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-56hzv" event={"ID":"7ca3bda2-7048-4e36-99af-fa68021dffee","Type":"ContainerStarted","Data":"8adae7e6b5e29e64ac87ca6a6b26989d1485f5b02cc4e3d0992f6fe2ae1430f5"} Nov 22 03:08:55 crc kubenswrapper[4922]: I1122 03:08:55.845460 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-7pzg4" Nov 22 03:08:55 crc kubenswrapper[4922]: I1122 03:08:55.857394 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-vd5np" Nov 22 03:08:55 crc kubenswrapper[4922]: I1122 03:08:55.962504 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pr4zk\" (UniqueName: \"kubernetes.io/projected/f21024fc-9d0a-42e7-932f-ecd2d648d975-kube-api-access-pr4zk\") pod \"f21024fc-9d0a-42e7-932f-ecd2d648d975\" (UID: \"f21024fc-9d0a-42e7-932f-ecd2d648d975\") " Nov 22 03:08:55 crc kubenswrapper[4922]: I1122 03:08:55.962607 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnq2b\" (UniqueName: \"kubernetes.io/projected/95e38312-9328-45d0-80de-9d4a3e94b309-kube-api-access-vnq2b\") pod \"95e38312-9328-45d0-80de-9d4a3e94b309\" (UID: \"95e38312-9328-45d0-80de-9d4a3e94b309\") " Nov 22 03:08:55 crc kubenswrapper[4922]: I1122 03:08:55.971617 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95e38312-9328-45d0-80de-9d4a3e94b309-kube-api-access-vnq2b" (OuterVolumeSpecName: "kube-api-access-vnq2b") pod "95e38312-9328-45d0-80de-9d4a3e94b309" (UID: "95e38312-9328-45d0-80de-9d4a3e94b309"). InnerVolumeSpecName "kube-api-access-vnq2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:08:55 crc kubenswrapper[4922]: I1122 03:08:55.972383 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f21024fc-9d0a-42e7-932f-ecd2d648d975-kube-api-access-pr4zk" (OuterVolumeSpecName: "kube-api-access-pr4zk") pod "f21024fc-9d0a-42e7-932f-ecd2d648d975" (UID: "f21024fc-9d0a-42e7-932f-ecd2d648d975"). InnerVolumeSpecName "kube-api-access-pr4zk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:08:56 crc kubenswrapper[4922]: I1122 03:08:56.065015 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pr4zk\" (UniqueName: \"kubernetes.io/projected/f21024fc-9d0a-42e7-932f-ecd2d648d975-kube-api-access-pr4zk\") on node \"crc\" DevicePath \"\"" Nov 22 03:08:56 crc kubenswrapper[4922]: I1122 03:08:56.065079 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnq2b\" (UniqueName: \"kubernetes.io/projected/95e38312-9328-45d0-80de-9d4a3e94b309-kube-api-access-vnq2b\") on node \"crc\" DevicePath \"\"" Nov 22 03:08:56 crc kubenswrapper[4922]: I1122 03:08:56.322905 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-7pzg4" event={"ID":"f21024fc-9d0a-42e7-932f-ecd2d648d975","Type":"ContainerDied","Data":"24959257d68e4b6d92a46d1af4f4e34a81a0c9e28c60233a00c8fe785afc01ce"} Nov 22 03:08:56 crc kubenswrapper[4922]: I1122 03:08:56.322949 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24959257d68e4b6d92a46d1af4f4e34a81a0c9e28c60233a00c8fe785afc01ce" Nov 22 03:08:56 crc kubenswrapper[4922]: I1122 03:08:56.323009 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-7pzg4" Nov 22 03:08:56 crc kubenswrapper[4922]: I1122 03:08:56.328547 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-vd5np" event={"ID":"95e38312-9328-45d0-80de-9d4a3e94b309","Type":"ContainerDied","Data":"2407af6a7d78cd145519c04a5bd0b87f167f8ae036383240e1e46ed3e43253b3"} Nov 22 03:08:56 crc kubenswrapper[4922]: I1122 03:08:56.328687 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-vd5np" Nov 22 03:08:56 crc kubenswrapper[4922]: I1122 03:08:56.328754 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2407af6a7d78cd145519c04a5bd0b87f167f8ae036383240e1e46ed3e43253b3" Nov 22 03:08:56 crc kubenswrapper[4922]: I1122 03:08:56.756913 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-56hzv" Nov 22 03:08:56 crc kubenswrapper[4922]: I1122 03:08:56.911646 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgm66\" (UniqueName: \"kubernetes.io/projected/7ca3bda2-7048-4e36-99af-fa68021dffee-kube-api-access-tgm66\") pod \"7ca3bda2-7048-4e36-99af-fa68021dffee\" (UID: \"7ca3bda2-7048-4e36-99af-fa68021dffee\") " Nov 22 03:08:56 crc kubenswrapper[4922]: I1122 03:08:56.918391 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ca3bda2-7048-4e36-99af-fa68021dffee-kube-api-access-tgm66" (OuterVolumeSpecName: "kube-api-access-tgm66") pod "7ca3bda2-7048-4e36-99af-fa68021dffee" (UID: "7ca3bda2-7048-4e36-99af-fa68021dffee"). InnerVolumeSpecName "kube-api-access-tgm66". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:08:57 crc kubenswrapper[4922]: I1122 03:08:57.014161 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgm66\" (UniqueName: \"kubernetes.io/projected/7ca3bda2-7048-4e36-99af-fa68021dffee-kube-api-access-tgm66\") on node \"crc\" DevicePath \"\"" Nov 22 03:08:57 crc kubenswrapper[4922]: I1122 03:08:57.340450 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-56hzv" event={"ID":"7ca3bda2-7048-4e36-99af-fa68021dffee","Type":"ContainerDied","Data":"8adae7e6b5e29e64ac87ca6a6b26989d1485f5b02cc4e3d0992f6fe2ae1430f5"} Nov 22 03:08:57 crc kubenswrapper[4922]: I1122 03:08:57.340509 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8adae7e6b5e29e64ac87ca6a6b26989d1485f5b02cc4e3d0992f6fe2ae1430f5" Nov 22 03:08:57 crc kubenswrapper[4922]: I1122 03:08:57.340559 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-56hzv" Nov 22 03:08:58 crc kubenswrapper[4922]: I1122 03:08:58.895487 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Nov 22 03:09:03 crc kubenswrapper[4922]: I1122 03:09:03.039316 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-be68-account-create-jx25h"] Nov 22 03:09:03 crc kubenswrapper[4922]: E1122 03:09:03.040420 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f21024fc-9d0a-42e7-932f-ecd2d648d975" containerName="mariadb-database-create" Nov 22 03:09:03 crc kubenswrapper[4922]: I1122 03:09:03.040440 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="f21024fc-9d0a-42e7-932f-ecd2d648d975" containerName="mariadb-database-create" Nov 22 03:09:03 crc kubenswrapper[4922]: E1122 03:09:03.040477 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ca3bda2-7048-4e36-99af-fa68021dffee" containerName="mariadb-database-create" Nov 22 03:09:03 crc kubenswrapper[4922]: I1122 03:09:03.040490 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ca3bda2-7048-4e36-99af-fa68021dffee" containerName="mariadb-database-create" Nov 22 03:09:03 crc kubenswrapper[4922]: E1122 03:09:03.040514 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95e38312-9328-45d0-80de-9d4a3e94b309" containerName="mariadb-database-create" Nov 22 03:09:03 crc kubenswrapper[4922]: I1122 03:09:03.040526 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="95e38312-9328-45d0-80de-9d4a3e94b309" containerName="mariadb-database-create" Nov 22 03:09:03 crc kubenswrapper[4922]: I1122 03:09:03.040788 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ca3bda2-7048-4e36-99af-fa68021dffee" containerName="mariadb-database-create" Nov 22 03:09:03 crc kubenswrapper[4922]: I1122 03:09:03.040816 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="95e38312-9328-45d0-80de-9d4a3e94b309" containerName="mariadb-database-create" Nov 22 03:09:03 crc kubenswrapper[4922]: I1122 03:09:03.040891 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="f21024fc-9d0a-42e7-932f-ecd2d648d975" containerName="mariadb-database-create" Nov 22 03:09:03 crc kubenswrapper[4922]: I1122 03:09:03.041703 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-be68-account-create-jx25h" Nov 22 03:09:03 crc kubenswrapper[4922]: I1122 03:09:03.044318 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Nov 22 03:09:03 crc kubenswrapper[4922]: I1122 03:09:03.051953 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-be68-account-create-jx25h"] Nov 22 03:09:03 crc kubenswrapper[4922]: I1122 03:09:03.157120 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqnw9\" (UniqueName: \"kubernetes.io/projected/f4af2488-ac01-43c3-9c7f-672a2d20456b-kube-api-access-rqnw9\") pod \"keystone-be68-account-create-jx25h\" (UID: \"f4af2488-ac01-43c3-9c7f-672a2d20456b\") " pod="openstack/keystone-be68-account-create-jx25h" Nov 22 03:09:03 crc kubenswrapper[4922]: I1122 03:09:03.259054 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqnw9\" (UniqueName: \"kubernetes.io/projected/f4af2488-ac01-43c3-9c7f-672a2d20456b-kube-api-access-rqnw9\") pod \"keystone-be68-account-create-jx25h\" (UID: \"f4af2488-ac01-43c3-9c7f-672a2d20456b\") " pod="openstack/keystone-be68-account-create-jx25h" Nov 22 03:09:03 crc kubenswrapper[4922]: I1122 03:09:03.291634 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqnw9\" (UniqueName: \"kubernetes.io/projected/f4af2488-ac01-43c3-9c7f-672a2d20456b-kube-api-access-rqnw9\") pod \"keystone-be68-account-create-jx25h\" (UID: \"f4af2488-ac01-43c3-9c7f-672a2d20456b\") " pod="openstack/keystone-be68-account-create-jx25h" Nov 22 03:09:03 crc kubenswrapper[4922]: I1122 03:09:03.416111 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-be68-account-create-jx25h" Nov 22 03:09:03 crc kubenswrapper[4922]: I1122 03:09:03.505562 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-4209-account-create-kvqs9"] Nov 22 03:09:03 crc kubenswrapper[4922]: I1122 03:09:03.508424 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-4209-account-create-kvqs9" Nov 22 03:09:03 crc kubenswrapper[4922]: I1122 03:09:03.511055 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Nov 22 03:09:03 crc kubenswrapper[4922]: I1122 03:09:03.538134 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-4209-account-create-kvqs9"] Nov 22 03:09:03 crc kubenswrapper[4922]: I1122 03:09:03.665555 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkgp7\" (UniqueName: \"kubernetes.io/projected/968a0f99-7d65-431f-a242-11ac5c861a27-kube-api-access-dkgp7\") pod \"placement-4209-account-create-kvqs9\" (UID: \"968a0f99-7d65-431f-a242-11ac5c861a27\") " pod="openstack/placement-4209-account-create-kvqs9" Nov 22 03:09:03 crc kubenswrapper[4922]: I1122 03:09:03.706171 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-5adc-account-create-l8gt8"] Nov 22 03:09:03 crc kubenswrapper[4922]: I1122 03:09:03.708015 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-5adc-account-create-l8gt8" Nov 22 03:09:03 crc kubenswrapper[4922]: I1122 03:09:03.711022 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Nov 22 03:09:03 crc kubenswrapper[4922]: I1122 03:09:03.714270 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-5adc-account-create-l8gt8"] Nov 22 03:09:03 crc kubenswrapper[4922]: I1122 03:09:03.767313 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkgp7\" (UniqueName: \"kubernetes.io/projected/968a0f99-7d65-431f-a242-11ac5c861a27-kube-api-access-dkgp7\") pod \"placement-4209-account-create-kvqs9\" (UID: \"968a0f99-7d65-431f-a242-11ac5c861a27\") " pod="openstack/placement-4209-account-create-kvqs9" Nov 22 03:09:03 crc kubenswrapper[4922]: I1122 03:09:03.790528 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkgp7\" (UniqueName: \"kubernetes.io/projected/968a0f99-7d65-431f-a242-11ac5c861a27-kube-api-access-dkgp7\") pod \"placement-4209-account-create-kvqs9\" (UID: \"968a0f99-7d65-431f-a242-11ac5c861a27\") " pod="openstack/placement-4209-account-create-kvqs9" Nov 22 03:09:03 crc kubenswrapper[4922]: I1122 03:09:03.842419 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-4209-account-create-kvqs9" Nov 22 03:09:03 crc kubenswrapper[4922]: I1122 03:09:03.870022 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57z48\" (UniqueName: \"kubernetes.io/projected/406317f3-f5ff-45bb-bc58-847129dd5652-kube-api-access-57z48\") pod \"glance-5adc-account-create-l8gt8\" (UID: \"406317f3-f5ff-45bb-bc58-847129dd5652\") " pod="openstack/glance-5adc-account-create-l8gt8" Nov 22 03:09:03 crc kubenswrapper[4922]: I1122 03:09:03.902585 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-be68-account-create-jx25h"] Nov 22 03:09:03 crc kubenswrapper[4922]: W1122 03:09:03.907309 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4af2488_ac01_43c3_9c7f_672a2d20456b.slice/crio-c7b0ed586fce190286a6d4a70f25ffcb9b31810793654f01cffaf1e657ec22db WatchSource:0}: Error finding container c7b0ed586fce190286a6d4a70f25ffcb9b31810793654f01cffaf1e657ec22db: Status 404 returned error can't find the container with id c7b0ed586fce190286a6d4a70f25ffcb9b31810793654f01cffaf1e657ec22db Nov 22 03:09:03 crc kubenswrapper[4922]: I1122 03:09:03.972275 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57z48\" (UniqueName: \"kubernetes.io/projected/406317f3-f5ff-45bb-bc58-847129dd5652-kube-api-access-57z48\") pod \"glance-5adc-account-create-l8gt8\" (UID: \"406317f3-f5ff-45bb-bc58-847129dd5652\") " pod="openstack/glance-5adc-account-create-l8gt8" Nov 22 03:09:04 crc kubenswrapper[4922]: I1122 03:09:04.004403 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57z48\" (UniqueName: \"kubernetes.io/projected/406317f3-f5ff-45bb-bc58-847129dd5652-kube-api-access-57z48\") pod \"glance-5adc-account-create-l8gt8\" (UID: \"406317f3-f5ff-45bb-bc58-847129dd5652\") " pod="openstack/glance-5adc-account-create-l8gt8" Nov 22 03:09:04 crc kubenswrapper[4922]: I1122 03:09:04.032712 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-5adc-account-create-l8gt8" Nov 22 03:09:04 crc kubenswrapper[4922]: I1122 03:09:04.337985 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-4209-account-create-kvqs9"] Nov 22 03:09:04 crc kubenswrapper[4922]: W1122 03:09:04.341010 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod968a0f99_7d65_431f_a242_11ac5c861a27.slice/crio-0182c9762d04221106ed3688e006c6f1d9afbdd2c57f7d445291ff85c88284cd WatchSource:0}: Error finding container 0182c9762d04221106ed3688e006c6f1d9afbdd2c57f7d445291ff85c88284cd: Status 404 returned error can't find the container with id 0182c9762d04221106ed3688e006c6f1d9afbdd2c57f7d445291ff85c88284cd Nov 22 03:09:04 crc kubenswrapper[4922]: I1122 03:09:04.414995 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-be68-account-create-jx25h" event={"ID":"f4af2488-ac01-43c3-9c7f-672a2d20456b","Type":"ContainerStarted","Data":"c7b0ed586fce190286a6d4a70f25ffcb9b31810793654f01cffaf1e657ec22db"} Nov 22 03:09:04 crc kubenswrapper[4922]: I1122 03:09:04.417413 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-4209-account-create-kvqs9" event={"ID":"968a0f99-7d65-431f-a242-11ac5c861a27","Type":"ContainerStarted","Data":"0182c9762d04221106ed3688e006c6f1d9afbdd2c57f7d445291ff85c88284cd"} Nov 22 03:09:04 crc kubenswrapper[4922]: I1122 03:09:04.502518 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-5adc-account-create-l8gt8"] Nov 22 03:09:04 crc kubenswrapper[4922]: W1122 03:09:04.510804 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod406317f3_f5ff_45bb_bc58_847129dd5652.slice/crio-98833492f5f223f89207798ca87b34262477c78930662ba7db2e0a9e1e7793c2 WatchSource:0}: Error finding container 98833492f5f223f89207798ca87b34262477c78930662ba7db2e0a9e1e7793c2: Status 404 returned error can't find the container with id 98833492f5f223f89207798ca87b34262477c78930662ba7db2e0a9e1e7793c2 Nov 22 03:09:05 crc kubenswrapper[4922]: I1122 03:09:05.426926 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5adc-account-create-l8gt8" event={"ID":"406317f3-f5ff-45bb-bc58-847129dd5652","Type":"ContainerStarted","Data":"98833492f5f223f89207798ca87b34262477c78930662ba7db2e0a9e1e7793c2"} Nov 22 03:09:05 crc kubenswrapper[4922]: I1122 03:09:05.429042 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-4209-account-create-kvqs9" event={"ID":"968a0f99-7d65-431f-a242-11ac5c861a27","Type":"ContainerStarted","Data":"92eeeddc27b7ed39fe0b4dc384d0856e9fb8ad2ddcf7e4b7de28128b21716052"} Nov 22 03:09:06 crc kubenswrapper[4922]: I1122 03:09:06.442408 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5adc-account-create-l8gt8" event={"ID":"406317f3-f5ff-45bb-bc58-847129dd5652","Type":"ContainerStarted","Data":"c26cbcb5bd09e0351e31b7719f82cb707ea4eee4a590a12f03b8a2dd21c2a50a"} Nov 22 03:09:06 crc kubenswrapper[4922]: I1122 03:09:06.478617 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-5adc-account-create-l8gt8" podStartSLOduration=3.478589264 podStartE2EDuration="3.478589264s" podCreationTimestamp="2025-11-22 03:09:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:09:06.469967067 +0000 UTC m=+982.508488989" watchObservedRunningTime="2025-11-22 03:09:06.478589264 +0000 UTC m=+982.517111166"
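The two W-level `manager.go:1169] Failed to process watch event ... Status 404` entries above are a benign cAdvisor race, not failures: the kubelet's cgroup watcher sees the new `crio-<id>` slice appear before CRI-O has finished registering the container, so the immediate runtime lookup 404s. The matching `ContainerStarted` events for the same IDs (`0182c976...`, `98833492...`) a moment later confirm both containers came up. A minimal sketch for confirming that pairing in a saved journal, assuming one entry per line and a hypothetical `kubelet.log` path:

```go
// triage404.go: a minimal sketch pairing cAdvisor "Status 404" watch warnings
// with later PLEG ContainerStarted events for the same container ID. Assumes
// a journal saved one-entry-per-line at ./kubelet.log (hypothetical path).
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

var (
	watch404 = regexp.MustCompile(`Failed to process watch event .*crio-([0-9a-f]{64}) .*Status 404`)
	started  = regexp.MustCompile(`"Type":"ContainerStarted","Data":"([0-9a-f]{64})"`)
)

func main() {
	f, err := os.Open("kubelet.log")
	if err != nil {
		panic(err)
	}
	defer f.Close()

	warned := map[string]bool{}
	sc := bufio.NewScanner(f)
	sc.Buffer(make([]byte, 0, 1<<20), 1<<20) // journal lines can be long
	for sc.Scan() {
		line := sc.Text()
		if m := watch404.FindStringSubmatch(line); m != nil {
			warned[m[1]] = true
		}
		if m := started.FindStringSubmatch(line); m != nil && warned[m[1]] {
			// The container the watcher could not find did start: benign race.
			fmt.Printf("transient 404 for %.12s..., container started later\n", m[1])
		}
	}
}
```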
Nov 22 03:09:06 crc kubenswrapper[4922]: I1122 03:09:06.498943 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-4209-account-create-kvqs9" podStartSLOduration=3.498915663 podStartE2EDuration="3.498915663s" podCreationTimestamp="2025-11-22 03:09:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:09:06.491580736 +0000 UTC m=+982.530102658" watchObservedRunningTime="2025-11-22 03:09:06.498915663 +0000 UTC m=+982.537437585" Nov 22 03:09:07 crc kubenswrapper[4922]: I1122 03:09:07.453670 4922 generic.go:334] "Generic (PLEG): container finished" podID="968a0f99-7d65-431f-a242-11ac5c861a27" containerID="92eeeddc27b7ed39fe0b4dc384d0856e9fb8ad2ddcf7e4b7de28128b21716052" exitCode=0 Nov 22 03:09:07 crc kubenswrapper[4922]: I1122 03:09:07.453820 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-4209-account-create-kvqs9" event={"ID":"968a0f99-7d65-431f-a242-11ac5c861a27","Type":"ContainerDied","Data":"92eeeddc27b7ed39fe0b4dc384d0856e9fb8ad2ddcf7e4b7de28128b21716052"} Nov 22 03:09:07 crc kubenswrapper[4922]: I1122 03:09:07.456017 4922 generic.go:334] "Generic (PLEG): container finished" podID="f4af2488-ac01-43c3-9c7f-672a2d20456b" containerID="947f4126acb81a2bc855f69d83ba89ea605517b75ecf5e91129ba122f0b2bacc" exitCode=0 Nov 22 03:09:07 crc kubenswrapper[4922]: I1122 03:09:07.456102 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-be68-account-create-jx25h" event={"ID":"f4af2488-ac01-43c3-9c7f-672a2d20456b","Type":"ContainerDied","Data":"947f4126acb81a2bc855f69d83ba89ea605517b75ecf5e91129ba122f0b2bacc"} Nov 22 03:09:07 crc kubenswrapper[4922]: I1122 03:09:07.458269 4922 generic.go:334] "Generic (PLEG): container finished" podID="406317f3-f5ff-45bb-bc58-847129dd5652" containerID="c26cbcb5bd09e0351e31b7719f82cb707ea4eee4a590a12f03b8a2dd21c2a50a" exitCode=0 Nov 22 03:09:07 crc kubenswrapper[4922]: I1122 03:09:07.458308 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5adc-account-create-l8gt8" event={"ID":"406317f3-f5ff-45bb-bc58-847129dd5652","Type":"ContainerDied","Data":"c26cbcb5bd09e0351e31b7719f82cb707ea4eee4a590a12f03b8a2dd21c2a50a"} Nov 22 03:09:08 crc kubenswrapper[4922]: I1122 03:09:08.738787 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-nlzww" podUID="5c038335-42ee-4618-a14b-b32bc0f1d53a" containerName="ovn-controller" probeResult="failure" output=< Nov 22 03:09:08 crc kubenswrapper[4922]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Nov 22 03:09:08 crc kubenswrapper[4922]: > Nov 22 03:09:08 crc kubenswrapper[4922]: I1122 03:09:08.866390 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-4209-account-create-kvqs9"
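The `Probe failed` block above shows how the kubelet reports exec readiness probes: everything between `output=<` and `>` is the probe command's combined output, and the container stays unready until the command exits 0 (the probe for `ovn-controller-nlzww` keeps failing until the `status="ready"` flip at 03:09:18, once ovn-controller connects to its database). A minimal sketch of an exec probe of this shape built from the `k8s.io/api` types; the actual probe script and timings in this deployment are not visible in the journal, so the command and numbers here are assumptions:

```go
// readiness.go: a minimal sketch of an exec readiness probe like the one
// failing above. The command and timings are illustrative assumptions; the
// real ovn-controller probe script is not shown in this journal.
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

func ovnReadinessProbe() *corev1.Probe {
	return &corev1.Probe{
		ProbeHandler: corev1.ProbeHandler{
			Exec: &corev1.ExecAction{
				// Hypothetical check: exit 0 only once the controller
				// reports a live southbound connection.
				Command: []string{"/bin/sh", "-c",
					"ovn-appctl -t ovn-controller connection-status | grep -q connected"},
			},
		},
		PeriodSeconds:    5, // roughly matches the ~5s spacing of the failures above
		FailureThreshold: 3,
	}
}

func main() {
	p := ovnReadinessProbe()
	fmt.Println("probe command:", p.Exec.Command)
}
```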
Nov 22 03:09:08 crc kubenswrapper[4922]: I1122 03:09:08.963727 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkgp7\" (UniqueName: \"kubernetes.io/projected/968a0f99-7d65-431f-a242-11ac5c861a27-kube-api-access-dkgp7\") pod \"968a0f99-7d65-431f-a242-11ac5c861a27\" (UID: \"968a0f99-7d65-431f-a242-11ac5c861a27\") " Nov 22 03:09:08 crc kubenswrapper[4922]: I1122 03:09:08.972047 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/968a0f99-7d65-431f-a242-11ac5c861a27-kube-api-access-dkgp7" (OuterVolumeSpecName: "kube-api-access-dkgp7") pod "968a0f99-7d65-431f-a242-11ac5c861a27" (UID: "968a0f99-7d65-431f-a242-11ac5c861a27"). InnerVolumeSpecName "kube-api-access-dkgp7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:09:09 crc kubenswrapper[4922]: I1122 03:09:09.034956 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-5adc-account-create-l8gt8" Nov 22 03:09:09 crc kubenswrapper[4922]: I1122 03:09:09.041655 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-be68-account-create-jx25h" Nov 22 03:09:09 crc kubenswrapper[4922]: I1122 03:09:09.066236 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkgp7\" (UniqueName: \"kubernetes.io/projected/968a0f99-7d65-431f-a242-11ac5c861a27-kube-api-access-dkgp7\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:09 crc kubenswrapper[4922]: I1122 03:09:09.167128 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57z48\" (UniqueName: \"kubernetes.io/projected/406317f3-f5ff-45bb-bc58-847129dd5652-kube-api-access-57z48\") pod \"406317f3-f5ff-45bb-bc58-847129dd5652\" (UID: \"406317f3-f5ff-45bb-bc58-847129dd5652\") " Nov 22 03:09:09 crc kubenswrapper[4922]: I1122 03:09:09.167238 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqnw9\" (UniqueName: \"kubernetes.io/projected/f4af2488-ac01-43c3-9c7f-672a2d20456b-kube-api-access-rqnw9\") pod \"f4af2488-ac01-43c3-9c7f-672a2d20456b\" (UID: \"f4af2488-ac01-43c3-9c7f-672a2d20456b\") " Nov 22 03:09:09 crc kubenswrapper[4922]: I1122 03:09:09.172551 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/406317f3-f5ff-45bb-bc58-847129dd5652-kube-api-access-57z48" (OuterVolumeSpecName: "kube-api-access-57z48") pod "406317f3-f5ff-45bb-bc58-847129dd5652" (UID: "406317f3-f5ff-45bb-bc58-847129dd5652"). InnerVolumeSpecName "kube-api-access-57z48". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:09:09 crc kubenswrapper[4922]: I1122 03:09:09.172916 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4af2488-ac01-43c3-9c7f-672a2d20456b-kube-api-access-rqnw9" (OuterVolumeSpecName: "kube-api-access-rqnw9") pod "f4af2488-ac01-43c3-9c7f-672a2d20456b" (UID: "f4af2488-ac01-43c3-9c7f-672a2d20456b"). InnerVolumeSpecName "kube-api-access-rqnw9". PluginName "kubernetes.io/projected", VolumeGidValue ""
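Each `kube-api-access-<suffix>` volume being unmounted above is the pod's projected service-account token volume: one projected mount combining the bound token, the cluster CA bundle, and the namespace file. It is the last thing torn down for these one-shot account-create pods, and `OuterVolumeSpecName` (the name in the pod spec) versus `InnerVolumeSpecName` (the plugin-side name) is why both names appear in each TearDown entry. A minimal sketch of what such a volume looks like in `k8s.io/api` terms; the suffix is taken from the keystone job above, while the expiry and file layout mirror common upstream defaults and should be treated as assumptions:

```go
// kubeapiaccess.go: a minimal sketch of the projected volume behind the
// "kube-api-access-*" names above. Expiry, mode, and source layout follow
// common upstream defaulting; treat the exact values as assumptions.
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

func kubeAPIAccessVolume(suffix string) corev1.Volume {
	expiry := int64(3607) // assumed default bound-token lifetime
	mode := int32(420)    // 0644
	return corev1.Volume{
		Name: "kube-api-access-" + suffix,
		VolumeSource: corev1.VolumeSource{
			Projected: &corev1.ProjectedVolumeSource{
				DefaultMode: &mode,
				Sources: []corev1.VolumeProjection{
					{ServiceAccountToken: &corev1.ServiceAccountTokenProjection{
						Path:              "token",
						ExpirationSeconds: &expiry,
					}},
					{ConfigMap: &corev1.ConfigMapProjection{
						LocalObjectReference: corev1.LocalObjectReference{Name: "kube-root-ca.crt"},
						Items:                []corev1.KeyToPath{{Key: "ca.crt", Path: "ca.crt"}},
					}},
					{DownwardAPI: &corev1.DownwardAPIProjection{
						Items: []corev1.DownwardAPIVolumeFile{{
							Path:     "namespace",
							FieldRef: &corev1.ObjectFieldSelector{FieldPath: "metadata.namespace"},
						}},
					}},
				},
			},
		},
	}
}

func main() {
	v := kubeAPIAccessVolume("rqnw9") // suffix from the keystone job above
	fmt.Println(v.Name)
}
```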
Nov 22 03:09:09 crc kubenswrapper[4922]: I1122 03:09:09.269207 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57z48\" (UniqueName: \"kubernetes.io/projected/406317f3-f5ff-45bb-bc58-847129dd5652-kube-api-access-57z48\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:09 crc kubenswrapper[4922]: I1122 03:09:09.269241 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqnw9\" (UniqueName: \"kubernetes.io/projected/f4af2488-ac01-43c3-9c7f-672a2d20456b-kube-api-access-rqnw9\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:09 crc kubenswrapper[4922]: I1122 03:09:09.481747 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5adc-account-create-l8gt8" event={"ID":"406317f3-f5ff-45bb-bc58-847129dd5652","Type":"ContainerDied","Data":"98833492f5f223f89207798ca87b34262477c78930662ba7db2e0a9e1e7793c2"} Nov 22 03:09:09 crc kubenswrapper[4922]: I1122 03:09:09.482176 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98833492f5f223f89207798ca87b34262477c78930662ba7db2e0a9e1e7793c2" Nov 22 03:09:09 crc kubenswrapper[4922]: I1122 03:09:09.481763 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-5adc-account-create-l8gt8" Nov 22 03:09:09 crc kubenswrapper[4922]: I1122 03:09:09.484572 4922 generic.go:334] "Generic (PLEG): container finished" podID="11bc873c-bbc5-4033-ad7a-9569c2b6aa76" containerID="f7a1c30d9aae609ad3015b8af50f4e93aa51fc1a0aeee4139b3ab8b557827c6a" exitCode=0 Nov 22 03:09:09 crc kubenswrapper[4922]: I1122 03:09:09.484910 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"11bc873c-bbc5-4033-ad7a-9569c2b6aa76","Type":"ContainerDied","Data":"f7a1c30d9aae609ad3015b8af50f4e93aa51fc1a0aeee4139b3ab8b557827c6a"} Nov 22 03:09:09 crc kubenswrapper[4922]: I1122 03:09:09.490059 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-4209-account-create-kvqs9" event={"ID":"968a0f99-7d65-431f-a242-11ac5c861a27","Type":"ContainerDied","Data":"0182c9762d04221106ed3688e006c6f1d9afbdd2c57f7d445291ff85c88284cd"} Nov 22 03:09:09 crc kubenswrapper[4922]: I1122 03:09:09.490272 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0182c9762d04221106ed3688e006c6f1d9afbdd2c57f7d445291ff85c88284cd" Nov 22 03:09:09 crc kubenswrapper[4922]: I1122 03:09:09.490494 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-4209-account-create-kvqs9"
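Note the ordering above: the per-container `ContainerDied` events come first, then a second `ContainerDied` carrying the pause-sandbox ID (`98833492...`, `0182c976...`), after which `pod_container_deletor` records that the ID is no longer in the pod's container list. The same lifecycle can be followed from the API side instead of the node journal; a minimal client-go sketch, assuming in-cluster credentials and the `openstack` namespace used throughout this log:

```go
// watchjobs.go: a minimal sketch that tails pod lifecycle from the API side
// (the ContainerStarted/ContainerDied lines above are the node-local PLEG
// view of the same transitions). Assumes in-cluster config; error handling
// is kept deliberately short.
package main

import (
	"context"
	"fmt"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/rest"
)

func main() {
	cfg, err := rest.InClusterConfig()
	if err != nil {
		panic(err)
	}
	cs := kubernetes.NewForConfigOrDie(cfg)

	w, err := cs.CoreV1().Pods("openstack").Watch(context.Background(), metav1.ListOptions{})
	if err != nil {
		panic(err)
	}
	defer w.Stop()

	for ev := range w.ResultChan() {
		pod, ok := ev.Object.(*corev1.Pod)
		if !ok {
			continue
		}
		for _, st := range pod.Status.ContainerStatuses {
			if t := st.State.Terminated; t != nil {
				// Mirrors the "container finished ... exitCode=0" entries above.
				fmt.Printf("%s/%s exited with code %d\n", pod.Name, st.Name, t.ExitCode)
			}
		}
	}
}
```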
Nov 22 03:09:09 crc kubenswrapper[4922]: I1122 03:09:09.506083 4922 generic.go:334] "Generic (PLEG): container finished" podID="161bd1ea-2276-4a95-b0ad-304cc807d13f" containerID="4fcb44b5034d155200f58d1b2c38a0ff75e5cc566eceb71c9cb60aa7ee6249cd" exitCode=0 Nov 22 03:09:09 crc kubenswrapper[4922]: I1122 03:09:09.506160 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"161bd1ea-2276-4a95-b0ad-304cc807d13f","Type":"ContainerDied","Data":"4fcb44b5034d155200f58d1b2c38a0ff75e5cc566eceb71c9cb60aa7ee6249cd"} Nov 22 03:09:09 crc kubenswrapper[4922]: I1122 03:09:09.509961 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-be68-account-create-jx25h" event={"ID":"f4af2488-ac01-43c3-9c7f-672a2d20456b","Type":"ContainerDied","Data":"c7b0ed586fce190286a6d4a70f25ffcb9b31810793654f01cffaf1e657ec22db"} Nov 22 03:09:09 crc kubenswrapper[4922]: I1122 03:09:09.510018 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7b0ed586fce190286a6d4a70f25ffcb9b31810793654f01cffaf1e657ec22db" Nov 22 03:09:09 crc kubenswrapper[4922]: I1122 03:09:09.510085 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-be68-account-create-jx25h" Nov 22 03:09:10 crc kubenswrapper[4922]: I1122 03:09:10.518974 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"161bd1ea-2276-4a95-b0ad-304cc807d13f","Type":"ContainerStarted","Data":"52cf84fde267e2a743a54be053b20b4f49a217e36e08becc55e6ca0ac57e87d3"} Nov 22 03:09:10 crc kubenswrapper[4922]: I1122 03:09:10.519518 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Nov 22 03:09:10 crc kubenswrapper[4922]: I1122 03:09:10.520803 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"11bc873c-bbc5-4033-ad7a-9569c2b6aa76","Type":"ContainerStarted","Data":"d857507103cdb99da8d8bbf0dbe5fed4149199c4205cfc36c52e86df1b3de131"} Nov 22 03:09:10 crc kubenswrapper[4922]: I1122 03:09:10.521132 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:09:10 crc kubenswrapper[4922]: I1122 03:09:10.542054 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=52.393521279 podStartE2EDuration="1m2.542023499s" podCreationTimestamp="2025-11-22 03:08:08 +0000 UTC" firstStartedPulling="2025-11-22 03:08:23.111953104 +0000 UTC m=+939.150474996" lastFinishedPulling="2025-11-22 03:08:33.260455334 +0000 UTC m=+949.298977216" observedRunningTime="2025-11-22 03:09:10.5399474 +0000 UTC m=+986.578469292" watchObservedRunningTime="2025-11-22 03:09:10.542023499 +0000 UTC m=+986.580545411" Nov 22 03:09:10 crc kubenswrapper[4922]: I1122 03:09:10.568161 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=53.740499291 podStartE2EDuration="1m2.568138967s" podCreationTimestamp="2025-11-22 03:08:08 +0000 UTC" firstStartedPulling="2025-11-22 03:08:24.319796413 +0000 UTC m=+940.358318305" lastFinishedPulling="2025-11-22 03:08:33.147436079 +0000 UTC m=+949.185957981" observedRunningTime="2025-11-22 03:09:10.56660249 +0000 UTC m=+986.605124382" watchObservedRunningTime="2025-11-22 03:09:10.568138967 +0000 UTC m=+986.606660899"
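The two tracker entries above show how `pod_startup_latency_tracker` splits out image pulls: `podStartE2EDuration` is watchObservedRunningTime minus podCreationTimestamp, and `podStartSLOduration` additionally subtracts the pull window (lastFinishedPulling minus firstStartedPulling). For `rabbitmq-server-0`: 1m2.542s - (03:08:33.260 - 03:08:23.112, about 10.149s) is about 52.394s, matching podStartSLOduration=52.393521279. In the account-create entries earlier the pull fields are the zero time (`0001-01-01 ...`): nothing was pulled, so the two durations coincide. A quick re-derivation in Go from the wall-clock fields copied out of the entry above:

```go
// slomath.go: re-derives the rabbitmq-server-0 startup numbers from the
// tracker entry above: E2E = watchObservedRunningTime - podCreationTimestamp,
// SLO = E2E - (lastFinishedPulling - firstStartedPulling).
package main

import (
	"fmt"
	"time"
)

const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

func mustParse(s string) time.Time {
	t, err := time.Parse(layout, s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2025-11-22 03:08:08 +0000 UTC")
	firstPull := mustParse("2025-11-22 03:08:23.111953104 +0000 UTC")
	lastPull := mustParse("2025-11-22 03:08:33.260455334 +0000 UTC")
	observed := mustParse("2025-11-22 03:09:10.542023499 +0000 UTC")

	e2e := observed.Sub(created)
	slo := e2e - lastPull.Sub(firstPull)
	// Prints 1m2.542023499s and 52.393521269s; the logged SLO value
	// (52.393521279) differs by 10ns because the kubelet subtracts the
	// monotonic m=+ offsets rather than the wall-clock fields.
	fmt.Println(e2e, slo)
}
```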
Nov 22 03:09:13 crc kubenswrapper[4922]: I1122 03:09:13.755676 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-4cpkr" Nov 22 03:09:13 crc kubenswrapper[4922]: I1122 03:09:13.761243 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-nlzww" podUID="5c038335-42ee-4618-a14b-b32bc0f1d53a" containerName="ovn-controller" probeResult="failure" output=< Nov 22 03:09:13 crc kubenswrapper[4922]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Nov 22 03:09:13 crc kubenswrapper[4922]: > Nov 22 03:09:13 crc kubenswrapper[4922]: I1122 03:09:13.763567 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-4cpkr" Nov 22 03:09:13 crc kubenswrapper[4922]: I1122 03:09:13.874219 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-wbr6m"] Nov 22 03:09:13 crc kubenswrapper[4922]: E1122 03:09:13.874710 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="406317f3-f5ff-45bb-bc58-847129dd5652" containerName="mariadb-account-create" Nov 22 03:09:13 crc kubenswrapper[4922]: I1122 03:09:13.874734 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="406317f3-f5ff-45bb-bc58-847129dd5652" containerName="mariadb-account-create" Nov 22 03:09:13 crc kubenswrapper[4922]: E1122 03:09:13.874751 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4af2488-ac01-43c3-9c7f-672a2d20456b" containerName="mariadb-account-create" Nov 22 03:09:13 crc kubenswrapper[4922]: I1122 03:09:13.874758 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4af2488-ac01-43c3-9c7f-672a2d20456b" containerName="mariadb-account-create" Nov 22 03:09:13 crc kubenswrapper[4922]: E1122 03:09:13.874787 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="968a0f99-7d65-431f-a242-11ac5c861a27" containerName="mariadb-account-create" Nov 22 03:09:13 crc kubenswrapper[4922]: I1122 03:09:13.874796 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="968a0f99-7d65-431f-a242-11ac5c861a27" containerName="mariadb-account-create" Nov 22 03:09:13 crc kubenswrapper[4922]: I1122 03:09:13.874984 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="968a0f99-7d65-431f-a242-11ac5c861a27" containerName="mariadb-account-create" Nov 22 03:09:13 crc kubenswrapper[4922]: I1122 03:09:13.875011 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4af2488-ac01-43c3-9c7f-672a2d20456b" containerName="mariadb-account-create" Nov 22 03:09:13 crc kubenswrapper[4922]: I1122 03:09:13.875029 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="406317f3-f5ff-45bb-bc58-847129dd5652" containerName="mariadb-account-create" Nov 22 03:09:13 crc kubenswrapper[4922]: I1122 03:09:13.875735 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-wbr6m" Nov 22 03:09:13 crc kubenswrapper[4922]: I1122 03:09:13.878230 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-mw8pk" Nov 22 03:09:13 crc kubenswrapper[4922]: I1122 03:09:13.878428 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Nov 22 03:09:13 crc kubenswrapper[4922]: I1122 03:09:13.919416 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-wbr6m"] Nov 22 03:09:13 crc kubenswrapper[4922]: I1122 03:09:13.972073 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckd56\" (UniqueName: \"kubernetes.io/projected/b763fe0e-98d2-4e23-8629-a14f68e3e8b8-kube-api-access-ckd56\") pod \"glance-db-sync-wbr6m\" (UID: \"b763fe0e-98d2-4e23-8629-a14f68e3e8b8\") " pod="openstack/glance-db-sync-wbr6m" Nov 22 03:09:13 crc kubenswrapper[4922]: I1122 03:09:13.972127 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b763fe0e-98d2-4e23-8629-a14f68e3e8b8-config-data\") pod \"glance-db-sync-wbr6m\" (UID: \"b763fe0e-98d2-4e23-8629-a14f68e3e8b8\") " pod="openstack/glance-db-sync-wbr6m" Nov 22 03:09:13 crc kubenswrapper[4922]: I1122 03:09:13.972195 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b763fe0e-98d2-4e23-8629-a14f68e3e8b8-db-sync-config-data\") pod \"glance-db-sync-wbr6m\" (UID: \"b763fe0e-98d2-4e23-8629-a14f68e3e8b8\") " pod="openstack/glance-db-sync-wbr6m" Nov 22 03:09:13 crc kubenswrapper[4922]: I1122 03:09:13.972217 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b763fe0e-98d2-4e23-8629-a14f68e3e8b8-combined-ca-bundle\") pod \"glance-db-sync-wbr6m\" (UID: \"b763fe0e-98d2-4e23-8629-a14f68e3e8b8\") " pod="openstack/glance-db-sync-wbr6m" Nov 22 03:09:14 crc kubenswrapper[4922]: I1122 03:09:14.026969 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-nlzww-config-vxsmg"] Nov 22 03:09:14 crc kubenswrapper[4922]: I1122 03:09:14.028758 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-nlzww-config-vxsmg" Nov 22 03:09:14 crc kubenswrapper[4922]: I1122 03:09:14.031785 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Nov 22 03:09:14 crc kubenswrapper[4922]: I1122 03:09:14.050660 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nlzww-config-vxsmg"] Nov 22 03:09:14 crc kubenswrapper[4922]: I1122 03:09:14.074102 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b763fe0e-98d2-4e23-8629-a14f68e3e8b8-db-sync-config-data\") pod \"glance-db-sync-wbr6m\" (UID: \"b763fe0e-98d2-4e23-8629-a14f68e3e8b8\") " pod="openstack/glance-db-sync-wbr6m" Nov 22 03:09:14 crc kubenswrapper[4922]: I1122 03:09:14.074171 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b763fe0e-98d2-4e23-8629-a14f68e3e8b8-combined-ca-bundle\") pod \"glance-db-sync-wbr6m\" (UID: \"b763fe0e-98d2-4e23-8629-a14f68e3e8b8\") " pod="openstack/glance-db-sync-wbr6m" Nov 22 03:09:14 crc kubenswrapper[4922]: I1122 03:09:14.074263 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckd56\" (UniqueName: \"kubernetes.io/projected/b763fe0e-98d2-4e23-8629-a14f68e3e8b8-kube-api-access-ckd56\") pod \"glance-db-sync-wbr6m\" (UID: \"b763fe0e-98d2-4e23-8629-a14f68e3e8b8\") " pod="openstack/glance-db-sync-wbr6m" Nov 22 03:09:14 crc kubenswrapper[4922]: I1122 03:09:14.074285 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b763fe0e-98d2-4e23-8629-a14f68e3e8b8-config-data\") pod \"glance-db-sync-wbr6m\" (UID: \"b763fe0e-98d2-4e23-8629-a14f68e3e8b8\") " pod="openstack/glance-db-sync-wbr6m" Nov 22 03:09:14 crc kubenswrapper[4922]: I1122 03:09:14.082070 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b763fe0e-98d2-4e23-8629-a14f68e3e8b8-config-data\") pod \"glance-db-sync-wbr6m\" (UID: \"b763fe0e-98d2-4e23-8629-a14f68e3e8b8\") " pod="openstack/glance-db-sync-wbr6m" Nov 22 03:09:14 crc kubenswrapper[4922]: I1122 03:09:14.083450 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b763fe0e-98d2-4e23-8629-a14f68e3e8b8-db-sync-config-data\") pod \"glance-db-sync-wbr6m\" (UID: \"b763fe0e-98d2-4e23-8629-a14f68e3e8b8\") " pod="openstack/glance-db-sync-wbr6m" Nov 22 03:09:14 crc kubenswrapper[4922]: I1122 03:09:14.083545 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b763fe0e-98d2-4e23-8629-a14f68e3e8b8-combined-ca-bundle\") pod \"glance-db-sync-wbr6m\" (UID: \"b763fe0e-98d2-4e23-8629-a14f68e3e8b8\") " pod="openstack/glance-db-sync-wbr6m" Nov 22 03:09:14 crc kubenswrapper[4922]: I1122 03:09:14.109798 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckd56\" (UniqueName: \"kubernetes.io/projected/b763fe0e-98d2-4e23-8629-a14f68e3e8b8-kube-api-access-ckd56\") pod \"glance-db-sync-wbr6m\" (UID: \"b763fe0e-98d2-4e23-8629-a14f68e3e8b8\") " pod="openstack/glance-db-sync-wbr6m" Nov 22 03:09:14 crc kubenswrapper[4922]: I1122 03:09:14.175488 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91169f16-5374-4988-bc5d-9332cc672d72-scripts\") pod \"ovn-controller-nlzww-config-vxsmg\" (UID: \"91169f16-5374-4988-bc5d-9332cc672d72\") " pod="openstack/ovn-controller-nlzww-config-vxsmg" Nov 22 03:09:14 crc kubenswrapper[4922]: I1122 03:09:14.175637 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/91169f16-5374-4988-bc5d-9332cc672d72-var-run-ovn\") pod \"ovn-controller-nlzww-config-vxsmg\" (UID: \"91169f16-5374-4988-bc5d-9332cc672d72\") " pod="openstack/ovn-controller-nlzww-config-vxsmg" Nov 22 03:09:14 crc kubenswrapper[4922]: I1122 03:09:14.175680 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxprs\" (UniqueName: \"kubernetes.io/projected/91169f16-5374-4988-bc5d-9332cc672d72-kube-api-access-vxprs\") pod \"ovn-controller-nlzww-config-vxsmg\" (UID: \"91169f16-5374-4988-bc5d-9332cc672d72\") " pod="openstack/ovn-controller-nlzww-config-vxsmg" Nov 22 03:09:14 crc kubenswrapper[4922]: I1122 03:09:14.175710 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/91169f16-5374-4988-bc5d-9332cc672d72-additional-scripts\") pod \"ovn-controller-nlzww-config-vxsmg\" (UID: \"91169f16-5374-4988-bc5d-9332cc672d72\") " pod="openstack/ovn-controller-nlzww-config-vxsmg" Nov 22 03:09:14 crc kubenswrapper[4922]: I1122 03:09:14.175812 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/91169f16-5374-4988-bc5d-9332cc672d72-var-log-ovn\") pod \"ovn-controller-nlzww-config-vxsmg\" (UID: \"91169f16-5374-4988-bc5d-9332cc672d72\") " pod="openstack/ovn-controller-nlzww-config-vxsmg" Nov 22 03:09:14 crc kubenswrapper[4922]: I1122 03:09:14.175924 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/91169f16-5374-4988-bc5d-9332cc672d72-var-run\") pod \"ovn-controller-nlzww-config-vxsmg\" (UID: \"91169f16-5374-4988-bc5d-9332cc672d72\") " pod="openstack/ovn-controller-nlzww-config-vxsmg" Nov 22 03:09:14 crc kubenswrapper[4922]: I1122 03:09:14.220062 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-wbr6m" Nov 22 03:09:14 crc kubenswrapper[4922]: I1122 03:09:14.277973 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/91169f16-5374-4988-bc5d-9332cc672d72-additional-scripts\") pod \"ovn-controller-nlzww-config-vxsmg\" (UID: \"91169f16-5374-4988-bc5d-9332cc672d72\") " pod="openstack/ovn-controller-nlzww-config-vxsmg" Nov 22 03:09:14 crc kubenswrapper[4922]: I1122 03:09:14.278032 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/91169f16-5374-4988-bc5d-9332cc672d72-var-log-ovn\") pod \"ovn-controller-nlzww-config-vxsmg\" (UID: \"91169f16-5374-4988-bc5d-9332cc672d72\") " pod="openstack/ovn-controller-nlzww-config-vxsmg" Nov 22 03:09:14 crc kubenswrapper[4922]: I1122 03:09:14.278074 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/91169f16-5374-4988-bc5d-9332cc672d72-var-run\") pod \"ovn-controller-nlzww-config-vxsmg\" (UID: \"91169f16-5374-4988-bc5d-9332cc672d72\") " pod="openstack/ovn-controller-nlzww-config-vxsmg" Nov 22 03:09:14 crc kubenswrapper[4922]: I1122 03:09:14.278166 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91169f16-5374-4988-bc5d-9332cc672d72-scripts\") pod \"ovn-controller-nlzww-config-vxsmg\" (UID: \"91169f16-5374-4988-bc5d-9332cc672d72\") " pod="openstack/ovn-controller-nlzww-config-vxsmg" Nov 22 03:09:14 crc kubenswrapper[4922]: I1122 03:09:14.278220 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/91169f16-5374-4988-bc5d-9332cc672d72-var-run-ovn\") pod \"ovn-controller-nlzww-config-vxsmg\" (UID: \"91169f16-5374-4988-bc5d-9332cc672d72\") " pod="openstack/ovn-controller-nlzww-config-vxsmg" Nov 22 03:09:14 crc kubenswrapper[4922]: I1122 03:09:14.278250 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxprs\" (UniqueName: \"kubernetes.io/projected/91169f16-5374-4988-bc5d-9332cc672d72-kube-api-access-vxprs\") pod \"ovn-controller-nlzww-config-vxsmg\" (UID: \"91169f16-5374-4988-bc5d-9332cc672d72\") " pod="openstack/ovn-controller-nlzww-config-vxsmg" Nov 22 03:09:14 crc kubenswrapper[4922]: I1122 03:09:14.279405 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/91169f16-5374-4988-bc5d-9332cc672d72-var-run-ovn\") pod \"ovn-controller-nlzww-config-vxsmg\" (UID: \"91169f16-5374-4988-bc5d-9332cc672d72\") " pod="openstack/ovn-controller-nlzww-config-vxsmg" Nov 22 03:09:14 crc kubenswrapper[4922]: I1122 03:09:14.279423 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/91169f16-5374-4988-bc5d-9332cc672d72-var-log-ovn\") pod \"ovn-controller-nlzww-config-vxsmg\" (UID: \"91169f16-5374-4988-bc5d-9332cc672d72\") " pod="openstack/ovn-controller-nlzww-config-vxsmg" Nov 22 03:09:14 crc kubenswrapper[4922]: I1122 03:09:14.279411 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/91169f16-5374-4988-bc5d-9332cc672d72-var-run\") pod \"ovn-controller-nlzww-config-vxsmg\" (UID: \"91169f16-5374-4988-bc5d-9332cc672d72\") " 
pod="openstack/ovn-controller-nlzww-config-vxsmg" Nov 22 03:09:14 crc kubenswrapper[4922]: I1122 03:09:14.279522 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/91169f16-5374-4988-bc5d-9332cc672d72-additional-scripts\") pod \"ovn-controller-nlzww-config-vxsmg\" (UID: \"91169f16-5374-4988-bc5d-9332cc672d72\") " pod="openstack/ovn-controller-nlzww-config-vxsmg" Nov 22 03:09:14 crc kubenswrapper[4922]: I1122 03:09:14.282641 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91169f16-5374-4988-bc5d-9332cc672d72-scripts\") pod \"ovn-controller-nlzww-config-vxsmg\" (UID: \"91169f16-5374-4988-bc5d-9332cc672d72\") " pod="openstack/ovn-controller-nlzww-config-vxsmg" Nov 22 03:09:14 crc kubenswrapper[4922]: I1122 03:09:14.300584 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxprs\" (UniqueName: \"kubernetes.io/projected/91169f16-5374-4988-bc5d-9332cc672d72-kube-api-access-vxprs\") pod \"ovn-controller-nlzww-config-vxsmg\" (UID: \"91169f16-5374-4988-bc5d-9332cc672d72\") " pod="openstack/ovn-controller-nlzww-config-vxsmg" Nov 22 03:09:14 crc kubenswrapper[4922]: I1122 03:09:14.350328 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nlzww-config-vxsmg" Nov 22 03:09:14 crc kubenswrapper[4922]: I1122 03:09:14.606800 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-wbr6m"] Nov 22 03:09:14 crc kubenswrapper[4922]: W1122 03:09:14.614098 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb763fe0e_98d2_4e23_8629_a14f68e3e8b8.slice/crio-27fe60125af8fb4a8eb09fb1e5ea568d6dde611813ff6cec282f2f331f129bfa WatchSource:0}: Error finding container 27fe60125af8fb4a8eb09fb1e5ea568d6dde611813ff6cec282f2f331f129bfa: Status 404 returned error can't find the container with id 27fe60125af8fb4a8eb09fb1e5ea568d6dde611813ff6cec282f2f331f129bfa Nov 22 03:09:14 crc kubenswrapper[4922]: I1122 03:09:14.820666 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nlzww-config-vxsmg"] Nov 22 03:09:15 crc kubenswrapper[4922]: I1122 03:09:15.568336 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nlzww-config-vxsmg" event={"ID":"91169f16-5374-4988-bc5d-9332cc672d72","Type":"ContainerStarted","Data":"4923f77f1af12b4b9198a0936369ff3d3a997d7c6abfc51d014cac2563974f0a"} Nov 22 03:09:15 crc kubenswrapper[4922]: I1122 03:09:15.568643 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nlzww-config-vxsmg" event={"ID":"91169f16-5374-4988-bc5d-9332cc672d72","Type":"ContainerStarted","Data":"a1c90db942badecfccc8d0398df2685a0888fa9550e509d2911bae8042c47c9c"} Nov 22 03:09:15 crc kubenswrapper[4922]: I1122 03:09:15.569734 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wbr6m" event={"ID":"b763fe0e-98d2-4e23-8629-a14f68e3e8b8","Type":"ContainerStarted","Data":"27fe60125af8fb4a8eb09fb1e5ea568d6dde611813ff6cec282f2f331f129bfa"} Nov 22 03:09:15 crc kubenswrapper[4922]: I1122 03:09:15.590124 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-nlzww-config-vxsmg" podStartSLOduration=1.590100201 podStartE2EDuration="1.590100201s" podCreationTimestamp="2025-11-22 03:09:14 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:09:15.589461265 +0000 UTC m=+991.627983167" watchObservedRunningTime="2025-11-22 03:09:15.590100201 +0000 UTC m=+991.628622103" Nov 22 03:09:16 crc kubenswrapper[4922]: I1122 03:09:16.578931 4922 generic.go:334] "Generic (PLEG): container finished" podID="91169f16-5374-4988-bc5d-9332cc672d72" containerID="4923f77f1af12b4b9198a0936369ff3d3a997d7c6abfc51d014cac2563974f0a" exitCode=0 Nov 22 03:09:16 crc kubenswrapper[4922]: I1122 03:09:16.578977 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nlzww-config-vxsmg" event={"ID":"91169f16-5374-4988-bc5d-9332cc672d72","Type":"ContainerDied","Data":"4923f77f1af12b4b9198a0936369ff3d3a997d7c6abfc51d014cac2563974f0a"} Nov 22 03:09:17 crc kubenswrapper[4922]: I1122 03:09:17.921832 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nlzww-config-vxsmg" Nov 22 03:09:18 crc kubenswrapper[4922]: I1122 03:09:18.047618 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91169f16-5374-4988-bc5d-9332cc672d72-scripts\") pod \"91169f16-5374-4988-bc5d-9332cc672d72\" (UID: \"91169f16-5374-4988-bc5d-9332cc672d72\") " Nov 22 03:09:18 crc kubenswrapper[4922]: I1122 03:09:18.047701 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/91169f16-5374-4988-bc5d-9332cc672d72-var-run\") pod \"91169f16-5374-4988-bc5d-9332cc672d72\" (UID: \"91169f16-5374-4988-bc5d-9332cc672d72\") " Nov 22 03:09:18 crc kubenswrapper[4922]: I1122 03:09:18.047755 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxprs\" (UniqueName: \"kubernetes.io/projected/91169f16-5374-4988-bc5d-9332cc672d72-kube-api-access-vxprs\") pod \"91169f16-5374-4988-bc5d-9332cc672d72\" (UID: \"91169f16-5374-4988-bc5d-9332cc672d72\") " Nov 22 03:09:18 crc kubenswrapper[4922]: I1122 03:09:18.047861 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/91169f16-5374-4988-bc5d-9332cc672d72-var-log-ovn\") pod \"91169f16-5374-4988-bc5d-9332cc672d72\" (UID: \"91169f16-5374-4988-bc5d-9332cc672d72\") " Nov 22 03:09:18 crc kubenswrapper[4922]: I1122 03:09:18.047924 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/91169f16-5374-4988-bc5d-9332cc672d72-var-run-ovn\") pod \"91169f16-5374-4988-bc5d-9332cc672d72\" (UID: \"91169f16-5374-4988-bc5d-9332cc672d72\") " Nov 22 03:09:18 crc kubenswrapper[4922]: I1122 03:09:18.047999 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/91169f16-5374-4988-bc5d-9332cc672d72-additional-scripts\") pod \"91169f16-5374-4988-bc5d-9332cc672d72\" (UID: \"91169f16-5374-4988-bc5d-9332cc672d72\") " Nov 22 03:09:18 crc kubenswrapper[4922]: I1122 03:09:18.048837 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91169f16-5374-4988-bc5d-9332cc672d72-scripts" (OuterVolumeSpecName: "scripts") pod "91169f16-5374-4988-bc5d-9332cc672d72" (UID: "91169f16-5374-4988-bc5d-9332cc672d72"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:09:18 crc kubenswrapper[4922]: I1122 03:09:18.048975 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91169f16-5374-4988-bc5d-9332cc672d72-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "91169f16-5374-4988-bc5d-9332cc672d72" (UID: "91169f16-5374-4988-bc5d-9332cc672d72"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:09:18 crc kubenswrapper[4922]: I1122 03:09:18.049028 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/91169f16-5374-4988-bc5d-9332cc672d72-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "91169f16-5374-4988-bc5d-9332cc672d72" (UID: "91169f16-5374-4988-bc5d-9332cc672d72"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 03:09:18 crc kubenswrapper[4922]: I1122 03:09:18.049246 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/91169f16-5374-4988-bc5d-9332cc672d72-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "91169f16-5374-4988-bc5d-9332cc672d72" (UID: "91169f16-5374-4988-bc5d-9332cc672d72"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 03:09:18 crc kubenswrapper[4922]: I1122 03:09:18.049271 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/91169f16-5374-4988-bc5d-9332cc672d72-var-run" (OuterVolumeSpecName: "var-run") pod "91169f16-5374-4988-bc5d-9332cc672d72" (UID: "91169f16-5374-4988-bc5d-9332cc672d72"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 03:09:18 crc kubenswrapper[4922]: I1122 03:09:18.055653 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91169f16-5374-4988-bc5d-9332cc672d72-kube-api-access-vxprs" (OuterVolumeSpecName: "kube-api-access-vxprs") pod "91169f16-5374-4988-bc5d-9332cc672d72" (UID: "91169f16-5374-4988-bc5d-9332cc672d72"). InnerVolumeSpecName "kube-api-access-vxprs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:09:18 crc kubenswrapper[4922]: I1122 03:09:18.150414 4922 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/91169f16-5374-4988-bc5d-9332cc672d72-var-log-ovn\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:18 crc kubenswrapper[4922]: I1122 03:09:18.150452 4922 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/91169f16-5374-4988-bc5d-9332cc672d72-var-run-ovn\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:18 crc kubenswrapper[4922]: I1122 03:09:18.150467 4922 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/91169f16-5374-4988-bc5d-9332cc672d72-additional-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:18 crc kubenswrapper[4922]: I1122 03:09:18.150482 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91169f16-5374-4988-bc5d-9332cc672d72-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:18 crc kubenswrapper[4922]: I1122 03:09:18.150493 4922 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/91169f16-5374-4988-bc5d-9332cc672d72-var-run\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:18 crc kubenswrapper[4922]: I1122 03:09:18.150505 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxprs\" (UniqueName: \"kubernetes.io/projected/91169f16-5374-4988-bc5d-9332cc672d72-kube-api-access-vxprs\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:18 crc kubenswrapper[4922]: I1122 03:09:18.435864 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-nlzww-config-vxsmg"] Nov 22 03:09:18 crc kubenswrapper[4922]: I1122 03:09:18.447744 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-nlzww-config-vxsmg"] Nov 22 03:09:18 crc kubenswrapper[4922]: I1122 03:09:18.594359 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1c90db942badecfccc8d0398df2685a0888fa9550e509d2911bae8042c47c9c" Nov 22 03:09:18 crc kubenswrapper[4922]: I1122 03:09:18.594398 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-nlzww-config-vxsmg" Nov 22 03:09:18 crc kubenswrapper[4922]: I1122 03:09:18.749596 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-nlzww" Nov 22 03:09:19 crc kubenswrapper[4922]: I1122 03:09:19.312691 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91169f16-5374-4988-bc5d-9332cc672d72" path="/var/lib/kubelet/pods/91169f16-5374-4988-bc5d-9332cc672d72/volumes" Nov 22 03:09:19 crc kubenswrapper[4922]: I1122 03:09:19.711123 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Nov 22 03:09:19 crc kubenswrapper[4922]: I1122 03:09:19.941127 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:09:20 crc kubenswrapper[4922]: I1122 03:09:20.185225 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-7lrkg"] Nov 22 03:09:20 crc kubenswrapper[4922]: E1122 03:09:20.185768 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91169f16-5374-4988-bc5d-9332cc672d72" containerName="ovn-config" Nov 22 03:09:20 crc kubenswrapper[4922]: I1122 03:09:20.185793 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="91169f16-5374-4988-bc5d-9332cc672d72" containerName="ovn-config" Nov 22 03:09:20 crc kubenswrapper[4922]: I1122 03:09:20.189893 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="91169f16-5374-4988-bc5d-9332cc672d72" containerName="ovn-config" Nov 22 03:09:20 crc kubenswrapper[4922]: I1122 03:09:20.190755 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-7lrkg" Nov 22 03:09:20 crc kubenswrapper[4922]: I1122 03:09:20.202792 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-7lrkg"] Nov 22 03:09:20 crc kubenswrapper[4922]: I1122 03:09:20.271113 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-qhvqh"] Nov 22 03:09:20 crc kubenswrapper[4922]: I1122 03:09:20.272194 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-qhvqh" Nov 22 03:09:20 crc kubenswrapper[4922]: I1122 03:09:20.290837 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-qhvqh"] Nov 22 03:09:20 crc kubenswrapper[4922]: I1122 03:09:20.394546 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wddwm\" (UniqueName: \"kubernetes.io/projected/eb78fbe1-7a49-4c07-88cc-eb13d06d3723-kube-api-access-wddwm\") pod \"barbican-db-create-7lrkg\" (UID: \"eb78fbe1-7a49-4c07-88cc-eb13d06d3723\") " pod="openstack/barbican-db-create-7lrkg" Nov 22 03:09:20 crc kubenswrapper[4922]: I1122 03:09:20.394689 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptpks\" (UniqueName: \"kubernetes.io/projected/30790286-43e6-435e-9d57-a69b795cc1b5-kube-api-access-ptpks\") pod \"cinder-db-create-qhvqh\" (UID: \"30790286-43e6-435e-9d57-a69b795cc1b5\") " pod="openstack/cinder-db-create-qhvqh" Nov 22 03:09:20 crc kubenswrapper[4922]: I1122 03:09:20.474701 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-bp5gq"] Nov 22 03:09:20 crc kubenswrapper[4922]: I1122 03:09:20.476131 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-bp5gq" Nov 22 03:09:20 crc kubenswrapper[4922]: I1122 03:09:20.479932 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-cbjdr" Nov 22 03:09:20 crc kubenswrapper[4922]: I1122 03:09:20.480116 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 22 03:09:20 crc kubenswrapper[4922]: I1122 03:09:20.480548 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 22 03:09:20 crc kubenswrapper[4922]: I1122 03:09:20.480893 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 22 03:09:20 crc kubenswrapper[4922]: I1122 03:09:20.490011 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-bp5gq"] Nov 22 03:09:20 crc kubenswrapper[4922]: I1122 03:09:20.497080 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hp6ct\" (UniqueName: \"kubernetes.io/projected/56cc5718-880b-43d3-9f3a-2a418797cf1f-kube-api-access-hp6ct\") pod \"keystone-db-sync-bp5gq\" (UID: \"56cc5718-880b-43d3-9f3a-2a418797cf1f\") " pod="openstack/keystone-db-sync-bp5gq" Nov 22 03:09:20 crc kubenswrapper[4922]: I1122 03:09:20.497152 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56cc5718-880b-43d3-9f3a-2a418797cf1f-config-data\") pod \"keystone-db-sync-bp5gq\" (UID: \"56cc5718-880b-43d3-9f3a-2a418797cf1f\") " pod="openstack/keystone-db-sync-bp5gq" Nov 22 03:09:20 crc kubenswrapper[4922]: I1122 03:09:20.497178 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56cc5718-880b-43d3-9f3a-2a418797cf1f-combined-ca-bundle\") pod \"keystone-db-sync-bp5gq\" (UID: \"56cc5718-880b-43d3-9f3a-2a418797cf1f\") " pod="openstack/keystone-db-sync-bp5gq" Nov 22 03:09:20 crc kubenswrapper[4922]: I1122 03:09:20.497228 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wddwm\" (UniqueName: \"kubernetes.io/projected/eb78fbe1-7a49-4c07-88cc-eb13d06d3723-kube-api-access-wddwm\") pod \"barbican-db-create-7lrkg\" (UID: \"eb78fbe1-7a49-4c07-88cc-eb13d06d3723\") " pod="openstack/barbican-db-create-7lrkg" Nov 22 03:09:20 crc kubenswrapper[4922]: I1122 03:09:20.497276 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptpks\" (UniqueName: \"kubernetes.io/projected/30790286-43e6-435e-9d57-a69b795cc1b5-kube-api-access-ptpks\") pod \"cinder-db-create-qhvqh\" (UID: \"30790286-43e6-435e-9d57-a69b795cc1b5\") " pod="openstack/cinder-db-create-qhvqh" Nov 22 03:09:20 crc kubenswrapper[4922]: I1122 03:09:20.527432 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptpks\" (UniqueName: \"kubernetes.io/projected/30790286-43e6-435e-9d57-a69b795cc1b5-kube-api-access-ptpks\") pod \"cinder-db-create-qhvqh\" (UID: \"30790286-43e6-435e-9d57-a69b795cc1b5\") " pod="openstack/cinder-db-create-qhvqh" Nov 22 03:09:20 crc kubenswrapper[4922]: I1122 03:09:20.542198 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wddwm\" (UniqueName: \"kubernetes.io/projected/eb78fbe1-7a49-4c07-88cc-eb13d06d3723-kube-api-access-wddwm\") pod \"barbican-db-create-7lrkg\" (UID: 
\"eb78fbe1-7a49-4c07-88cc-eb13d06d3723\") " pod="openstack/barbican-db-create-7lrkg" Nov 22 03:09:20 crc kubenswrapper[4922]: I1122 03:09:20.596330 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-5pcsk"] Nov 22 03:09:20 crc kubenswrapper[4922]: I1122 03:09:20.597762 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hp6ct\" (UniqueName: \"kubernetes.io/projected/56cc5718-880b-43d3-9f3a-2a418797cf1f-kube-api-access-hp6ct\") pod \"keystone-db-sync-bp5gq\" (UID: \"56cc5718-880b-43d3-9f3a-2a418797cf1f\") " pod="openstack/keystone-db-sync-bp5gq" Nov 22 03:09:20 crc kubenswrapper[4922]: I1122 03:09:20.597821 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56cc5718-880b-43d3-9f3a-2a418797cf1f-config-data\") pod \"keystone-db-sync-bp5gq\" (UID: \"56cc5718-880b-43d3-9f3a-2a418797cf1f\") " pod="openstack/keystone-db-sync-bp5gq" Nov 22 03:09:20 crc kubenswrapper[4922]: I1122 03:09:20.597852 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56cc5718-880b-43d3-9f3a-2a418797cf1f-combined-ca-bundle\") pod \"keystone-db-sync-bp5gq\" (UID: \"56cc5718-880b-43d3-9f3a-2a418797cf1f\") " pod="openstack/keystone-db-sync-bp5gq" Nov 22 03:09:20 crc kubenswrapper[4922]: I1122 03:09:20.598668 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-5pcsk" Nov 22 03:09:20 crc kubenswrapper[4922]: I1122 03:09:20.607467 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-5pcsk"] Nov 22 03:09:20 crc kubenswrapper[4922]: I1122 03:09:20.607528 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-qhvqh" Nov 22 03:09:20 crc kubenswrapper[4922]: I1122 03:09:20.609540 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56cc5718-880b-43d3-9f3a-2a418797cf1f-combined-ca-bundle\") pod \"keystone-db-sync-bp5gq\" (UID: \"56cc5718-880b-43d3-9f3a-2a418797cf1f\") " pod="openstack/keystone-db-sync-bp5gq" Nov 22 03:09:20 crc kubenswrapper[4922]: I1122 03:09:20.612721 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56cc5718-880b-43d3-9f3a-2a418797cf1f-config-data\") pod \"keystone-db-sync-bp5gq\" (UID: \"56cc5718-880b-43d3-9f3a-2a418797cf1f\") " pod="openstack/keystone-db-sync-bp5gq" Nov 22 03:09:20 crc kubenswrapper[4922]: I1122 03:09:20.629994 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hp6ct\" (UniqueName: \"kubernetes.io/projected/56cc5718-880b-43d3-9f3a-2a418797cf1f-kube-api-access-hp6ct\") pod \"keystone-db-sync-bp5gq\" (UID: \"56cc5718-880b-43d3-9f3a-2a418797cf1f\") " pod="openstack/keystone-db-sync-bp5gq" Nov 22 03:09:20 crc kubenswrapper[4922]: I1122 03:09:20.799366 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-bp5gq" Nov 22 03:09:20 crc kubenswrapper[4922]: I1122 03:09:20.800332 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7qbv\" (UniqueName: \"kubernetes.io/projected/f9cb77a8-897d-4f9a-9cb0-05d1a81e903a-kube-api-access-c7qbv\") pod \"neutron-db-create-5pcsk\" (UID: \"f9cb77a8-897d-4f9a-9cb0-05d1a81e903a\") " pod="openstack/neutron-db-create-5pcsk" Nov 22 03:09:20 crc kubenswrapper[4922]: I1122 03:09:20.809638 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-7lrkg" Nov 22 03:09:20 crc kubenswrapper[4922]: I1122 03:09:20.902026 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7qbv\" (UniqueName: \"kubernetes.io/projected/f9cb77a8-897d-4f9a-9cb0-05d1a81e903a-kube-api-access-c7qbv\") pod \"neutron-db-create-5pcsk\" (UID: \"f9cb77a8-897d-4f9a-9cb0-05d1a81e903a\") " pod="openstack/neutron-db-create-5pcsk" Nov 22 03:09:20 crc kubenswrapper[4922]: I1122 03:09:20.920711 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7qbv\" (UniqueName: \"kubernetes.io/projected/f9cb77a8-897d-4f9a-9cb0-05d1a81e903a-kube-api-access-c7qbv\") pod \"neutron-db-create-5pcsk\" (UID: \"f9cb77a8-897d-4f9a-9cb0-05d1a81e903a\") " pod="openstack/neutron-db-create-5pcsk" Nov 22 03:09:21 crc kubenswrapper[4922]: I1122 03:09:21.040787 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-5pcsk" Nov 22 03:09:21 crc kubenswrapper[4922]: I1122 03:09:21.134359 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-qhvqh"] Nov 22 03:09:28 crc kubenswrapper[4922]: I1122 03:09:28.448674 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-bp5gq"] Nov 22 03:09:28 crc kubenswrapper[4922]: W1122 03:09:28.455637 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56cc5718_880b_43d3_9f3a_2a418797cf1f.slice/crio-127653df9177c93e44e5b38023343329933f71899f2af5e525d1a3f7831802df WatchSource:0}: Error finding container 127653df9177c93e44e5b38023343329933f71899f2af5e525d1a3f7831802df: Status 404 returned error can't find the container with id 127653df9177c93e44e5b38023343329933f71899f2af5e525d1a3f7831802df Nov 22 03:09:28 crc kubenswrapper[4922]: I1122 03:09:28.525392 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-7lrkg"] Nov 22 03:09:28 crc kubenswrapper[4922]: I1122 03:09:28.530776 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-5pcsk"] Nov 22 03:09:28 crc kubenswrapper[4922]: W1122 03:09:28.535537 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb78fbe1_7a49_4c07_88cc_eb13d06d3723.slice/crio-d66dd38abced5fefa3ff6357e3764571d8b161713281f138d3ef4e1111891baf WatchSource:0}: Error finding container d66dd38abced5fefa3ff6357e3764571d8b161713281f138d3ef4e1111891baf: Status 404 returned error can't find the container with id d66dd38abced5fefa3ff6357e3764571d8b161713281f138d3ef4e1111891baf Nov 22 03:09:28 crc kubenswrapper[4922]: W1122 03:09:28.542836 4922 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9cb77a8_897d_4f9a_9cb0_05d1a81e903a.slice/crio-4430cd289f8eb3c9c5e52c10c4d3a1eede4732c428bf8502d0725e65ee41a914 WatchSource:0}: Error finding container 4430cd289f8eb3c9c5e52c10c4d3a1eede4732c428bf8502d0725e65ee41a914: Status 404 returned error can't find the container with id 4430cd289f8eb3c9c5e52c10c4d3a1eede4732c428bf8502d0725e65ee41a914 Nov 22 03:09:28 crc kubenswrapper[4922]: I1122 03:09:28.702016 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-5pcsk" event={"ID":"f9cb77a8-897d-4f9a-9cb0-05d1a81e903a","Type":"ContainerStarted","Data":"4430cd289f8eb3c9c5e52c10c4d3a1eede4732c428bf8502d0725e65ee41a914"} Nov 22 03:09:28 crc kubenswrapper[4922]: I1122 03:09:28.703663 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-7lrkg" event={"ID":"eb78fbe1-7a49-4c07-88cc-eb13d06d3723","Type":"ContainerStarted","Data":"d66dd38abced5fefa3ff6357e3764571d8b161713281f138d3ef4e1111891baf"} Nov 22 03:09:28 crc kubenswrapper[4922]: I1122 03:09:28.705565 4922 generic.go:334] "Generic (PLEG): container finished" podID="30790286-43e6-435e-9d57-a69b795cc1b5" containerID="9bfa856b03cc6b8d7cbb4166791984228491a1e1796b4eb45193c2edd4ac51ea" exitCode=0 Nov 22 03:09:28 crc kubenswrapper[4922]: I1122 03:09:28.705640 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qhvqh" event={"ID":"30790286-43e6-435e-9d57-a69b795cc1b5","Type":"ContainerDied","Data":"9bfa856b03cc6b8d7cbb4166791984228491a1e1796b4eb45193c2edd4ac51ea"} Nov 22 03:09:28 crc kubenswrapper[4922]: I1122 03:09:28.705665 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qhvqh" event={"ID":"30790286-43e6-435e-9d57-a69b795cc1b5","Type":"ContainerStarted","Data":"dc98e36f2d99d1d27951829fbb2cad99f614baa789eb102ccb75f7b5a34ba686"} Nov 22 03:09:28 crc kubenswrapper[4922]: I1122 03:09:28.707690 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wbr6m" event={"ID":"b763fe0e-98d2-4e23-8629-a14f68e3e8b8","Type":"ContainerStarted","Data":"64a9f8862fe435f716cbb7864894e6a62685c6f5fb328a4321795a1d45c5163b"} Nov 22 03:09:28 crc kubenswrapper[4922]: I1122 03:09:28.709547 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-bp5gq" event={"ID":"56cc5718-880b-43d3-9f3a-2a418797cf1f","Type":"ContainerStarted","Data":"127653df9177c93e44e5b38023343329933f71899f2af5e525d1a3f7831802df"} Nov 22 03:09:28 crc kubenswrapper[4922]: I1122 03:09:28.741929 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-wbr6m" podStartSLOduration=2.345947894 podStartE2EDuration="15.741903456s" podCreationTimestamp="2025-11-22 03:09:13 +0000 UTC" firstStartedPulling="2025-11-22 03:09:14.61611728 +0000 UTC m=+990.654639172" lastFinishedPulling="2025-11-22 03:09:28.012072822 +0000 UTC m=+1004.050594734" observedRunningTime="2025-11-22 03:09:28.73707225 +0000 UTC m=+1004.775594172" watchObservedRunningTime="2025-11-22 03:09:28.741903456 +0000 UTC m=+1004.780425348" Nov 22 03:09:29 crc kubenswrapper[4922]: I1122 03:09:29.717328 4922 generic.go:334] "Generic (PLEG): container finished" podID="f9cb77a8-897d-4f9a-9cb0-05d1a81e903a" containerID="462eae1408c2e75c5b1f93d877c6d0cdf496380386ebcd7908aecd7b59c5d37b" exitCode=0 Nov 22 03:09:29 crc kubenswrapper[4922]: I1122 03:09:29.717632 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-5pcsk" 
event={"ID":"f9cb77a8-897d-4f9a-9cb0-05d1a81e903a","Type":"ContainerDied","Data":"462eae1408c2e75c5b1f93d877c6d0cdf496380386ebcd7908aecd7b59c5d37b"} Nov 22 03:09:29 crc kubenswrapper[4922]: I1122 03:09:29.719666 4922 generic.go:334] "Generic (PLEG): container finished" podID="eb78fbe1-7a49-4c07-88cc-eb13d06d3723" containerID="dd7717deb4676901623e27b92d3b402962399d118d9d436460639dfde979039b" exitCode=0 Nov 22 03:09:29 crc kubenswrapper[4922]: I1122 03:09:29.720524 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-7lrkg" event={"ID":"eb78fbe1-7a49-4c07-88cc-eb13d06d3723","Type":"ContainerDied","Data":"dd7717deb4676901623e27b92d3b402962399d118d9d436460639dfde979039b"} Nov 22 03:09:30 crc kubenswrapper[4922]: I1122 03:09:30.021584 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-qhvqh" Nov 22 03:09:30 crc kubenswrapper[4922]: I1122 03:09:30.201651 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptpks\" (UniqueName: \"kubernetes.io/projected/30790286-43e6-435e-9d57-a69b795cc1b5-kube-api-access-ptpks\") pod \"30790286-43e6-435e-9d57-a69b795cc1b5\" (UID: \"30790286-43e6-435e-9d57-a69b795cc1b5\") " Nov 22 03:09:30 crc kubenswrapper[4922]: I1122 03:09:30.226025 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30790286-43e6-435e-9d57-a69b795cc1b5-kube-api-access-ptpks" (OuterVolumeSpecName: "kube-api-access-ptpks") pod "30790286-43e6-435e-9d57-a69b795cc1b5" (UID: "30790286-43e6-435e-9d57-a69b795cc1b5"). InnerVolumeSpecName "kube-api-access-ptpks". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:09:30 crc kubenswrapper[4922]: I1122 03:09:30.310238 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptpks\" (UniqueName: \"kubernetes.io/projected/30790286-43e6-435e-9d57-a69b795cc1b5-kube-api-access-ptpks\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:30 crc kubenswrapper[4922]: I1122 03:09:30.729221 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qhvqh" event={"ID":"30790286-43e6-435e-9d57-a69b795cc1b5","Type":"ContainerDied","Data":"dc98e36f2d99d1d27951829fbb2cad99f614baa789eb102ccb75f7b5a34ba686"} Nov 22 03:09:30 crc kubenswrapper[4922]: I1122 03:09:30.729266 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc98e36f2d99d1d27951829fbb2cad99f614baa789eb102ccb75f7b5a34ba686" Nov 22 03:09:30 crc kubenswrapper[4922]: I1122 03:09:30.729449 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-qhvqh" Nov 22 03:09:33 crc kubenswrapper[4922]: I1122 03:09:33.188599 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-5pcsk" Nov 22 03:09:33 crc kubenswrapper[4922]: I1122 03:09:33.197353 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-7lrkg" Nov 22 03:09:33 crc kubenswrapper[4922]: I1122 03:09:33.266987 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wddwm\" (UniqueName: \"kubernetes.io/projected/eb78fbe1-7a49-4c07-88cc-eb13d06d3723-kube-api-access-wddwm\") pod \"eb78fbe1-7a49-4c07-88cc-eb13d06d3723\" (UID: \"eb78fbe1-7a49-4c07-88cc-eb13d06d3723\") " Nov 22 03:09:33 crc kubenswrapper[4922]: I1122 03:09:33.267117 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7qbv\" (UniqueName: \"kubernetes.io/projected/f9cb77a8-897d-4f9a-9cb0-05d1a81e903a-kube-api-access-c7qbv\") pod \"f9cb77a8-897d-4f9a-9cb0-05d1a81e903a\" (UID: \"f9cb77a8-897d-4f9a-9cb0-05d1a81e903a\") " Nov 22 03:09:33 crc kubenswrapper[4922]: I1122 03:09:33.271990 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9cb77a8-897d-4f9a-9cb0-05d1a81e903a-kube-api-access-c7qbv" (OuterVolumeSpecName: "kube-api-access-c7qbv") pod "f9cb77a8-897d-4f9a-9cb0-05d1a81e903a" (UID: "f9cb77a8-897d-4f9a-9cb0-05d1a81e903a"). InnerVolumeSpecName "kube-api-access-c7qbv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:09:33 crc kubenswrapper[4922]: I1122 03:09:33.275061 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb78fbe1-7a49-4c07-88cc-eb13d06d3723-kube-api-access-wddwm" (OuterVolumeSpecName: "kube-api-access-wddwm") pod "eb78fbe1-7a49-4c07-88cc-eb13d06d3723" (UID: "eb78fbe1-7a49-4c07-88cc-eb13d06d3723"). InnerVolumeSpecName "kube-api-access-wddwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:09:33 crc kubenswrapper[4922]: I1122 03:09:33.368379 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wddwm\" (UniqueName: \"kubernetes.io/projected/eb78fbe1-7a49-4c07-88cc-eb13d06d3723-kube-api-access-wddwm\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:33 crc kubenswrapper[4922]: I1122 03:09:33.368408 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7qbv\" (UniqueName: \"kubernetes.io/projected/f9cb77a8-897d-4f9a-9cb0-05d1a81e903a-kube-api-access-c7qbv\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:33 crc kubenswrapper[4922]: I1122 03:09:33.760011 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-bp5gq" event={"ID":"56cc5718-880b-43d3-9f3a-2a418797cf1f","Type":"ContainerStarted","Data":"48cd9d37c55ad31a07001d2ba0b3bd54cc22b5f04ed56109cdd74c1c4fd1ce72"} Nov 22 03:09:33 crc kubenswrapper[4922]: I1122 03:09:33.764537 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-5pcsk" event={"ID":"f9cb77a8-897d-4f9a-9cb0-05d1a81e903a","Type":"ContainerDied","Data":"4430cd289f8eb3c9c5e52c10c4d3a1eede4732c428bf8502d0725e65ee41a914"} Nov 22 03:09:33 crc kubenswrapper[4922]: I1122 03:09:33.764606 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4430cd289f8eb3c9c5e52c10c4d3a1eede4732c428bf8502d0725e65ee41a914" Nov 22 03:09:33 crc kubenswrapper[4922]: I1122 03:09:33.764720 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-5pcsk" Nov 22 03:09:33 crc kubenswrapper[4922]: I1122 03:09:33.767461 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-7lrkg" event={"ID":"eb78fbe1-7a49-4c07-88cc-eb13d06d3723","Type":"ContainerDied","Data":"d66dd38abced5fefa3ff6357e3764571d8b161713281f138d3ef4e1111891baf"} Nov 22 03:09:33 crc kubenswrapper[4922]: I1122 03:09:33.767492 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d66dd38abced5fefa3ff6357e3764571d8b161713281f138d3ef4e1111891baf" Nov 22 03:09:33 crc kubenswrapper[4922]: I1122 03:09:33.767522 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-7lrkg" Nov 22 03:09:33 crc kubenswrapper[4922]: I1122 03:09:33.787486 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-bp5gq" podStartSLOduration=9.031858352 podStartE2EDuration="13.787471847s" podCreationTimestamp="2025-11-22 03:09:20 +0000 UTC" firstStartedPulling="2025-11-22 03:09:28.4578013 +0000 UTC m=+1004.496323192" lastFinishedPulling="2025-11-22 03:09:33.213414785 +0000 UTC m=+1009.251936687" observedRunningTime="2025-11-22 03:09:33.781867812 +0000 UTC m=+1009.820389704" watchObservedRunningTime="2025-11-22 03:09:33.787471847 +0000 UTC m=+1009.825993739" Nov 22 03:09:35 crc kubenswrapper[4922]: I1122 03:09:35.785599 4922 generic.go:334] "Generic (PLEG): container finished" podID="b763fe0e-98d2-4e23-8629-a14f68e3e8b8" containerID="64a9f8862fe435f716cbb7864894e6a62685c6f5fb328a4321795a1d45c5163b" exitCode=0 Nov 22 03:09:35 crc kubenswrapper[4922]: I1122 03:09:35.785674 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wbr6m" event={"ID":"b763fe0e-98d2-4e23-8629-a14f68e3e8b8","Type":"ContainerDied","Data":"64a9f8862fe435f716cbb7864894e6a62685c6f5fb328a4321795a1d45c5163b"} Nov 22 03:09:36 crc kubenswrapper[4922]: I1122 03:09:36.798599 4922 generic.go:334] "Generic (PLEG): container finished" podID="56cc5718-880b-43d3-9f3a-2a418797cf1f" containerID="48cd9d37c55ad31a07001d2ba0b3bd54cc22b5f04ed56109cdd74c1c4fd1ce72" exitCode=0 Nov 22 03:09:36 crc kubenswrapper[4922]: I1122 03:09:36.798694 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-bp5gq" event={"ID":"56cc5718-880b-43d3-9f3a-2a418797cf1f","Type":"ContainerDied","Data":"48cd9d37c55ad31a07001d2ba0b3bd54cc22b5f04ed56109cdd74c1c4fd1ce72"} Nov 22 03:09:37 crc kubenswrapper[4922]: I1122 03:09:37.246134 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-wbr6m" Nov 22 03:09:37 crc kubenswrapper[4922]: I1122 03:09:37.252602 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b763fe0e-98d2-4e23-8629-a14f68e3e8b8-db-sync-config-data\") pod \"b763fe0e-98d2-4e23-8629-a14f68e3e8b8\" (UID: \"b763fe0e-98d2-4e23-8629-a14f68e3e8b8\") " Nov 22 03:09:37 crc kubenswrapper[4922]: I1122 03:09:37.252682 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b763fe0e-98d2-4e23-8629-a14f68e3e8b8-config-data\") pod \"b763fe0e-98d2-4e23-8629-a14f68e3e8b8\" (UID: \"b763fe0e-98d2-4e23-8629-a14f68e3e8b8\") " Nov 22 03:09:37 crc kubenswrapper[4922]: I1122 03:09:37.252935 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckd56\" (UniqueName: \"kubernetes.io/projected/b763fe0e-98d2-4e23-8629-a14f68e3e8b8-kube-api-access-ckd56\") pod \"b763fe0e-98d2-4e23-8629-a14f68e3e8b8\" (UID: \"b763fe0e-98d2-4e23-8629-a14f68e3e8b8\") " Nov 22 03:09:37 crc kubenswrapper[4922]: I1122 03:09:37.252976 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b763fe0e-98d2-4e23-8629-a14f68e3e8b8-combined-ca-bundle\") pod \"b763fe0e-98d2-4e23-8629-a14f68e3e8b8\" (UID: \"b763fe0e-98d2-4e23-8629-a14f68e3e8b8\") " Nov 22 03:09:37 crc kubenswrapper[4922]: I1122 03:09:37.260008 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b763fe0e-98d2-4e23-8629-a14f68e3e8b8-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "b763fe0e-98d2-4e23-8629-a14f68e3e8b8" (UID: "b763fe0e-98d2-4e23-8629-a14f68e3e8b8"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:09:37 crc kubenswrapper[4922]: I1122 03:09:37.261535 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b763fe0e-98d2-4e23-8629-a14f68e3e8b8-kube-api-access-ckd56" (OuterVolumeSpecName: "kube-api-access-ckd56") pod "b763fe0e-98d2-4e23-8629-a14f68e3e8b8" (UID: "b763fe0e-98d2-4e23-8629-a14f68e3e8b8"). InnerVolumeSpecName "kube-api-access-ckd56". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:09:37 crc kubenswrapper[4922]: I1122 03:09:37.309967 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b763fe0e-98d2-4e23-8629-a14f68e3e8b8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b763fe0e-98d2-4e23-8629-a14f68e3e8b8" (UID: "b763fe0e-98d2-4e23-8629-a14f68e3e8b8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:09:37 crc kubenswrapper[4922]: I1122 03:09:37.333563 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b763fe0e-98d2-4e23-8629-a14f68e3e8b8-config-data" (OuterVolumeSpecName: "config-data") pod "b763fe0e-98d2-4e23-8629-a14f68e3e8b8" (UID: "b763fe0e-98d2-4e23-8629-a14f68e3e8b8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:09:37 crc kubenswrapper[4922]: I1122 03:09:37.355300 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckd56\" (UniqueName: \"kubernetes.io/projected/b763fe0e-98d2-4e23-8629-a14f68e3e8b8-kube-api-access-ckd56\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:37 crc kubenswrapper[4922]: I1122 03:09:37.355348 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b763fe0e-98d2-4e23-8629-a14f68e3e8b8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:37 crc kubenswrapper[4922]: I1122 03:09:37.355364 4922 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b763fe0e-98d2-4e23-8629-a14f68e3e8b8-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:37 crc kubenswrapper[4922]: I1122 03:09:37.355380 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b763fe0e-98d2-4e23-8629-a14f68e3e8b8-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:37 crc kubenswrapper[4922]: I1122 03:09:37.812140 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wbr6m" event={"ID":"b763fe0e-98d2-4e23-8629-a14f68e3e8b8","Type":"ContainerDied","Data":"27fe60125af8fb4a8eb09fb1e5ea568d6dde611813ff6cec282f2f331f129bfa"} Nov 22 03:09:37 crc kubenswrapper[4922]: I1122 03:09:37.812200 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27fe60125af8fb4a8eb09fb1e5ea568d6dde611813ff6cec282f2f331f129bfa" Nov 22 03:09:37 crc kubenswrapper[4922]: I1122 03:09:37.812161 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-wbr6m" Nov 22 03:09:38 crc kubenswrapper[4922]: I1122 03:09:38.107120 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-bp5gq" Nov 22 03:09:38 crc kubenswrapper[4922]: I1122 03:09:38.271938 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56cc5718-880b-43d3-9f3a-2a418797cf1f-combined-ca-bundle\") pod \"56cc5718-880b-43d3-9f3a-2a418797cf1f\" (UID: \"56cc5718-880b-43d3-9f3a-2a418797cf1f\") " Nov 22 03:09:38 crc kubenswrapper[4922]: I1122 03:09:38.272012 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56cc5718-880b-43d3-9f3a-2a418797cf1f-config-data\") pod \"56cc5718-880b-43d3-9f3a-2a418797cf1f\" (UID: \"56cc5718-880b-43d3-9f3a-2a418797cf1f\") " Nov 22 03:09:38 crc kubenswrapper[4922]: I1122 03:09:38.272100 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hp6ct\" (UniqueName: \"kubernetes.io/projected/56cc5718-880b-43d3-9f3a-2a418797cf1f-kube-api-access-hp6ct\") pod \"56cc5718-880b-43d3-9f3a-2a418797cf1f\" (UID: \"56cc5718-880b-43d3-9f3a-2a418797cf1f\") " Nov 22 03:09:38 crc kubenswrapper[4922]: I1122 03:09:38.290713 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56cc5718-880b-43d3-9f3a-2a418797cf1f-kube-api-access-hp6ct" (OuterVolumeSpecName: "kube-api-access-hp6ct") pod "56cc5718-880b-43d3-9f3a-2a418797cf1f" (UID: "56cc5718-880b-43d3-9f3a-2a418797cf1f"). InnerVolumeSpecName "kube-api-access-hp6ct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:09:38 crc kubenswrapper[4922]: I1122 03:09:38.307681 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-f2ssv"] Nov 22 03:09:38 crc kubenswrapper[4922]: E1122 03:09:38.308015 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30790286-43e6-435e-9d57-a69b795cc1b5" containerName="mariadb-database-create" Nov 22 03:09:38 crc kubenswrapper[4922]: I1122 03:09:38.308029 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="30790286-43e6-435e-9d57-a69b795cc1b5" containerName="mariadb-database-create" Nov 22 03:09:38 crc kubenswrapper[4922]: E1122 03:09:38.308051 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b763fe0e-98d2-4e23-8629-a14f68e3e8b8" containerName="glance-db-sync" Nov 22 03:09:38 crc kubenswrapper[4922]: I1122 03:09:38.308058 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="b763fe0e-98d2-4e23-8629-a14f68e3e8b8" containerName="glance-db-sync" Nov 22 03:09:38 crc kubenswrapper[4922]: E1122 03:09:38.308067 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56cc5718-880b-43d3-9f3a-2a418797cf1f" containerName="keystone-db-sync" Nov 22 03:09:38 crc kubenswrapper[4922]: I1122 03:09:38.308073 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="56cc5718-880b-43d3-9f3a-2a418797cf1f" containerName="keystone-db-sync" Nov 22 03:09:38 crc kubenswrapper[4922]: E1122 03:09:38.308084 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9cb77a8-897d-4f9a-9cb0-05d1a81e903a" containerName="mariadb-database-create" Nov 22 03:09:38 crc kubenswrapper[4922]: I1122 03:09:38.308090 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9cb77a8-897d-4f9a-9cb0-05d1a81e903a" containerName="mariadb-database-create" Nov 22 03:09:38 crc kubenswrapper[4922]: E1122 03:09:38.308104 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb78fbe1-7a49-4c07-88cc-eb13d06d3723" containerName="mariadb-database-create" Nov 22 03:09:38 crc kubenswrapper[4922]: I1122 03:09:38.308111 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb78fbe1-7a49-4c07-88cc-eb13d06d3723" containerName="mariadb-database-create" Nov 22 03:09:38 crc kubenswrapper[4922]: I1122 03:09:38.308256 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9cb77a8-897d-4f9a-9cb0-05d1a81e903a" containerName="mariadb-database-create" Nov 22 03:09:38 crc kubenswrapper[4922]: I1122 03:09:38.308272 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="56cc5718-880b-43d3-9f3a-2a418797cf1f" containerName="keystone-db-sync" Nov 22 03:09:38 crc kubenswrapper[4922]: I1122 03:09:38.308283 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="30790286-43e6-435e-9d57-a69b795cc1b5" containerName="mariadb-database-create" Nov 22 03:09:38 crc kubenswrapper[4922]: I1122 03:09:38.308292 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb78fbe1-7a49-4c07-88cc-eb13d06d3723" containerName="mariadb-database-create" Nov 22 03:09:38 crc kubenswrapper[4922]: I1122 03:09:38.308301 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="b763fe0e-98d2-4e23-8629-a14f68e3e8b8" containerName="glance-db-sync" Nov 22 03:09:38 crc kubenswrapper[4922]: I1122 03:09:38.309125 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54f9b7b8d9-f2ssv" Nov 22 03:09:38 crc kubenswrapper[4922]: I1122 03:09:38.318321 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56cc5718-880b-43d3-9f3a-2a418797cf1f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "56cc5718-880b-43d3-9f3a-2a418797cf1f" (UID: "56cc5718-880b-43d3-9f3a-2a418797cf1f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:09:38 crc kubenswrapper[4922]: I1122 03:09:38.329748 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-f2ssv"] Nov 22 03:09:38 crc kubenswrapper[4922]: I1122 03:09:38.351602 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56cc5718-880b-43d3-9f3a-2a418797cf1f-config-data" (OuterVolumeSpecName: "config-data") pod "56cc5718-880b-43d3-9f3a-2a418797cf1f" (UID: "56cc5718-880b-43d3-9f3a-2a418797cf1f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:09:38 crc kubenswrapper[4922]: I1122 03:09:38.376743 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/019d6d6f-c2d5-4b02-b771-7c011fba99ba-dns-svc\") pod \"dnsmasq-dns-54f9b7b8d9-f2ssv\" (UID: \"019d6d6f-c2d5-4b02-b771-7c011fba99ba\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-f2ssv" Nov 22 03:09:38 crc kubenswrapper[4922]: I1122 03:09:38.376803 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/019d6d6f-c2d5-4b02-b771-7c011fba99ba-config\") pod \"dnsmasq-dns-54f9b7b8d9-f2ssv\" (UID: \"019d6d6f-c2d5-4b02-b771-7c011fba99ba\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-f2ssv" Nov 22 03:09:38 crc kubenswrapper[4922]: I1122 03:09:38.376886 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/019d6d6f-c2d5-4b02-b771-7c011fba99ba-ovsdbserver-nb\") pod \"dnsmasq-dns-54f9b7b8d9-f2ssv\" (UID: \"019d6d6f-c2d5-4b02-b771-7c011fba99ba\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-f2ssv" Nov 22 03:09:38 crc kubenswrapper[4922]: I1122 03:09:38.376995 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/019d6d6f-c2d5-4b02-b771-7c011fba99ba-ovsdbserver-sb\") pod \"dnsmasq-dns-54f9b7b8d9-f2ssv\" (UID: \"019d6d6f-c2d5-4b02-b771-7c011fba99ba\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-f2ssv" Nov 22 03:09:38 crc kubenswrapper[4922]: I1122 03:09:38.377096 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkppn\" (UniqueName: \"kubernetes.io/projected/019d6d6f-c2d5-4b02-b771-7c011fba99ba-kube-api-access-jkppn\") pod \"dnsmasq-dns-54f9b7b8d9-f2ssv\" (UID: \"019d6d6f-c2d5-4b02-b771-7c011fba99ba\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-f2ssv" Nov 22 03:09:38 crc kubenswrapper[4922]: I1122 03:09:38.377161 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56cc5718-880b-43d3-9f3a-2a418797cf1f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:38 crc kubenswrapper[4922]: I1122 03:09:38.377174 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/56cc5718-880b-43d3-9f3a-2a418797cf1f-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:38 crc kubenswrapper[4922]: I1122 03:09:38.377184 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hp6ct\" (UniqueName: \"kubernetes.io/projected/56cc5718-880b-43d3-9f3a-2a418797cf1f-kube-api-access-hp6ct\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:38 crc kubenswrapper[4922]: I1122 03:09:38.477574 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/019d6d6f-c2d5-4b02-b771-7c011fba99ba-ovsdbserver-nb\") pod \"dnsmasq-dns-54f9b7b8d9-f2ssv\" (UID: \"019d6d6f-c2d5-4b02-b771-7c011fba99ba\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-f2ssv" Nov 22 03:09:38 crc kubenswrapper[4922]: I1122 03:09:38.477628 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/019d6d6f-c2d5-4b02-b771-7c011fba99ba-ovsdbserver-sb\") pod \"dnsmasq-dns-54f9b7b8d9-f2ssv\" (UID: \"019d6d6f-c2d5-4b02-b771-7c011fba99ba\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-f2ssv" Nov 22 03:09:38 crc kubenswrapper[4922]: I1122 03:09:38.477664 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkppn\" (UniqueName: \"kubernetes.io/projected/019d6d6f-c2d5-4b02-b771-7c011fba99ba-kube-api-access-jkppn\") pod \"dnsmasq-dns-54f9b7b8d9-f2ssv\" (UID: \"019d6d6f-c2d5-4b02-b771-7c011fba99ba\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-f2ssv" Nov 22 03:09:38 crc kubenswrapper[4922]: I1122 03:09:38.477695 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/019d6d6f-c2d5-4b02-b771-7c011fba99ba-dns-svc\") pod \"dnsmasq-dns-54f9b7b8d9-f2ssv\" (UID: \"019d6d6f-c2d5-4b02-b771-7c011fba99ba\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-f2ssv" Nov 22 03:09:38 crc kubenswrapper[4922]: I1122 03:09:38.477720 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/019d6d6f-c2d5-4b02-b771-7c011fba99ba-config\") pod \"dnsmasq-dns-54f9b7b8d9-f2ssv\" (UID: \"019d6d6f-c2d5-4b02-b771-7c011fba99ba\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-f2ssv" Nov 22 03:09:38 crc kubenswrapper[4922]: I1122 03:09:38.478691 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/019d6d6f-c2d5-4b02-b771-7c011fba99ba-config\") pod \"dnsmasq-dns-54f9b7b8d9-f2ssv\" (UID: \"019d6d6f-c2d5-4b02-b771-7c011fba99ba\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-f2ssv" Nov 22 03:09:38 crc kubenswrapper[4922]: I1122 03:09:38.478780 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/019d6d6f-c2d5-4b02-b771-7c011fba99ba-ovsdbserver-sb\") pod \"dnsmasq-dns-54f9b7b8d9-f2ssv\" (UID: \"019d6d6f-c2d5-4b02-b771-7c011fba99ba\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-f2ssv" Nov 22 03:09:38 crc kubenswrapper[4922]: I1122 03:09:38.478830 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/019d6d6f-c2d5-4b02-b771-7c011fba99ba-ovsdbserver-nb\") pod \"dnsmasq-dns-54f9b7b8d9-f2ssv\" (UID: \"019d6d6f-c2d5-4b02-b771-7c011fba99ba\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-f2ssv" Nov 22 03:09:38 crc kubenswrapper[4922]: I1122 03:09:38.478782 4922 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/019d6d6f-c2d5-4b02-b771-7c011fba99ba-dns-svc\") pod \"dnsmasq-dns-54f9b7b8d9-f2ssv\" (UID: \"019d6d6f-c2d5-4b02-b771-7c011fba99ba\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-f2ssv" Nov 22 03:09:38 crc kubenswrapper[4922]: I1122 03:09:38.496777 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkppn\" (UniqueName: \"kubernetes.io/projected/019d6d6f-c2d5-4b02-b771-7c011fba99ba-kube-api-access-jkppn\") pod \"dnsmasq-dns-54f9b7b8d9-f2ssv\" (UID: \"019d6d6f-c2d5-4b02-b771-7c011fba99ba\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-f2ssv" Nov 22 03:09:38 crc kubenswrapper[4922]: I1122 03:09:38.719310 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54f9b7b8d9-f2ssv" Nov 22 03:09:38 crc kubenswrapper[4922]: I1122 03:09:38.840754 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-bp5gq" event={"ID":"56cc5718-880b-43d3-9f3a-2a418797cf1f","Type":"ContainerDied","Data":"127653df9177c93e44e5b38023343329933f71899f2af5e525d1a3f7831802df"} Nov 22 03:09:38 crc kubenswrapper[4922]: I1122 03:09:38.840814 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="127653df9177c93e44e5b38023343329933f71899f2af5e525d1a3f7831802df" Nov 22 03:09:38 crc kubenswrapper[4922]: I1122 03:09:38.840913 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-bp5gq" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.108456 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-f2ssv"] Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.179354 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-jr8pw"] Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.181431 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-jr8pw" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.190401 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.190857 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-cbjdr" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.191521 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.191711 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.200974 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-vc7j2"] Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.204472 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6546db6db7-vc7j2" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.223324 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-jr8pw"] Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.305738 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/836ddb92-3947-4955-8dc6-ec8e96357ca4-credential-keys\") pod \"keystone-bootstrap-jr8pw\" (UID: \"836ddb92-3947-4955-8dc6-ec8e96357ca4\") " pod="openstack/keystone-bootstrap-jr8pw" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.305904 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nvgz\" (UniqueName: \"kubernetes.io/projected/836ddb92-3947-4955-8dc6-ec8e96357ca4-kube-api-access-7nvgz\") pod \"keystone-bootstrap-jr8pw\" (UID: \"836ddb92-3947-4955-8dc6-ec8e96357ca4\") " pod="openstack/keystone-bootstrap-jr8pw" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.305968 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/836ddb92-3947-4955-8dc6-ec8e96357ca4-config-data\") pod \"keystone-bootstrap-jr8pw\" (UID: \"836ddb92-3947-4955-8dc6-ec8e96357ca4\") " pod="openstack/keystone-bootstrap-jr8pw" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.306018 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/836ddb92-3947-4955-8dc6-ec8e96357ca4-combined-ca-bundle\") pod \"keystone-bootstrap-jr8pw\" (UID: \"836ddb92-3947-4955-8dc6-ec8e96357ca4\") " pod="openstack/keystone-bootstrap-jr8pw" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.306065 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/836ddb92-3947-4955-8dc6-ec8e96357ca4-scripts\") pod \"keystone-bootstrap-jr8pw\" (UID: \"836ddb92-3947-4955-8dc6-ec8e96357ca4\") " pod="openstack/keystone-bootstrap-jr8pw" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.306093 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/836ddb92-3947-4955-8dc6-ec8e96357ca4-fernet-keys\") pod \"keystone-bootstrap-jr8pw\" (UID: \"836ddb92-3947-4955-8dc6-ec8e96357ca4\") " pod="openstack/keystone-bootstrap-jr8pw" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.386382 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-vc7j2"] Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.410142 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/836ddb92-3947-4955-8dc6-ec8e96357ca4-credential-keys\") pod \"keystone-bootstrap-jr8pw\" (UID: \"836ddb92-3947-4955-8dc6-ec8e96357ca4\") " pod="openstack/keystone-bootstrap-jr8pw" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.410230 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nvgz\" (UniqueName: \"kubernetes.io/projected/836ddb92-3947-4955-8dc6-ec8e96357ca4-kube-api-access-7nvgz\") pod \"keystone-bootstrap-jr8pw\" (UID: \"836ddb92-3947-4955-8dc6-ec8e96357ca4\") " pod="openstack/keystone-bootstrap-jr8pw" 
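
Note: this stretch of the log shows the kubelet's volume reconciler at work. Each volume moves from "operationExecutor.VerifyControllerAttachedVolume started" (reconciler_common.go:245) through "operationExecutor.MountVolume started" (reconciler_common.go:218) to "MountVolume.SetUp succeeded" (operation_generator.go:637); teardown mirrors this with "UnmountVolume started" (reconciler_common.go:159), "UnmountVolume.TearDown succeeded" (operation_generator.go:803), and "Volume detached for volume" (reconciler_common.go:293). Below is a minimal Go sketch, not part of the log or the kubelet sources, that pairs started/succeeded mount entries when reading ordinary one-entry-per-line journalctl output; the file name mountpairs.go and the regexps are assumptions inferred only from the line format visible in this excerpt.

    // mountpairs.go — hypothetical helper, not from the kubelet sources.
    // Reads kubelet journal lines on stdin (one entry per line) and reports
    // volumes for which "MountVolume started" was logged but no matching
    // "MountVolume.SetUp succeeded" entry followed.
    package main

    import (
        "bufio"
        "fmt"
        "os"
        "regexp"
        "strings"
    )

    func main() {
        // klog escapes quotes inside the message, so the volume name reads
        // \"name\"; the trailing pod="ns/name" field is unescaped.
        volRe := regexp.MustCompile(`for volume \\"([^"\\]+)\\"`)
        podRe := regexp.MustCompile(`pod="([^"]+)"`)

        pending := map[string]bool{} // "ns/pod volume" keys awaiting SetUp succeeded
        sc := bufio.NewScanner(os.Stdin)
        sc.Buffer(make([]byte, 1<<20), 1<<20) // journal entries can exceed bufio's 64 KiB default
        for sc.Scan() {
            line := sc.Text()
            vol := volRe.FindStringSubmatch(line)
            pod := podRe.FindStringSubmatch(line)
            if vol == nil || pod == nil {
                continue // not a mount-related entry with both fields
            }
            key := pod[1] + " " + vol[1]
            switch {
            case strings.Contains(line, "operationExecutor.MountVolume started"):
                pending[key] = true
            case strings.Contains(line, "MountVolume.SetUp succeeded"):
                delete(pending, key)
            }
        }
        for key := range pending {
            fmt.Println("mount started but never completed:", key)
        }
    }

Fed with, e.g., "journalctl -u kubelet --no-pager | go run mountpairs.go" (invocation illustrative), it prints nothing for a healthy sequence like the keystone-bootstrap-jr8pw mounts here, and one line per volume that never reached SetUp.
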
Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.410271 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgdbj\" (UniqueName: \"kubernetes.io/projected/e5f20a52-548e-471f-9db9-92056b41111d-kube-api-access-bgdbj\") pod \"dnsmasq-dns-6546db6db7-vc7j2\" (UID: \"e5f20a52-548e-471f-9db9-92056b41111d\") " pod="openstack/dnsmasq-dns-6546db6db7-vc7j2" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.410296 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e5f20a52-548e-471f-9db9-92056b41111d-ovsdbserver-nb\") pod \"dnsmasq-dns-6546db6db7-vc7j2\" (UID: \"e5f20a52-548e-471f-9db9-92056b41111d\") " pod="openstack/dnsmasq-dns-6546db6db7-vc7j2" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.410312 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e5f20a52-548e-471f-9db9-92056b41111d-ovsdbserver-sb\") pod \"dnsmasq-dns-6546db6db7-vc7j2\" (UID: \"e5f20a52-548e-471f-9db9-92056b41111d\") " pod="openstack/dnsmasq-dns-6546db6db7-vc7j2" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.410347 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/836ddb92-3947-4955-8dc6-ec8e96357ca4-config-data\") pod \"keystone-bootstrap-jr8pw\" (UID: \"836ddb92-3947-4955-8dc6-ec8e96357ca4\") " pod="openstack/keystone-bootstrap-jr8pw" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.410370 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5f20a52-548e-471f-9db9-92056b41111d-dns-svc\") pod \"dnsmasq-dns-6546db6db7-vc7j2\" (UID: \"e5f20a52-548e-471f-9db9-92056b41111d\") " pod="openstack/dnsmasq-dns-6546db6db7-vc7j2" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.410400 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/836ddb92-3947-4955-8dc6-ec8e96357ca4-combined-ca-bundle\") pod \"keystone-bootstrap-jr8pw\" (UID: \"836ddb92-3947-4955-8dc6-ec8e96357ca4\") " pod="openstack/keystone-bootstrap-jr8pw" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.410430 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/836ddb92-3947-4955-8dc6-ec8e96357ca4-scripts\") pod \"keystone-bootstrap-jr8pw\" (UID: \"836ddb92-3947-4955-8dc6-ec8e96357ca4\") " pod="openstack/keystone-bootstrap-jr8pw" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.410451 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/836ddb92-3947-4955-8dc6-ec8e96357ca4-fernet-keys\") pod \"keystone-bootstrap-jr8pw\" (UID: \"836ddb92-3947-4955-8dc6-ec8e96357ca4\") " pod="openstack/keystone-bootstrap-jr8pw" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.410546 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5f20a52-548e-471f-9db9-92056b41111d-config\") pod \"dnsmasq-dns-6546db6db7-vc7j2\" (UID: \"e5f20a52-548e-471f-9db9-92056b41111d\") " pod="openstack/dnsmasq-dns-6546db6db7-vc7j2" Nov 22 03:09:39 crc 
kubenswrapper[4922]: I1122 03:09:39.416646 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-f2ssv"] Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.427910 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/836ddb92-3947-4955-8dc6-ec8e96357ca4-scripts\") pod \"keystone-bootstrap-jr8pw\" (UID: \"836ddb92-3947-4955-8dc6-ec8e96357ca4\") " pod="openstack/keystone-bootstrap-jr8pw" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.446554 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/836ddb92-3947-4955-8dc6-ec8e96357ca4-config-data\") pod \"keystone-bootstrap-jr8pw\" (UID: \"836ddb92-3947-4955-8dc6-ec8e96357ca4\") " pod="openstack/keystone-bootstrap-jr8pw" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.447320 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/836ddb92-3947-4955-8dc6-ec8e96357ca4-fernet-keys\") pod \"keystone-bootstrap-jr8pw\" (UID: \"836ddb92-3947-4955-8dc6-ec8e96357ca4\") " pod="openstack/keystone-bootstrap-jr8pw" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.459061 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/836ddb92-3947-4955-8dc6-ec8e96357ca4-combined-ca-bundle\") pod \"keystone-bootstrap-jr8pw\" (UID: \"836ddb92-3947-4955-8dc6-ec8e96357ca4\") " pod="openstack/keystone-bootstrap-jr8pw" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.471486 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/836ddb92-3947-4955-8dc6-ec8e96357ca4-credential-keys\") pod \"keystone-bootstrap-jr8pw\" (UID: \"836ddb92-3947-4955-8dc6-ec8e96357ca4\") " pod="openstack/keystone-bootstrap-jr8pw" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.473802 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nvgz\" (UniqueName: \"kubernetes.io/projected/836ddb92-3947-4955-8dc6-ec8e96357ca4-kube-api-access-7nvgz\") pod \"keystone-bootstrap-jr8pw\" (UID: \"836ddb92-3947-4955-8dc6-ec8e96357ca4\") " pod="openstack/keystone-bootstrap-jr8pw" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.517495 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5f20a52-548e-471f-9db9-92056b41111d-dns-svc\") pod \"dnsmasq-dns-6546db6db7-vc7j2\" (UID: \"e5f20a52-548e-471f-9db9-92056b41111d\") " pod="openstack/dnsmasq-dns-6546db6db7-vc7j2" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.517672 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5f20a52-548e-471f-9db9-92056b41111d-config\") pod \"dnsmasq-dns-6546db6db7-vc7j2\" (UID: \"e5f20a52-548e-471f-9db9-92056b41111d\") " pod="openstack/dnsmasq-dns-6546db6db7-vc7j2" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.517741 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgdbj\" (UniqueName: \"kubernetes.io/projected/e5f20a52-548e-471f-9db9-92056b41111d-kube-api-access-bgdbj\") pod \"dnsmasq-dns-6546db6db7-vc7j2\" (UID: \"e5f20a52-548e-471f-9db9-92056b41111d\") " pod="openstack/dnsmasq-dns-6546db6db7-vc7j2" Nov 22 03:09:39 crc 
kubenswrapper[4922]: I1122 03:09:39.517765 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e5f20a52-548e-471f-9db9-92056b41111d-ovsdbserver-nb\") pod \"dnsmasq-dns-6546db6db7-vc7j2\" (UID: \"e5f20a52-548e-471f-9db9-92056b41111d\") " pod="openstack/dnsmasq-dns-6546db6db7-vc7j2" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.517787 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e5f20a52-548e-471f-9db9-92056b41111d-ovsdbserver-sb\") pod \"dnsmasq-dns-6546db6db7-vc7j2\" (UID: \"e5f20a52-548e-471f-9db9-92056b41111d\") " pod="openstack/dnsmasq-dns-6546db6db7-vc7j2" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.518656 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e5f20a52-548e-471f-9db9-92056b41111d-ovsdbserver-sb\") pod \"dnsmasq-dns-6546db6db7-vc7j2\" (UID: \"e5f20a52-548e-471f-9db9-92056b41111d\") " pod="openstack/dnsmasq-dns-6546db6db7-vc7j2" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.519581 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5f20a52-548e-471f-9db9-92056b41111d-dns-svc\") pod \"dnsmasq-dns-6546db6db7-vc7j2\" (UID: \"e5f20a52-548e-471f-9db9-92056b41111d\") " pod="openstack/dnsmasq-dns-6546db6db7-vc7j2" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.528803 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5f20a52-548e-471f-9db9-92056b41111d-config\") pod \"dnsmasq-dns-6546db6db7-vc7j2\" (UID: \"e5f20a52-548e-471f-9db9-92056b41111d\") " pod="openstack/dnsmasq-dns-6546db6db7-vc7j2" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.544735 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e5f20a52-548e-471f-9db9-92056b41111d-ovsdbserver-nb\") pod \"dnsmasq-dns-6546db6db7-vc7j2\" (UID: \"e5f20a52-548e-471f-9db9-92056b41111d\") " pod="openstack/dnsmasq-dns-6546db6db7-vc7j2" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.571368 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgdbj\" (UniqueName: \"kubernetes.io/projected/e5f20a52-548e-471f-9db9-92056b41111d-kube-api-access-bgdbj\") pod \"dnsmasq-dns-6546db6db7-vc7j2\" (UID: \"e5f20a52-548e-471f-9db9-92056b41111d\") " pod="openstack/dnsmasq-dns-6546db6db7-vc7j2" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.661308 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-vc7j2"] Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.661862 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6546db6db7-vc7j2" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.668132 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-whfxs"] Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.680148 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-whfxs" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.689090 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.689457 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-m5xtt" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.689639 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.689954 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-whfxs"] Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.693914 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.696565 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.697459 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-jr8pw" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.698956 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.699142 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.721430 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-ml4m5"] Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.723558 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-ml4m5" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.736920 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.749230 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-ml4m5"] Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.825968 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aeeab218-54fe-4892-b3f8-60b166ad72e2-scripts\") pod \"placement-db-sync-whfxs\" (UID: \"aeeab218-54fe-4892-b3f8-60b166ad72e2\") " pod="openstack/placement-db-sync-whfxs" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.826063 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e99d940a-73cb-41bb-b5a9-7003544b002a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e99d940a-73cb-41bb-b5a9-7003544b002a\") " pod="openstack/ceilometer-0" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.826090 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e99d940a-73cb-41bb-b5a9-7003544b002a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e99d940a-73cb-41bb-b5a9-7003544b002a\") " pod="openstack/ceilometer-0" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.826127 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9257c931-7544-4f45-9589-e735e55bcca2-ovsdbserver-sb\") pod \"dnsmasq-dns-7987f74bbc-ml4m5\" (UID: \"9257c931-7544-4f45-9589-e735e55bcca2\") " pod="openstack/dnsmasq-dns-7987f74bbc-ml4m5" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.826168 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e99d940a-73cb-41bb-b5a9-7003544b002a-run-httpd\") pod \"ceilometer-0\" (UID: \"e99d940a-73cb-41bb-b5a9-7003544b002a\") " pod="openstack/ceilometer-0" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.826192 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfw7j\" (UniqueName: \"kubernetes.io/projected/9257c931-7544-4f45-9589-e735e55bcca2-kube-api-access-gfw7j\") pod \"dnsmasq-dns-7987f74bbc-ml4m5\" (UID: \"9257c931-7544-4f45-9589-e735e55bcca2\") " pod="openstack/dnsmasq-dns-7987f74bbc-ml4m5" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.826233 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9257c931-7544-4f45-9589-e735e55bcca2-config\") pod \"dnsmasq-dns-7987f74bbc-ml4m5\" (UID: \"9257c931-7544-4f45-9589-e735e55bcca2\") " pod="openstack/dnsmasq-dns-7987f74bbc-ml4m5" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.826252 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e99d940a-73cb-41bb-b5a9-7003544b002a-log-httpd\") pod \"ceilometer-0\" (UID: \"e99d940a-73cb-41bb-b5a9-7003544b002a\") " pod="openstack/ceilometer-0" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.826325 4922 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aeeab218-54fe-4892-b3f8-60b166ad72e2-logs\") pod \"placement-db-sync-whfxs\" (UID: \"aeeab218-54fe-4892-b3f8-60b166ad72e2\") " pod="openstack/placement-db-sync-whfxs" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.826343 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9257c931-7544-4f45-9589-e735e55bcca2-dns-svc\") pod \"dnsmasq-dns-7987f74bbc-ml4m5\" (UID: \"9257c931-7544-4f45-9589-e735e55bcca2\") " pod="openstack/dnsmasq-dns-7987f74bbc-ml4m5" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.826376 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e99d940a-73cb-41bb-b5a9-7003544b002a-scripts\") pod \"ceilometer-0\" (UID: \"e99d940a-73cb-41bb-b5a9-7003544b002a\") " pod="openstack/ceilometer-0" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.826406 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeeab218-54fe-4892-b3f8-60b166ad72e2-combined-ca-bundle\") pod \"placement-db-sync-whfxs\" (UID: \"aeeab218-54fe-4892-b3f8-60b166ad72e2\") " pod="openstack/placement-db-sync-whfxs" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.826434 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnnlp\" (UniqueName: \"kubernetes.io/projected/aeeab218-54fe-4892-b3f8-60b166ad72e2-kube-api-access-qnnlp\") pod \"placement-db-sync-whfxs\" (UID: \"aeeab218-54fe-4892-b3f8-60b166ad72e2\") " pod="openstack/placement-db-sync-whfxs" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.826469 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e99d940a-73cb-41bb-b5a9-7003544b002a-config-data\") pod \"ceilometer-0\" (UID: \"e99d940a-73cb-41bb-b5a9-7003544b002a\") " pod="openstack/ceilometer-0" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.826498 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9257c931-7544-4f45-9589-e735e55bcca2-ovsdbserver-nb\") pod \"dnsmasq-dns-7987f74bbc-ml4m5\" (UID: \"9257c931-7544-4f45-9589-e735e55bcca2\") " pod="openstack/dnsmasq-dns-7987f74bbc-ml4m5" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.826520 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeeab218-54fe-4892-b3f8-60b166ad72e2-config-data\") pod \"placement-db-sync-whfxs\" (UID: \"aeeab218-54fe-4892-b3f8-60b166ad72e2\") " pod="openstack/placement-db-sync-whfxs" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.826711 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ltr8\" (UniqueName: \"kubernetes.io/projected/e99d940a-73cb-41bb-b5a9-7003544b002a-kube-api-access-4ltr8\") pod \"ceilometer-0\" (UID: \"e99d940a-73cb-41bb-b5a9-7003544b002a\") " pod="openstack/ceilometer-0" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.857737 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-54f9b7b8d9-f2ssv" event={"ID":"019d6d6f-c2d5-4b02-b771-7c011fba99ba","Type":"ContainerStarted","Data":"72a026de3e4f2620cabaa80e6e4133fb45dc9a7546d76fe03429f19a342c0174"} Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.929015 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeeab218-54fe-4892-b3f8-60b166ad72e2-combined-ca-bundle\") pod \"placement-db-sync-whfxs\" (UID: \"aeeab218-54fe-4892-b3f8-60b166ad72e2\") " pod="openstack/placement-db-sync-whfxs" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.929107 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnnlp\" (UniqueName: \"kubernetes.io/projected/aeeab218-54fe-4892-b3f8-60b166ad72e2-kube-api-access-qnnlp\") pod \"placement-db-sync-whfxs\" (UID: \"aeeab218-54fe-4892-b3f8-60b166ad72e2\") " pod="openstack/placement-db-sync-whfxs" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.929153 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e99d940a-73cb-41bb-b5a9-7003544b002a-config-data\") pod \"ceilometer-0\" (UID: \"e99d940a-73cb-41bb-b5a9-7003544b002a\") " pod="openstack/ceilometer-0" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.929192 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9257c931-7544-4f45-9589-e735e55bcca2-ovsdbserver-nb\") pod \"dnsmasq-dns-7987f74bbc-ml4m5\" (UID: \"9257c931-7544-4f45-9589-e735e55bcca2\") " pod="openstack/dnsmasq-dns-7987f74bbc-ml4m5" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.929213 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeeab218-54fe-4892-b3f8-60b166ad72e2-config-data\") pod \"placement-db-sync-whfxs\" (UID: \"aeeab218-54fe-4892-b3f8-60b166ad72e2\") " pod="openstack/placement-db-sync-whfxs" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.929233 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ltr8\" (UniqueName: \"kubernetes.io/projected/e99d940a-73cb-41bb-b5a9-7003544b002a-kube-api-access-4ltr8\") pod \"ceilometer-0\" (UID: \"e99d940a-73cb-41bb-b5a9-7003544b002a\") " pod="openstack/ceilometer-0" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.929273 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aeeab218-54fe-4892-b3f8-60b166ad72e2-scripts\") pod \"placement-db-sync-whfxs\" (UID: \"aeeab218-54fe-4892-b3f8-60b166ad72e2\") " pod="openstack/placement-db-sync-whfxs" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.929294 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e99d940a-73cb-41bb-b5a9-7003544b002a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e99d940a-73cb-41bb-b5a9-7003544b002a\") " pod="openstack/ceilometer-0" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.929313 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e99d940a-73cb-41bb-b5a9-7003544b002a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e99d940a-73cb-41bb-b5a9-7003544b002a\") " pod="openstack/ceilometer-0" Nov 22 03:09:39 
crc kubenswrapper[4922]: I1122 03:09:39.929333 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9257c931-7544-4f45-9589-e735e55bcca2-ovsdbserver-sb\") pod \"dnsmasq-dns-7987f74bbc-ml4m5\" (UID: \"9257c931-7544-4f45-9589-e735e55bcca2\") " pod="openstack/dnsmasq-dns-7987f74bbc-ml4m5" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.929368 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e99d940a-73cb-41bb-b5a9-7003544b002a-run-httpd\") pod \"ceilometer-0\" (UID: \"e99d940a-73cb-41bb-b5a9-7003544b002a\") " pod="openstack/ceilometer-0" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.929387 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfw7j\" (UniqueName: \"kubernetes.io/projected/9257c931-7544-4f45-9589-e735e55bcca2-kube-api-access-gfw7j\") pod \"dnsmasq-dns-7987f74bbc-ml4m5\" (UID: \"9257c931-7544-4f45-9589-e735e55bcca2\") " pod="openstack/dnsmasq-dns-7987f74bbc-ml4m5" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.929409 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9257c931-7544-4f45-9589-e735e55bcca2-config\") pod \"dnsmasq-dns-7987f74bbc-ml4m5\" (UID: \"9257c931-7544-4f45-9589-e735e55bcca2\") " pod="openstack/dnsmasq-dns-7987f74bbc-ml4m5" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.929427 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e99d940a-73cb-41bb-b5a9-7003544b002a-log-httpd\") pod \"ceilometer-0\" (UID: \"e99d940a-73cb-41bb-b5a9-7003544b002a\") " pod="openstack/ceilometer-0" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.929483 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aeeab218-54fe-4892-b3f8-60b166ad72e2-logs\") pod \"placement-db-sync-whfxs\" (UID: \"aeeab218-54fe-4892-b3f8-60b166ad72e2\") " pod="openstack/placement-db-sync-whfxs" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.929508 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9257c931-7544-4f45-9589-e735e55bcca2-dns-svc\") pod \"dnsmasq-dns-7987f74bbc-ml4m5\" (UID: \"9257c931-7544-4f45-9589-e735e55bcca2\") " pod="openstack/dnsmasq-dns-7987f74bbc-ml4m5" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.929537 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e99d940a-73cb-41bb-b5a9-7003544b002a-scripts\") pod \"ceilometer-0\" (UID: \"e99d940a-73cb-41bb-b5a9-7003544b002a\") " pod="openstack/ceilometer-0" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.933186 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e99d940a-73cb-41bb-b5a9-7003544b002a-run-httpd\") pod \"ceilometer-0\" (UID: \"e99d940a-73cb-41bb-b5a9-7003544b002a\") " pod="openstack/ceilometer-0" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.934031 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9257c931-7544-4f45-9589-e735e55bcca2-config\") pod \"dnsmasq-dns-7987f74bbc-ml4m5\" (UID: 
\"9257c931-7544-4f45-9589-e735e55bcca2\") " pod="openstack/dnsmasq-dns-7987f74bbc-ml4m5" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.934952 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aeeab218-54fe-4892-b3f8-60b166ad72e2-logs\") pod \"placement-db-sync-whfxs\" (UID: \"aeeab218-54fe-4892-b3f8-60b166ad72e2\") " pod="openstack/placement-db-sync-whfxs" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.935648 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9257c931-7544-4f45-9589-e735e55bcca2-ovsdbserver-sb\") pod \"dnsmasq-dns-7987f74bbc-ml4m5\" (UID: \"9257c931-7544-4f45-9589-e735e55bcca2\") " pod="openstack/dnsmasq-dns-7987f74bbc-ml4m5" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.936042 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9257c931-7544-4f45-9589-e735e55bcca2-dns-svc\") pod \"dnsmasq-dns-7987f74bbc-ml4m5\" (UID: \"9257c931-7544-4f45-9589-e735e55bcca2\") " pod="openstack/dnsmasq-dns-7987f74bbc-ml4m5" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.936742 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e99d940a-73cb-41bb-b5a9-7003544b002a-log-httpd\") pod \"ceilometer-0\" (UID: \"e99d940a-73cb-41bb-b5a9-7003544b002a\") " pod="openstack/ceilometer-0" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.937358 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9257c931-7544-4f45-9589-e735e55bcca2-ovsdbserver-nb\") pod \"dnsmasq-dns-7987f74bbc-ml4m5\" (UID: \"9257c931-7544-4f45-9589-e735e55bcca2\") " pod="openstack/dnsmasq-dns-7987f74bbc-ml4m5" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.945012 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeeab218-54fe-4892-b3f8-60b166ad72e2-config-data\") pod \"placement-db-sync-whfxs\" (UID: \"aeeab218-54fe-4892-b3f8-60b166ad72e2\") " pod="openstack/placement-db-sync-whfxs" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.946665 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aeeab218-54fe-4892-b3f8-60b166ad72e2-scripts\") pod \"placement-db-sync-whfxs\" (UID: \"aeeab218-54fe-4892-b3f8-60b166ad72e2\") " pod="openstack/placement-db-sync-whfxs" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.946928 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e99d940a-73cb-41bb-b5a9-7003544b002a-scripts\") pod \"ceilometer-0\" (UID: \"e99d940a-73cb-41bb-b5a9-7003544b002a\") " pod="openstack/ceilometer-0" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.947306 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e99d940a-73cb-41bb-b5a9-7003544b002a-config-data\") pod \"ceilometer-0\" (UID: \"e99d940a-73cb-41bb-b5a9-7003544b002a\") " pod="openstack/ceilometer-0" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.947396 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e99d940a-73cb-41bb-b5a9-7003544b002a-sg-core-conf-yaml\") pod 
\"ceilometer-0\" (UID: \"e99d940a-73cb-41bb-b5a9-7003544b002a\") " pod="openstack/ceilometer-0" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.948508 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeeab218-54fe-4892-b3f8-60b166ad72e2-combined-ca-bundle\") pod \"placement-db-sync-whfxs\" (UID: \"aeeab218-54fe-4892-b3f8-60b166ad72e2\") " pod="openstack/placement-db-sync-whfxs" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.949491 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnnlp\" (UniqueName: \"kubernetes.io/projected/aeeab218-54fe-4892-b3f8-60b166ad72e2-kube-api-access-qnnlp\") pod \"placement-db-sync-whfxs\" (UID: \"aeeab218-54fe-4892-b3f8-60b166ad72e2\") " pod="openstack/placement-db-sync-whfxs" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.955620 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ltr8\" (UniqueName: \"kubernetes.io/projected/e99d940a-73cb-41bb-b5a9-7003544b002a-kube-api-access-4ltr8\") pod \"ceilometer-0\" (UID: \"e99d940a-73cb-41bb-b5a9-7003544b002a\") " pod="openstack/ceilometer-0" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.955860 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e99d940a-73cb-41bb-b5a9-7003544b002a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e99d940a-73cb-41bb-b5a9-7003544b002a\") " pod="openstack/ceilometer-0" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.959574 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfw7j\" (UniqueName: \"kubernetes.io/projected/9257c931-7544-4f45-9589-e735e55bcca2-kube-api-access-gfw7j\") pod \"dnsmasq-dns-7987f74bbc-ml4m5\" (UID: \"9257c931-7544-4f45-9589-e735e55bcca2\") " pod="openstack/dnsmasq-dns-7987f74bbc-ml4m5" Nov 22 03:09:39 crc kubenswrapper[4922]: I1122 03:09:39.990100 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-vc7j2"] Nov 22 03:09:40 crc kubenswrapper[4922]: I1122 03:09:40.108676 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-whfxs" Nov 22 03:09:40 crc kubenswrapper[4922]: I1122 03:09:40.118902 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-jr8pw"] Nov 22 03:09:40 crc kubenswrapper[4922]: I1122 03:09:40.123079 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 03:09:40 crc kubenswrapper[4922]: W1122 03:09:40.131558 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod836ddb92_3947_4955_8dc6_ec8e96357ca4.slice/crio-e086a7752b43b5dd8f6d3383728a2dafe45a74016e3d0bc07dd94b78921161f6 WatchSource:0}: Error finding container e086a7752b43b5dd8f6d3383728a2dafe45a74016e3d0bc07dd94b78921161f6: Status 404 returned error can't find the container with id e086a7752b43b5dd8f6d3383728a2dafe45a74016e3d0bc07dd94b78921161f6 Nov 22 03:09:40 crc kubenswrapper[4922]: I1122 03:09:40.155479 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-ml4m5" Nov 22 03:09:40 crc kubenswrapper[4922]: I1122 03:09:40.246380 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-75a2-account-create-wwhfk"] Nov 22 03:09:40 crc kubenswrapper[4922]: I1122 03:09:40.249000 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-75a2-account-create-wwhfk" Nov 22 03:09:40 crc kubenswrapper[4922]: I1122 03:09:40.251363 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Nov 22 03:09:40 crc kubenswrapper[4922]: I1122 03:09:40.275748 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-75a2-account-create-wwhfk"] Nov 22 03:09:40 crc kubenswrapper[4922]: I1122 03:09:40.338959 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbhmj\" (UniqueName: \"kubernetes.io/projected/f79ad5eb-fe16-4595-a535-664d15aba98a-kube-api-access-nbhmj\") pod \"barbican-75a2-account-create-wwhfk\" (UID: \"f79ad5eb-fe16-4595-a535-664d15aba98a\") " pod="openstack/barbican-75a2-account-create-wwhfk" Nov 22 03:09:40 crc kubenswrapper[4922]: I1122 03:09:40.441124 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-885b-account-create-79txk"] Nov 22 03:09:40 crc kubenswrapper[4922]: I1122 03:09:40.442830 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-885b-account-create-79txk" Nov 22 03:09:40 crc kubenswrapper[4922]: I1122 03:09:40.446496 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Nov 22 03:09:40 crc kubenswrapper[4922]: I1122 03:09:40.446970 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbhmj\" (UniqueName: \"kubernetes.io/projected/f79ad5eb-fe16-4595-a535-664d15aba98a-kube-api-access-nbhmj\") pod \"barbican-75a2-account-create-wwhfk\" (UID: \"f79ad5eb-fe16-4595-a535-664d15aba98a\") " pod="openstack/barbican-75a2-account-create-wwhfk" Nov 22 03:09:40 crc kubenswrapper[4922]: I1122 03:09:40.457837 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-885b-account-create-79txk"] Nov 22 03:09:40 crc kubenswrapper[4922]: I1122 03:09:40.467945 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbhmj\" (UniqueName: \"kubernetes.io/projected/f79ad5eb-fe16-4595-a535-664d15aba98a-kube-api-access-nbhmj\") pod \"barbican-75a2-account-create-wwhfk\" (UID: \"f79ad5eb-fe16-4595-a535-664d15aba98a\") " pod="openstack/barbican-75a2-account-create-wwhfk" Nov 22 03:09:40 crc kubenswrapper[4922]: I1122 03:09:40.557778 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjqn6\" (UniqueName: \"kubernetes.io/projected/46d2f699-5508-477b-8f5f-a80feb9a10b3-kube-api-access-tjqn6\") pod \"cinder-885b-account-create-79txk\" (UID: \"46d2f699-5508-477b-8f5f-a80feb9a10b3\") " pod="openstack/cinder-885b-account-create-79txk" Nov 22 03:09:40 crc kubenswrapper[4922]: I1122 03:09:40.638555 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-d495-account-create-b2mz9"] Nov 22 03:09:40 crc kubenswrapper[4922]: I1122 03:09:40.640414 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-d495-account-create-b2mz9" Nov 22 03:09:40 crc kubenswrapper[4922]: I1122 03:09:40.644713 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-d495-account-create-b2mz9"] Nov 22 03:09:40 crc kubenswrapper[4922]: I1122 03:09:40.644796 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Nov 22 03:09:40 crc kubenswrapper[4922]: I1122 03:09:40.662562 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjqn6\" (UniqueName: \"kubernetes.io/projected/46d2f699-5508-477b-8f5f-a80feb9a10b3-kube-api-access-tjqn6\") pod \"cinder-885b-account-create-79txk\" (UID: \"46d2f699-5508-477b-8f5f-a80feb9a10b3\") " pod="openstack/cinder-885b-account-create-79txk" Nov 22 03:09:40 crc kubenswrapper[4922]: I1122 03:09:40.681079 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjqn6\" (UniqueName: \"kubernetes.io/projected/46d2f699-5508-477b-8f5f-a80feb9a10b3-kube-api-access-tjqn6\") pod \"cinder-885b-account-create-79txk\" (UID: \"46d2f699-5508-477b-8f5f-a80feb9a10b3\") " pod="openstack/cinder-885b-account-create-79txk" Nov 22 03:09:40 crc kubenswrapper[4922]: I1122 03:09:40.696005 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-whfxs"] Nov 22 03:09:40 crc kubenswrapper[4922]: I1122 03:09:40.704378 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-75a2-account-create-wwhfk" Nov 22 03:09:40 crc kubenswrapper[4922]: I1122 03:09:40.765196 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-885b-account-create-79txk" Nov 22 03:09:40 crc kubenswrapper[4922]: I1122 03:09:40.765462 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8b5m\" (UniqueName: \"kubernetes.io/projected/58f2e3f8-0b26-4534-a01c-f261d5048821-kube-api-access-g8b5m\") pod \"neutron-d495-account-create-b2mz9\" (UID: \"58f2e3f8-0b26-4534-a01c-f261d5048821\") " pod="openstack/neutron-d495-account-create-b2mz9" Nov 22 03:09:40 crc kubenswrapper[4922]: I1122 03:09:40.796463 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-ml4m5"] Nov 22 03:09:40 crc kubenswrapper[4922]: I1122 03:09:40.867187 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8b5m\" (UniqueName: \"kubernetes.io/projected/58f2e3f8-0b26-4534-a01c-f261d5048821-kube-api-access-g8b5m\") pod \"neutron-d495-account-create-b2mz9\" (UID: \"58f2e3f8-0b26-4534-a01c-f261d5048821\") " pod="openstack/neutron-d495-account-create-b2mz9" Nov 22 03:09:40 crc kubenswrapper[4922]: I1122 03:09:40.868913 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:09:40 crc kubenswrapper[4922]: I1122 03:09:40.899005 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-whfxs" event={"ID":"aeeab218-54fe-4892-b3f8-60b166ad72e2","Type":"ContainerStarted","Data":"a6965cc16c80b4bea051c177cfd66dbd2ebe5a52e45092dd58794f1596534ec8"} Nov 22 03:09:40 crc kubenswrapper[4922]: I1122 03:09:40.904724 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8b5m\" (UniqueName: \"kubernetes.io/projected/58f2e3f8-0b26-4534-a01c-f261d5048821-kube-api-access-g8b5m\") pod \"neutron-d495-account-create-b2mz9\" (UID: 
\"58f2e3f8-0b26-4534-a01c-f261d5048821\") " pod="openstack/neutron-d495-account-create-b2mz9" Nov 22 03:09:40 crc kubenswrapper[4922]: I1122 03:09:40.913144 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6546db6db7-vc7j2" event={"ID":"e5f20a52-548e-471f-9db9-92056b41111d","Type":"ContainerStarted","Data":"0a955fc8daefbb728d583a40237dba0cbf2aba4cf0d1cd4c6407c9703e0ac76e"} Nov 22 03:09:40 crc kubenswrapper[4922]: I1122 03:09:40.917461 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-ml4m5" event={"ID":"9257c931-7544-4f45-9589-e735e55bcca2","Type":"ContainerStarted","Data":"2f8e55a16367ecd0b3b86635bc6ceb18369593821adf8bc6ffe259febb45bfe2"} Nov 22 03:09:40 crc kubenswrapper[4922]: I1122 03:09:40.922083 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jr8pw" event={"ID":"836ddb92-3947-4955-8dc6-ec8e96357ca4","Type":"ContainerStarted","Data":"e086a7752b43b5dd8f6d3383728a2dafe45a74016e3d0bc07dd94b78921161f6"} Nov 22 03:09:40 crc kubenswrapper[4922]: I1122 03:09:40.966046 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-d495-account-create-b2mz9" Nov 22 03:09:41 crc kubenswrapper[4922]: I1122 03:09:41.198184 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-75a2-account-create-wwhfk"] Nov 22 03:09:41 crc kubenswrapper[4922]: I1122 03:09:41.244937 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-d495-account-create-b2mz9"] Nov 22 03:09:41 crc kubenswrapper[4922]: I1122 03:09:41.343650 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-885b-account-create-79txk"] Nov 22 03:09:41 crc kubenswrapper[4922]: I1122 03:09:41.936415 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-75a2-account-create-wwhfk" event={"ID":"f79ad5eb-fe16-4595-a535-664d15aba98a","Type":"ContainerStarted","Data":"288976e74ca2c3550006f658fcefe8de42658848f7d338c9a84c9282bad34029"} Nov 22 03:09:41 crc kubenswrapper[4922]: I1122 03:09:41.938471 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e99d940a-73cb-41bb-b5a9-7003544b002a","Type":"ContainerStarted","Data":"f4b55afc81873f7cf24d4621dba6aab220be2ea33006f7a2e9ba4ffbc2fc6d2b"} Nov 22 03:09:41 crc kubenswrapper[4922]: I1122 03:09:41.940108 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-885b-account-create-79txk" event={"ID":"46d2f699-5508-477b-8f5f-a80feb9a10b3","Type":"ContainerStarted","Data":"9f55c99e30b4df9bec47d97a42f983ba3b4199efef0c55749868825cc3a7252c"} Nov 22 03:09:41 crc kubenswrapper[4922]: I1122 03:09:41.941894 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d495-account-create-b2mz9" event={"ID":"58f2e3f8-0b26-4534-a01c-f261d5048821","Type":"ContainerStarted","Data":"f03af7b6c50404661ef3af4ec03087476ba00a7e5d06dad3fb66bf33e8aba1d9"} Nov 22 03:09:42 crc kubenswrapper[4922]: I1122 03:09:42.877022 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:09:45 crc kubenswrapper[4922]: I1122 03:09:45.990024 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-f2ssv" event={"ID":"019d6d6f-c2d5-4b02-b771-7c011fba99ba","Type":"ContainerStarted","Data":"411f9bcedca861b591889873ad5b6655f40d4b043ca5cdb5cca2afba9eaf1ba9"} Nov 22 03:09:47 crc kubenswrapper[4922]: I1122 03:09:47.003346 4922 generic.go:334] "Generic (PLEG): 
container finished" podID="58f2e3f8-0b26-4534-a01c-f261d5048821" containerID="733df6deffe6c0f591959831fef98d9c16b0695ce44cb6c1dab2f05e439ad628" exitCode=0 Nov 22 03:09:47 crc kubenswrapper[4922]: I1122 03:09:47.003418 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d495-account-create-b2mz9" event={"ID":"58f2e3f8-0b26-4534-a01c-f261d5048821","Type":"ContainerDied","Data":"733df6deffe6c0f591959831fef98d9c16b0695ce44cb6c1dab2f05e439ad628"} Nov 22 03:09:47 crc kubenswrapper[4922]: I1122 03:09:47.021257 4922 generic.go:334] "Generic (PLEG): container finished" podID="9257c931-7544-4f45-9589-e735e55bcca2" containerID="b87b0217af83e867eaf23471484bd500ff4aad293958784ad10e7484611da195" exitCode=0 Nov 22 03:09:47 crc kubenswrapper[4922]: I1122 03:09:47.021370 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-ml4m5" event={"ID":"9257c931-7544-4f45-9589-e735e55bcca2","Type":"ContainerDied","Data":"b87b0217af83e867eaf23471484bd500ff4aad293958784ad10e7484611da195"} Nov 22 03:09:47 crc kubenswrapper[4922]: I1122 03:09:47.025900 4922 generic.go:334] "Generic (PLEG): container finished" podID="f79ad5eb-fe16-4595-a535-664d15aba98a" containerID="e7d50cbb939cbca00d444abe24b5d0f1baec9dd1c77c8a4efd56beeed4c8fc1a" exitCode=0 Nov 22 03:09:47 crc kubenswrapper[4922]: I1122 03:09:47.025975 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-75a2-account-create-wwhfk" event={"ID":"f79ad5eb-fe16-4595-a535-664d15aba98a","Type":"ContainerDied","Data":"e7d50cbb939cbca00d444abe24b5d0f1baec9dd1c77c8a4efd56beeed4c8fc1a"} Nov 22 03:09:47 crc kubenswrapper[4922]: I1122 03:09:47.035534 4922 generic.go:334] "Generic (PLEG): container finished" podID="019d6d6f-c2d5-4b02-b771-7c011fba99ba" containerID="411f9bcedca861b591889873ad5b6655f40d4b043ca5cdb5cca2afba9eaf1ba9" exitCode=0 Nov 22 03:09:47 crc kubenswrapper[4922]: I1122 03:09:47.035803 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-f2ssv" event={"ID":"019d6d6f-c2d5-4b02-b771-7c011fba99ba","Type":"ContainerDied","Data":"411f9bcedca861b591889873ad5b6655f40d4b043ca5cdb5cca2afba9eaf1ba9"} Nov 22 03:09:47 crc kubenswrapper[4922]: I1122 03:09:47.042770 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jr8pw" event={"ID":"836ddb92-3947-4955-8dc6-ec8e96357ca4","Type":"ContainerStarted","Data":"f6ed80e0dd0701393dc9edbf59d065ba69eed83068bd4025f0a2bddc44448c09"} Nov 22 03:09:47 crc kubenswrapper[4922]: I1122 03:09:47.050809 4922 generic.go:334] "Generic (PLEG): container finished" podID="e5f20a52-548e-471f-9db9-92056b41111d" containerID="ad06a6adb84ae17327efbc0131bd3c12610aad2581f3984ef3a50f371b1b8287" exitCode=0 Nov 22 03:09:47 crc kubenswrapper[4922]: I1122 03:09:47.051005 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6546db6db7-vc7j2" event={"ID":"e5f20a52-548e-471f-9db9-92056b41111d","Type":"ContainerDied","Data":"ad06a6adb84ae17327efbc0131bd3c12610aad2581f3984ef3a50f371b1b8287"} Nov 22 03:09:47 crc kubenswrapper[4922]: I1122 03:09:47.058257 4922 generic.go:334] "Generic (PLEG): container finished" podID="46d2f699-5508-477b-8f5f-a80feb9a10b3" containerID="fd4a3511c40b414c022fc64007fb1bab9ce0ba88acb96abb737da96793fa8c60" exitCode=0 Nov 22 03:09:47 crc kubenswrapper[4922]: I1122 03:09:47.058324 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-885b-account-create-79txk" 
event={"ID":"46d2f699-5508-477b-8f5f-a80feb9a10b3","Type":"ContainerDied","Data":"fd4a3511c40b414c022fc64007fb1bab9ce0ba88acb96abb737da96793fa8c60"} Nov 22 03:09:47 crc kubenswrapper[4922]: I1122 03:09:47.111987 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-jr8pw" podStartSLOduration=8.111966878 podStartE2EDuration="8.111966878s" podCreationTimestamp="2025-11-22 03:09:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:09:47.093387292 +0000 UTC m=+1023.131909184" watchObservedRunningTime="2025-11-22 03:09:47.111966878 +0000 UTC m=+1023.150488770" Nov 22 03:09:47 crc kubenswrapper[4922]: I1122 03:09:47.523283 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54f9b7b8d9-f2ssv" Nov 22 03:09:47 crc kubenswrapper[4922]: I1122 03:09:47.532023 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6546db6db7-vc7j2" Nov 22 03:09:47 crc kubenswrapper[4922]: I1122 03:09:47.628636 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/019d6d6f-c2d5-4b02-b771-7c011fba99ba-dns-svc\") pod \"019d6d6f-c2d5-4b02-b771-7c011fba99ba\" (UID: \"019d6d6f-c2d5-4b02-b771-7c011fba99ba\") " Nov 22 03:09:47 crc kubenswrapper[4922]: I1122 03:09:47.628682 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgdbj\" (UniqueName: \"kubernetes.io/projected/e5f20a52-548e-471f-9db9-92056b41111d-kube-api-access-bgdbj\") pod \"e5f20a52-548e-471f-9db9-92056b41111d\" (UID: \"e5f20a52-548e-471f-9db9-92056b41111d\") " Nov 22 03:09:47 crc kubenswrapper[4922]: I1122 03:09:47.628756 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e5f20a52-548e-471f-9db9-92056b41111d-ovsdbserver-sb\") pod \"e5f20a52-548e-471f-9db9-92056b41111d\" (UID: \"e5f20a52-548e-471f-9db9-92056b41111d\") " Nov 22 03:09:47 crc kubenswrapper[4922]: I1122 03:09:47.628802 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkppn\" (UniqueName: \"kubernetes.io/projected/019d6d6f-c2d5-4b02-b771-7c011fba99ba-kube-api-access-jkppn\") pod \"019d6d6f-c2d5-4b02-b771-7c011fba99ba\" (UID: \"019d6d6f-c2d5-4b02-b771-7c011fba99ba\") " Nov 22 03:09:47 crc kubenswrapper[4922]: I1122 03:09:47.628860 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5f20a52-548e-471f-9db9-92056b41111d-config\") pod \"e5f20a52-548e-471f-9db9-92056b41111d\" (UID: \"e5f20a52-548e-471f-9db9-92056b41111d\") " Nov 22 03:09:47 crc kubenswrapper[4922]: I1122 03:09:47.628888 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5f20a52-548e-471f-9db9-92056b41111d-dns-svc\") pod \"e5f20a52-548e-471f-9db9-92056b41111d\" (UID: \"e5f20a52-548e-471f-9db9-92056b41111d\") " Nov 22 03:09:47 crc kubenswrapper[4922]: I1122 03:09:47.628946 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/019d6d6f-c2d5-4b02-b771-7c011fba99ba-config\") pod \"019d6d6f-c2d5-4b02-b771-7c011fba99ba\" (UID: \"019d6d6f-c2d5-4b02-b771-7c011fba99ba\") " Nov 22 
03:09:47 crc kubenswrapper[4922]: I1122 03:09:47.629014 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/019d6d6f-c2d5-4b02-b771-7c011fba99ba-ovsdbserver-sb\") pod \"019d6d6f-c2d5-4b02-b771-7c011fba99ba\" (UID: \"019d6d6f-c2d5-4b02-b771-7c011fba99ba\") " Nov 22 03:09:47 crc kubenswrapper[4922]: I1122 03:09:47.629036 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/019d6d6f-c2d5-4b02-b771-7c011fba99ba-ovsdbserver-nb\") pod \"019d6d6f-c2d5-4b02-b771-7c011fba99ba\" (UID: \"019d6d6f-c2d5-4b02-b771-7c011fba99ba\") " Nov 22 03:09:47 crc kubenswrapper[4922]: I1122 03:09:47.629061 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e5f20a52-548e-471f-9db9-92056b41111d-ovsdbserver-nb\") pod \"e5f20a52-548e-471f-9db9-92056b41111d\" (UID: \"e5f20a52-548e-471f-9db9-92056b41111d\") " Nov 22 03:09:47 crc kubenswrapper[4922]: I1122 03:09:47.637117 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/019d6d6f-c2d5-4b02-b771-7c011fba99ba-kube-api-access-jkppn" (OuterVolumeSpecName: "kube-api-access-jkppn") pod "019d6d6f-c2d5-4b02-b771-7c011fba99ba" (UID: "019d6d6f-c2d5-4b02-b771-7c011fba99ba"). InnerVolumeSpecName "kube-api-access-jkppn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:09:47 crc kubenswrapper[4922]: I1122 03:09:47.640047 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5f20a52-548e-471f-9db9-92056b41111d-kube-api-access-bgdbj" (OuterVolumeSpecName: "kube-api-access-bgdbj") pod "e5f20a52-548e-471f-9db9-92056b41111d" (UID: "e5f20a52-548e-471f-9db9-92056b41111d"). InnerVolumeSpecName "kube-api-access-bgdbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:09:47 crc kubenswrapper[4922]: I1122 03:09:47.653667 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5f20a52-548e-471f-9db9-92056b41111d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e5f20a52-548e-471f-9db9-92056b41111d" (UID: "e5f20a52-548e-471f-9db9-92056b41111d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:09:47 crc kubenswrapper[4922]: I1122 03:09:47.654026 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/019d6d6f-c2d5-4b02-b771-7c011fba99ba-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "019d6d6f-c2d5-4b02-b771-7c011fba99ba" (UID: "019d6d6f-c2d5-4b02-b771-7c011fba99ba"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:09:47 crc kubenswrapper[4922]: I1122 03:09:47.655382 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/019d6d6f-c2d5-4b02-b771-7c011fba99ba-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "019d6d6f-c2d5-4b02-b771-7c011fba99ba" (UID: "019d6d6f-c2d5-4b02-b771-7c011fba99ba"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:09:47 crc kubenswrapper[4922]: I1122 03:09:47.663816 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5f20a52-548e-471f-9db9-92056b41111d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e5f20a52-548e-471f-9db9-92056b41111d" (UID: "e5f20a52-548e-471f-9db9-92056b41111d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:09:47 crc kubenswrapper[4922]: I1122 03:09:47.668521 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5f20a52-548e-471f-9db9-92056b41111d-config" (OuterVolumeSpecName: "config") pod "e5f20a52-548e-471f-9db9-92056b41111d" (UID: "e5f20a52-548e-471f-9db9-92056b41111d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:09:47 crc kubenswrapper[4922]: I1122 03:09:47.669621 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/019d6d6f-c2d5-4b02-b771-7c011fba99ba-config" (OuterVolumeSpecName: "config") pod "019d6d6f-c2d5-4b02-b771-7c011fba99ba" (UID: "019d6d6f-c2d5-4b02-b771-7c011fba99ba"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:09:47 crc kubenswrapper[4922]: I1122 03:09:47.677796 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5f20a52-548e-471f-9db9-92056b41111d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e5f20a52-548e-471f-9db9-92056b41111d" (UID: "e5f20a52-548e-471f-9db9-92056b41111d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:09:47 crc kubenswrapper[4922]: I1122 03:09:47.679247 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/019d6d6f-c2d5-4b02-b771-7c011fba99ba-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "019d6d6f-c2d5-4b02-b771-7c011fba99ba" (UID: "019d6d6f-c2d5-4b02-b771-7c011fba99ba"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:09:47 crc kubenswrapper[4922]: I1122 03:09:47.731388 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e5f20a52-548e-471f-9db9-92056b41111d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:47 crc kubenswrapper[4922]: I1122 03:09:47.731434 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkppn\" (UniqueName: \"kubernetes.io/projected/019d6d6f-c2d5-4b02-b771-7c011fba99ba-kube-api-access-jkppn\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:47 crc kubenswrapper[4922]: I1122 03:09:47.731454 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5f20a52-548e-471f-9db9-92056b41111d-config\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:47 crc kubenswrapper[4922]: I1122 03:09:47.731464 4922 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5f20a52-548e-471f-9db9-92056b41111d-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:47 crc kubenswrapper[4922]: I1122 03:09:47.731478 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/019d6d6f-c2d5-4b02-b771-7c011fba99ba-config\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:47 crc kubenswrapper[4922]: I1122 03:09:47.731492 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/019d6d6f-c2d5-4b02-b771-7c011fba99ba-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:47 crc kubenswrapper[4922]: I1122 03:09:47.731501 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/019d6d6f-c2d5-4b02-b771-7c011fba99ba-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:47 crc kubenswrapper[4922]: I1122 03:09:47.731510 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e5f20a52-548e-471f-9db9-92056b41111d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:47 crc kubenswrapper[4922]: I1122 03:09:47.731519 4922 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/019d6d6f-c2d5-4b02-b771-7c011fba99ba-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:47 crc kubenswrapper[4922]: I1122 03:09:47.731529 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgdbj\" (UniqueName: \"kubernetes.io/projected/e5f20a52-548e-471f-9db9-92056b41111d-kube-api-access-bgdbj\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:48 crc kubenswrapper[4922]: I1122 03:09:48.073704 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6546db6db7-vc7j2" event={"ID":"e5f20a52-548e-471f-9db9-92056b41111d","Type":"ContainerDied","Data":"0a955fc8daefbb728d583a40237dba0cbf2aba4cf0d1cd4c6407c9703e0ac76e"} Nov 22 03:09:48 crc kubenswrapper[4922]: I1122 03:09:48.073779 4922 scope.go:117] "RemoveContainer" containerID="ad06a6adb84ae17327efbc0131bd3c12610aad2581f3984ef3a50f371b1b8287" Nov 22 03:09:48 crc kubenswrapper[4922]: I1122 03:09:48.074002 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6546db6db7-vc7j2" Nov 22 03:09:48 crc kubenswrapper[4922]: I1122 03:09:48.085487 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-ml4m5" event={"ID":"9257c931-7544-4f45-9589-e735e55bcca2","Type":"ContainerStarted","Data":"e409962e9a0bec937cd7dcaf7f7f301719dfeb171b5ae9dea8c6f1369628d036"} Nov 22 03:09:48 crc kubenswrapper[4922]: I1122 03:09:48.085651 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7987f74bbc-ml4m5" Nov 22 03:09:48 crc kubenswrapper[4922]: I1122 03:09:48.088033 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54f9b7b8d9-f2ssv" Nov 22 03:09:48 crc kubenswrapper[4922]: I1122 03:09:48.091002 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-f2ssv" event={"ID":"019d6d6f-c2d5-4b02-b771-7c011fba99ba","Type":"ContainerDied","Data":"72a026de3e4f2620cabaa80e6e4133fb45dc9a7546d76fe03429f19a342c0174"} Nov 22 03:09:48 crc kubenswrapper[4922]: I1122 03:09:48.135596 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7987f74bbc-ml4m5" podStartSLOduration=9.13555115 podStartE2EDuration="9.13555115s" podCreationTimestamp="2025-11-22 03:09:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:09:48.114992887 +0000 UTC m=+1024.153514779" watchObservedRunningTime="2025-11-22 03:09:48.13555115 +0000 UTC m=+1024.174073052" Nov 22 03:09:48 crc kubenswrapper[4922]: I1122 03:09:48.229223 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-f2ssv"] Nov 22 03:09:48 crc kubenswrapper[4922]: I1122 03:09:48.237505 4922 scope.go:117] "RemoveContainer" containerID="411f9bcedca861b591889873ad5b6655f40d4b043ca5cdb5cca2afba9eaf1ba9" Nov 22 03:09:48 crc kubenswrapper[4922]: I1122 03:09:48.245621 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-f2ssv"] Nov 22 03:09:48 crc kubenswrapper[4922]: I1122 03:09:48.341627 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-vc7j2"] Nov 22 03:09:48 crc kubenswrapper[4922]: I1122 03:09:48.363409 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-vc7j2"] Nov 22 03:09:49 crc kubenswrapper[4922]: I1122 03:09:49.320623 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="019d6d6f-c2d5-4b02-b771-7c011fba99ba" path="/var/lib/kubelet/pods/019d6d6f-c2d5-4b02-b771-7c011fba99ba/volumes" Nov 22 03:09:49 crc kubenswrapper[4922]: I1122 03:09:49.322336 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5f20a52-548e-471f-9db9-92056b41111d" path="/var/lib/kubelet/pods/e5f20a52-548e-471f-9db9-92056b41111d/volumes" Nov 22 03:09:50 crc kubenswrapper[4922]: I1122 03:09:50.113553 4922 generic.go:334] "Generic (PLEG): container finished" podID="836ddb92-3947-4955-8dc6-ec8e96357ca4" containerID="f6ed80e0dd0701393dc9edbf59d065ba69eed83068bd4025f0a2bddc44448c09" exitCode=0 Nov 22 03:09:50 crc kubenswrapper[4922]: I1122 03:09:50.113642 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jr8pw" event={"ID":"836ddb92-3947-4955-8dc6-ec8e96357ca4","Type":"ContainerDied","Data":"f6ed80e0dd0701393dc9edbf59d065ba69eed83068bd4025f0a2bddc44448c09"} Nov 22 03:09:52 crc 
kubenswrapper[4922]: I1122 03:09:52.521530 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-d495-account-create-b2mz9" Nov 22 03:09:52 crc kubenswrapper[4922]: I1122 03:09:52.530752 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-75a2-account-create-wwhfk" Nov 22 03:09:52 crc kubenswrapper[4922]: I1122 03:09:52.547371 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-885b-account-create-79txk" Nov 22 03:09:52 crc kubenswrapper[4922]: I1122 03:09:52.557893 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-jr8pw" Nov 22 03:09:52 crc kubenswrapper[4922]: I1122 03:09:52.640192 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/836ddb92-3947-4955-8dc6-ec8e96357ca4-fernet-keys\") pod \"836ddb92-3947-4955-8dc6-ec8e96357ca4\" (UID: \"836ddb92-3947-4955-8dc6-ec8e96357ca4\") " Nov 22 03:09:52 crc kubenswrapper[4922]: I1122 03:09:52.640298 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nvgz\" (UniqueName: \"kubernetes.io/projected/836ddb92-3947-4955-8dc6-ec8e96357ca4-kube-api-access-7nvgz\") pod \"836ddb92-3947-4955-8dc6-ec8e96357ca4\" (UID: \"836ddb92-3947-4955-8dc6-ec8e96357ca4\") " Nov 22 03:09:52 crc kubenswrapper[4922]: I1122 03:09:52.640341 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8b5m\" (UniqueName: \"kubernetes.io/projected/58f2e3f8-0b26-4534-a01c-f261d5048821-kube-api-access-g8b5m\") pod \"58f2e3f8-0b26-4534-a01c-f261d5048821\" (UID: \"58f2e3f8-0b26-4534-a01c-f261d5048821\") " Nov 22 03:09:52 crc kubenswrapper[4922]: I1122 03:09:52.640365 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/836ddb92-3947-4955-8dc6-ec8e96357ca4-credential-keys\") pod \"836ddb92-3947-4955-8dc6-ec8e96357ca4\" (UID: \"836ddb92-3947-4955-8dc6-ec8e96357ca4\") " Nov 22 03:09:52 crc kubenswrapper[4922]: I1122 03:09:52.640385 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjqn6\" (UniqueName: \"kubernetes.io/projected/46d2f699-5508-477b-8f5f-a80feb9a10b3-kube-api-access-tjqn6\") pod \"46d2f699-5508-477b-8f5f-a80feb9a10b3\" (UID: \"46d2f699-5508-477b-8f5f-a80feb9a10b3\") " Nov 22 03:09:52 crc kubenswrapper[4922]: I1122 03:09:52.640411 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/836ddb92-3947-4955-8dc6-ec8e96357ca4-combined-ca-bundle\") pod \"836ddb92-3947-4955-8dc6-ec8e96357ca4\" (UID: \"836ddb92-3947-4955-8dc6-ec8e96357ca4\") " Nov 22 03:09:52 crc kubenswrapper[4922]: I1122 03:09:52.640514 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/836ddb92-3947-4955-8dc6-ec8e96357ca4-config-data\") pod \"836ddb92-3947-4955-8dc6-ec8e96357ca4\" (UID: \"836ddb92-3947-4955-8dc6-ec8e96357ca4\") " Nov 22 03:09:52 crc kubenswrapper[4922]: I1122 03:09:52.640541 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbhmj\" (UniqueName: \"kubernetes.io/projected/f79ad5eb-fe16-4595-a535-664d15aba98a-kube-api-access-nbhmj\") pod 
\"f79ad5eb-fe16-4595-a535-664d15aba98a\" (UID: \"f79ad5eb-fe16-4595-a535-664d15aba98a\") " Nov 22 03:09:52 crc kubenswrapper[4922]: I1122 03:09:52.640564 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/836ddb92-3947-4955-8dc6-ec8e96357ca4-scripts\") pod \"836ddb92-3947-4955-8dc6-ec8e96357ca4\" (UID: \"836ddb92-3947-4955-8dc6-ec8e96357ca4\") " Nov 22 03:09:52 crc kubenswrapper[4922]: I1122 03:09:52.648032 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/836ddb92-3947-4955-8dc6-ec8e96357ca4-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "836ddb92-3947-4955-8dc6-ec8e96357ca4" (UID: "836ddb92-3947-4955-8dc6-ec8e96357ca4"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:09:52 crc kubenswrapper[4922]: I1122 03:09:52.648523 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/836ddb92-3947-4955-8dc6-ec8e96357ca4-kube-api-access-7nvgz" (OuterVolumeSpecName: "kube-api-access-7nvgz") pod "836ddb92-3947-4955-8dc6-ec8e96357ca4" (UID: "836ddb92-3947-4955-8dc6-ec8e96357ca4"). InnerVolumeSpecName "kube-api-access-7nvgz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:09:52 crc kubenswrapper[4922]: I1122 03:09:52.649093 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58f2e3f8-0b26-4534-a01c-f261d5048821-kube-api-access-g8b5m" (OuterVolumeSpecName: "kube-api-access-g8b5m") pod "58f2e3f8-0b26-4534-a01c-f261d5048821" (UID: "58f2e3f8-0b26-4534-a01c-f261d5048821"). InnerVolumeSpecName "kube-api-access-g8b5m". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:09:52 crc kubenswrapper[4922]: I1122 03:09:52.651010 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/836ddb92-3947-4955-8dc6-ec8e96357ca4-scripts" (OuterVolumeSpecName: "scripts") pod "836ddb92-3947-4955-8dc6-ec8e96357ca4" (UID: "836ddb92-3947-4955-8dc6-ec8e96357ca4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:09:52 crc kubenswrapper[4922]: I1122 03:09:52.652508 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/836ddb92-3947-4955-8dc6-ec8e96357ca4-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "836ddb92-3947-4955-8dc6-ec8e96357ca4" (UID: "836ddb92-3947-4955-8dc6-ec8e96357ca4"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:09:52 crc kubenswrapper[4922]: I1122 03:09:52.652667 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46d2f699-5508-477b-8f5f-a80feb9a10b3-kube-api-access-tjqn6" (OuterVolumeSpecName: "kube-api-access-tjqn6") pod "46d2f699-5508-477b-8f5f-a80feb9a10b3" (UID: "46d2f699-5508-477b-8f5f-a80feb9a10b3"). InnerVolumeSpecName "kube-api-access-tjqn6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:09:52 crc kubenswrapper[4922]: I1122 03:09:52.657908 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f79ad5eb-fe16-4595-a535-664d15aba98a-kube-api-access-nbhmj" (OuterVolumeSpecName: "kube-api-access-nbhmj") pod "f79ad5eb-fe16-4595-a535-664d15aba98a" (UID: "f79ad5eb-fe16-4595-a535-664d15aba98a"). InnerVolumeSpecName "kube-api-access-nbhmj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:09:52 crc kubenswrapper[4922]: I1122 03:09:52.676069 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/836ddb92-3947-4955-8dc6-ec8e96357ca4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "836ddb92-3947-4955-8dc6-ec8e96357ca4" (UID: "836ddb92-3947-4955-8dc6-ec8e96357ca4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:09:52 crc kubenswrapper[4922]: I1122 03:09:52.678869 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/836ddb92-3947-4955-8dc6-ec8e96357ca4-config-data" (OuterVolumeSpecName: "config-data") pod "836ddb92-3947-4955-8dc6-ec8e96357ca4" (UID: "836ddb92-3947-4955-8dc6-ec8e96357ca4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:09:52 crc kubenswrapper[4922]: I1122 03:09:52.742561 4922 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/836ddb92-3947-4955-8dc6-ec8e96357ca4-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:52 crc kubenswrapper[4922]: I1122 03:09:52.742598 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7nvgz\" (UniqueName: \"kubernetes.io/projected/836ddb92-3947-4955-8dc6-ec8e96357ca4-kube-api-access-7nvgz\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:52 crc kubenswrapper[4922]: I1122 03:09:52.742613 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8b5m\" (UniqueName: \"kubernetes.io/projected/58f2e3f8-0b26-4534-a01c-f261d5048821-kube-api-access-g8b5m\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:52 crc kubenswrapper[4922]: I1122 03:09:52.742621 4922 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/836ddb92-3947-4955-8dc6-ec8e96357ca4-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:52 crc kubenswrapper[4922]: I1122 03:09:52.742631 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjqn6\" (UniqueName: \"kubernetes.io/projected/46d2f699-5508-477b-8f5f-a80feb9a10b3-kube-api-access-tjqn6\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:52 crc kubenswrapper[4922]: I1122 03:09:52.742639 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/836ddb92-3947-4955-8dc6-ec8e96357ca4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:52 crc kubenswrapper[4922]: I1122 03:09:52.742649 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/836ddb92-3947-4955-8dc6-ec8e96357ca4-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:52 crc kubenswrapper[4922]: I1122 03:09:52.742660 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbhmj\" (UniqueName: \"kubernetes.io/projected/f79ad5eb-fe16-4595-a535-664d15aba98a-kube-api-access-nbhmj\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:52 crc kubenswrapper[4922]: I1122 03:09:52.742668 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/836ddb92-3947-4955-8dc6-ec8e96357ca4-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:53 crc kubenswrapper[4922]: I1122 03:09:53.148078 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d495-account-create-b2mz9" 
event={"ID":"58f2e3f8-0b26-4534-a01c-f261d5048821","Type":"ContainerDied","Data":"f03af7b6c50404661ef3af4ec03087476ba00a7e5d06dad3fb66bf33e8aba1d9"} Nov 22 03:09:53 crc kubenswrapper[4922]: I1122 03:09:53.148133 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f03af7b6c50404661ef3af4ec03087476ba00a7e5d06dad3fb66bf33e8aba1d9" Nov 22 03:09:53 crc kubenswrapper[4922]: I1122 03:09:53.148236 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-d495-account-create-b2mz9" Nov 22 03:09:53 crc kubenswrapper[4922]: I1122 03:09:53.149021 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-75a2-account-create-wwhfk" event={"ID":"f79ad5eb-fe16-4595-a535-664d15aba98a","Type":"ContainerDied","Data":"288976e74ca2c3550006f658fcefe8de42658848f7d338c9a84c9282bad34029"} Nov 22 03:09:53 crc kubenswrapper[4922]: I1122 03:09:53.149066 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="288976e74ca2c3550006f658fcefe8de42658848f7d338c9a84c9282bad34029" Nov 22 03:09:53 crc kubenswrapper[4922]: I1122 03:09:53.149045 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-75a2-account-create-wwhfk" Nov 22 03:09:53 crc kubenswrapper[4922]: I1122 03:09:53.153378 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jr8pw" event={"ID":"836ddb92-3947-4955-8dc6-ec8e96357ca4","Type":"ContainerDied","Data":"e086a7752b43b5dd8f6d3383728a2dafe45a74016e3d0bc07dd94b78921161f6"} Nov 22 03:09:53 crc kubenswrapper[4922]: I1122 03:09:53.153416 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e086a7752b43b5dd8f6d3383728a2dafe45a74016e3d0bc07dd94b78921161f6" Nov 22 03:09:53 crc kubenswrapper[4922]: I1122 03:09:53.153433 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-jr8pw" Nov 22 03:09:53 crc kubenswrapper[4922]: I1122 03:09:53.155772 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-885b-account-create-79txk" event={"ID":"46d2f699-5508-477b-8f5f-a80feb9a10b3","Type":"ContainerDied","Data":"9f55c99e30b4df9bec47d97a42f983ba3b4199efef0c55749868825cc3a7252c"} Nov 22 03:09:53 crc kubenswrapper[4922]: I1122 03:09:53.155830 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f55c99e30b4df9bec47d97a42f983ba3b4199efef0c55749868825cc3a7252c" Nov 22 03:09:53 crc kubenswrapper[4922]: I1122 03:09:53.156125 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-885b-account-create-79txk" Nov 22 03:09:53 crc kubenswrapper[4922]: I1122 03:09:53.695901 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-jr8pw"] Nov 22 03:09:53 crc kubenswrapper[4922]: I1122 03:09:53.708140 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-jr8pw"] Nov 22 03:09:53 crc kubenswrapper[4922]: I1122 03:09:53.782154 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-jg47c"] Nov 22 03:09:53 crc kubenswrapper[4922]: E1122 03:09:53.782614 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="019d6d6f-c2d5-4b02-b771-7c011fba99ba" containerName="init" Nov 22 03:09:53 crc kubenswrapper[4922]: I1122 03:09:53.782632 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="019d6d6f-c2d5-4b02-b771-7c011fba99ba" containerName="init" Nov 22 03:09:53 crc kubenswrapper[4922]: E1122 03:09:53.782658 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f79ad5eb-fe16-4595-a535-664d15aba98a" containerName="mariadb-account-create" Nov 22 03:09:53 crc kubenswrapper[4922]: I1122 03:09:53.782668 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="f79ad5eb-fe16-4595-a535-664d15aba98a" containerName="mariadb-account-create" Nov 22 03:09:53 crc kubenswrapper[4922]: E1122 03:09:53.782680 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5f20a52-548e-471f-9db9-92056b41111d" containerName="init" Nov 22 03:09:53 crc kubenswrapper[4922]: I1122 03:09:53.782688 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5f20a52-548e-471f-9db9-92056b41111d" containerName="init" Nov 22 03:09:53 crc kubenswrapper[4922]: E1122 03:09:53.782704 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58f2e3f8-0b26-4534-a01c-f261d5048821" containerName="mariadb-account-create" Nov 22 03:09:53 crc kubenswrapper[4922]: I1122 03:09:53.782712 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="58f2e3f8-0b26-4534-a01c-f261d5048821" containerName="mariadb-account-create" Nov 22 03:09:53 crc kubenswrapper[4922]: E1122 03:09:53.782721 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46d2f699-5508-477b-8f5f-a80feb9a10b3" containerName="mariadb-account-create" Nov 22 03:09:53 crc kubenswrapper[4922]: I1122 03:09:53.782729 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="46d2f699-5508-477b-8f5f-a80feb9a10b3" containerName="mariadb-account-create" Nov 22 03:09:53 crc kubenswrapper[4922]: E1122 03:09:53.782742 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="836ddb92-3947-4955-8dc6-ec8e96357ca4" containerName="keystone-bootstrap" Nov 22 03:09:53 crc kubenswrapper[4922]: I1122 03:09:53.782751 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="836ddb92-3947-4955-8dc6-ec8e96357ca4" containerName="keystone-bootstrap" Nov 22 03:09:53 crc kubenswrapper[4922]: I1122 03:09:53.782990 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="46d2f699-5508-477b-8f5f-a80feb9a10b3" containerName="mariadb-account-create" Nov 22 03:09:53 crc kubenswrapper[4922]: I1122 03:09:53.783042 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5f20a52-548e-471f-9db9-92056b41111d" containerName="init" Nov 22 03:09:53 crc kubenswrapper[4922]: I1122 03:09:53.783074 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="58f2e3f8-0b26-4534-a01c-f261d5048821" containerName="mariadb-account-create" Nov 
22 03:09:53 crc kubenswrapper[4922]: I1122 03:09:53.783092 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="836ddb92-3947-4955-8dc6-ec8e96357ca4" containerName="keystone-bootstrap" Nov 22 03:09:53 crc kubenswrapper[4922]: I1122 03:09:53.783113 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="f79ad5eb-fe16-4595-a535-664d15aba98a" containerName="mariadb-account-create" Nov 22 03:09:53 crc kubenswrapper[4922]: I1122 03:09:53.783137 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="019d6d6f-c2d5-4b02-b771-7c011fba99ba" containerName="init" Nov 22 03:09:53 crc kubenswrapper[4922]: I1122 03:09:53.783998 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-jg47c" Nov 22 03:09:53 crc kubenswrapper[4922]: I1122 03:09:53.786380 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-cbjdr" Nov 22 03:09:53 crc kubenswrapper[4922]: I1122 03:09:53.786720 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 22 03:09:53 crc kubenswrapper[4922]: I1122 03:09:53.787214 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 22 03:09:53 crc kubenswrapper[4922]: I1122 03:09:53.788599 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 22 03:09:53 crc kubenswrapper[4922]: I1122 03:09:53.796679 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-jg47c"] Nov 22 03:09:53 crc kubenswrapper[4922]: I1122 03:09:53.860760 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61948bb8-1797-44d7-946a-906a010895b6-scripts\") pod \"keystone-bootstrap-jg47c\" (UID: \"61948bb8-1797-44d7-946a-906a010895b6\") " pod="openstack/keystone-bootstrap-jg47c" Nov 22 03:09:53 crc kubenswrapper[4922]: I1122 03:09:53.860882 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/61948bb8-1797-44d7-946a-906a010895b6-fernet-keys\") pod \"keystone-bootstrap-jg47c\" (UID: \"61948bb8-1797-44d7-946a-906a010895b6\") " pod="openstack/keystone-bootstrap-jg47c" Nov 22 03:09:53 crc kubenswrapper[4922]: I1122 03:09:53.860927 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/61948bb8-1797-44d7-946a-906a010895b6-credential-keys\") pod \"keystone-bootstrap-jg47c\" (UID: \"61948bb8-1797-44d7-946a-906a010895b6\") " pod="openstack/keystone-bootstrap-jg47c" Nov 22 03:09:53 crc kubenswrapper[4922]: I1122 03:09:53.860961 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmskl\" (UniqueName: \"kubernetes.io/projected/61948bb8-1797-44d7-946a-906a010895b6-kube-api-access-cmskl\") pod \"keystone-bootstrap-jg47c\" (UID: \"61948bb8-1797-44d7-946a-906a010895b6\") " pod="openstack/keystone-bootstrap-jg47c" Nov 22 03:09:53 crc kubenswrapper[4922]: I1122 03:09:53.861024 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61948bb8-1797-44d7-946a-906a010895b6-config-data\") pod \"keystone-bootstrap-jg47c\" (UID: \"61948bb8-1797-44d7-946a-906a010895b6\") " 
pod="openstack/keystone-bootstrap-jg47c" Nov 22 03:09:53 crc kubenswrapper[4922]: I1122 03:09:53.861047 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61948bb8-1797-44d7-946a-906a010895b6-combined-ca-bundle\") pod \"keystone-bootstrap-jg47c\" (UID: \"61948bb8-1797-44d7-946a-906a010895b6\") " pod="openstack/keystone-bootstrap-jg47c" Nov 22 03:09:53 crc kubenswrapper[4922]: I1122 03:09:53.963016 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61948bb8-1797-44d7-946a-906a010895b6-scripts\") pod \"keystone-bootstrap-jg47c\" (UID: \"61948bb8-1797-44d7-946a-906a010895b6\") " pod="openstack/keystone-bootstrap-jg47c" Nov 22 03:09:53 crc kubenswrapper[4922]: I1122 03:09:53.963122 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/61948bb8-1797-44d7-946a-906a010895b6-fernet-keys\") pod \"keystone-bootstrap-jg47c\" (UID: \"61948bb8-1797-44d7-946a-906a010895b6\") " pod="openstack/keystone-bootstrap-jg47c" Nov 22 03:09:53 crc kubenswrapper[4922]: I1122 03:09:53.963160 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/61948bb8-1797-44d7-946a-906a010895b6-credential-keys\") pod \"keystone-bootstrap-jg47c\" (UID: \"61948bb8-1797-44d7-946a-906a010895b6\") " pod="openstack/keystone-bootstrap-jg47c" Nov 22 03:09:53 crc kubenswrapper[4922]: I1122 03:09:53.963187 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmskl\" (UniqueName: \"kubernetes.io/projected/61948bb8-1797-44d7-946a-906a010895b6-kube-api-access-cmskl\") pod \"keystone-bootstrap-jg47c\" (UID: \"61948bb8-1797-44d7-946a-906a010895b6\") " pod="openstack/keystone-bootstrap-jg47c" Nov 22 03:09:53 crc kubenswrapper[4922]: I1122 03:09:53.963243 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61948bb8-1797-44d7-946a-906a010895b6-config-data\") pod \"keystone-bootstrap-jg47c\" (UID: \"61948bb8-1797-44d7-946a-906a010895b6\") " pod="openstack/keystone-bootstrap-jg47c" Nov 22 03:09:53 crc kubenswrapper[4922]: I1122 03:09:53.963270 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61948bb8-1797-44d7-946a-906a010895b6-combined-ca-bundle\") pod \"keystone-bootstrap-jg47c\" (UID: \"61948bb8-1797-44d7-946a-906a010895b6\") " pod="openstack/keystone-bootstrap-jg47c" Nov 22 03:09:53 crc kubenswrapper[4922]: I1122 03:09:53.969042 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61948bb8-1797-44d7-946a-906a010895b6-scripts\") pod \"keystone-bootstrap-jg47c\" (UID: \"61948bb8-1797-44d7-946a-906a010895b6\") " pod="openstack/keystone-bootstrap-jg47c" Nov 22 03:09:53 crc kubenswrapper[4922]: I1122 03:09:53.969663 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/61948bb8-1797-44d7-946a-906a010895b6-fernet-keys\") pod \"keystone-bootstrap-jg47c\" (UID: \"61948bb8-1797-44d7-946a-906a010895b6\") " pod="openstack/keystone-bootstrap-jg47c" Nov 22 03:09:53 crc kubenswrapper[4922]: I1122 03:09:53.970059 4922 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/61948bb8-1797-44d7-946a-906a010895b6-credential-keys\") pod \"keystone-bootstrap-jg47c\" (UID: \"61948bb8-1797-44d7-946a-906a010895b6\") " pod="openstack/keystone-bootstrap-jg47c" Nov 22 03:09:53 crc kubenswrapper[4922]: I1122 03:09:53.976510 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61948bb8-1797-44d7-946a-906a010895b6-combined-ca-bundle\") pod \"keystone-bootstrap-jg47c\" (UID: \"61948bb8-1797-44d7-946a-906a010895b6\") " pod="openstack/keystone-bootstrap-jg47c" Nov 22 03:09:53 crc kubenswrapper[4922]: I1122 03:09:53.981450 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmskl\" (UniqueName: \"kubernetes.io/projected/61948bb8-1797-44d7-946a-906a010895b6-kube-api-access-cmskl\") pod \"keystone-bootstrap-jg47c\" (UID: \"61948bb8-1797-44d7-946a-906a010895b6\") " pod="openstack/keystone-bootstrap-jg47c" Nov 22 03:09:53 crc kubenswrapper[4922]: I1122 03:09:53.982165 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61948bb8-1797-44d7-946a-906a010895b6-config-data\") pod \"keystone-bootstrap-jg47c\" (UID: \"61948bb8-1797-44d7-946a-906a010895b6\") " pod="openstack/keystone-bootstrap-jg47c" Nov 22 03:09:54 crc kubenswrapper[4922]: I1122 03:09:54.107011 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-jg47c" Nov 22 03:09:54 crc kubenswrapper[4922]: I1122 03:09:54.166811 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e99d940a-73cb-41bb-b5a9-7003544b002a","Type":"ContainerStarted","Data":"b1a0b9314347d3079e787b0ae400eaa8b91893dd85f653e98b7d90fb8cc6ac59"} Nov 22 03:09:54 crc kubenswrapper[4922]: I1122 03:09:54.169642 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-whfxs" event={"ID":"aeeab218-54fe-4892-b3f8-60b166ad72e2","Type":"ContainerStarted","Data":"04042aa32ebb1743c2e167accffad0efef37dcdf37331bf99fa94d37797805f2"} Nov 22 03:09:54 crc kubenswrapper[4922]: I1122 03:09:54.194285 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-whfxs" podStartSLOduration=2.975812682 podStartE2EDuration="15.194266027s" podCreationTimestamp="2025-11-22 03:09:39 +0000 UTC" firstStartedPulling="2025-11-22 03:09:40.698526694 +0000 UTC m=+1016.737048586" lastFinishedPulling="2025-11-22 03:09:52.916980039 +0000 UTC m=+1028.955501931" observedRunningTime="2025-11-22 03:09:54.187141576 +0000 UTC m=+1030.225663468" watchObservedRunningTime="2025-11-22 03:09:54.194266027 +0000 UTC m=+1030.232787929" Nov 22 03:09:54 crc kubenswrapper[4922]: I1122 03:09:54.713553 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-jg47c"] Nov 22 03:09:54 crc kubenswrapper[4922]: W1122 03:09:54.737887 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61948bb8_1797_44d7_946a_906a010895b6.slice/crio-9c59e490386d202212bfec22ec8fb1f1defcf7d562820532d8b248896fff4153 WatchSource:0}: Error finding container 9c59e490386d202212bfec22ec8fb1f1defcf7d562820532d8b248896fff4153: Status 404 returned error can't find the container with id 9c59e490386d202212bfec22ec8fb1f1defcf7d562820532d8b248896fff4153 Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 
03:09:55.157285 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7987f74bbc-ml4m5" Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.207863 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e99d940a-73cb-41bb-b5a9-7003544b002a","Type":"ContainerStarted","Data":"57a13d045747ad4e088b7844837786eff94bb58defba29150f3e49534cbda42b"} Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.215057 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jg47c" event={"ID":"61948bb8-1797-44d7-946a-906a010895b6","Type":"ContainerStarted","Data":"093fccbff9f62bce9938b23840a13dbd1808acd49654b8622bf6ceaf157a04f5"} Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.215335 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jg47c" event={"ID":"61948bb8-1797-44d7-946a-906a010895b6","Type":"ContainerStarted","Data":"9c59e490386d202212bfec22ec8fb1f1defcf7d562820532d8b248896fff4153"} Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.218934 4922 generic.go:334] "Generic (PLEG): container finished" podID="aeeab218-54fe-4892-b3f8-60b166ad72e2" containerID="04042aa32ebb1743c2e167accffad0efef37dcdf37331bf99fa94d37797805f2" exitCode=0 Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.219142 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-whfxs" event={"ID":"aeeab218-54fe-4892-b3f8-60b166ad72e2","Type":"ContainerDied","Data":"04042aa32ebb1743c2e167accffad0efef37dcdf37331bf99fa94d37797805f2"} Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.235615 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-l6qxq"] Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.236383 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-l6qxq" podUID="121927b8-f52a-4b01-89c1-85b1e694906a" containerName="dnsmasq-dns" containerID="cri-o://38ae327a6653dd7e3f571e10328cae3893315b5aa08c4541c1d98eca93ff60d3" gracePeriod=10 Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.277296 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-jg47c" podStartSLOduration=2.277269988 podStartE2EDuration="2.277269988s" podCreationTimestamp="2025-11-22 03:09:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:09:55.24900249 +0000 UTC m=+1031.287524462" watchObservedRunningTime="2025-11-22 03:09:55.277269988 +0000 UTC m=+1031.315791890" Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.323528 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="836ddb92-3947-4955-8dc6-ec8e96357ca4" path="/var/lib/kubelet/pods/836ddb92-3947-4955-8dc6-ec8e96357ca4/volumes" Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.683695 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-xgvlf"] Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.685827 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-xgvlf" Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.687219 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-mslgv" Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.688359 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.688531 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.707275 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-g8fzk"] Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.708678 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-g8fzk" Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.715458 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.715753 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-t7zfj" Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.727890 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-xgvlf"] Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.741365 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-l6qxq" Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.757142 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-g8fzk"] Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.801035 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/121927b8-f52a-4b01-89c1-85b1e694906a-dns-svc\") pod \"121927b8-f52a-4b01-89c1-85b1e694906a\" (UID: \"121927b8-f52a-4b01-89c1-85b1e694906a\") " Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.801238 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r658v\" (UniqueName: \"kubernetes.io/projected/121927b8-f52a-4b01-89c1-85b1e694906a-kube-api-access-r658v\") pod \"121927b8-f52a-4b01-89c1-85b1e694906a\" (UID: \"121927b8-f52a-4b01-89c1-85b1e694906a\") " Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.801493 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/121927b8-f52a-4b01-89c1-85b1e694906a-ovsdbserver-nb\") pod \"121927b8-f52a-4b01-89c1-85b1e694906a\" (UID: \"121927b8-f52a-4b01-89c1-85b1e694906a\") " Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.801540 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/121927b8-f52a-4b01-89c1-85b1e694906a-ovsdbserver-sb\") pod \"121927b8-f52a-4b01-89c1-85b1e694906a\" (UID: \"121927b8-f52a-4b01-89c1-85b1e694906a\") " Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.801774 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/121927b8-f52a-4b01-89c1-85b1e694906a-config\") pod \"121927b8-f52a-4b01-89c1-85b1e694906a\" (UID: \"121927b8-f52a-4b01-89c1-85b1e694906a\") " Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.802448 
4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jglwr\" (UniqueName: \"kubernetes.io/projected/ccb0a287-d346-47ce-9f23-64d1190b5516-kube-api-access-jglwr\") pod \"barbican-db-sync-g8fzk\" (UID: \"ccb0a287-d346-47ce-9f23-64d1190b5516\") " pod="openstack/barbican-db-sync-g8fzk" Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.802507 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f256e75d-5ff4-4804-bbe6-058ef24fab04-etc-machine-id\") pod \"cinder-db-sync-xgvlf\" (UID: \"f256e75d-5ff4-4804-bbe6-058ef24fab04\") " pod="openstack/cinder-db-sync-xgvlf" Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.802570 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f256e75d-5ff4-4804-bbe6-058ef24fab04-combined-ca-bundle\") pod \"cinder-db-sync-xgvlf\" (UID: \"f256e75d-5ff4-4804-bbe6-058ef24fab04\") " pod="openstack/cinder-db-sync-xgvlf" Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.802601 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q57hk\" (UniqueName: \"kubernetes.io/projected/f256e75d-5ff4-4804-bbe6-058ef24fab04-kube-api-access-q57hk\") pod \"cinder-db-sync-xgvlf\" (UID: \"f256e75d-5ff4-4804-bbe6-058ef24fab04\") " pod="openstack/cinder-db-sync-xgvlf" Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.802624 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f256e75d-5ff4-4804-bbe6-058ef24fab04-scripts\") pod \"cinder-db-sync-xgvlf\" (UID: \"f256e75d-5ff4-4804-bbe6-058ef24fab04\") " pod="openstack/cinder-db-sync-xgvlf" Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.802651 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ccb0a287-d346-47ce-9f23-64d1190b5516-db-sync-config-data\") pod \"barbican-db-sync-g8fzk\" (UID: \"ccb0a287-d346-47ce-9f23-64d1190b5516\") " pod="openstack/barbican-db-sync-g8fzk" Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.802675 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccb0a287-d346-47ce-9f23-64d1190b5516-combined-ca-bundle\") pod \"barbican-db-sync-g8fzk\" (UID: \"ccb0a287-d346-47ce-9f23-64d1190b5516\") " pod="openstack/barbican-db-sync-g8fzk" Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.802734 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f256e75d-5ff4-4804-bbe6-058ef24fab04-config-data\") pod \"cinder-db-sync-xgvlf\" (UID: \"f256e75d-5ff4-4804-bbe6-058ef24fab04\") " pod="openstack/cinder-db-sync-xgvlf" Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.802808 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f256e75d-5ff4-4804-bbe6-058ef24fab04-db-sync-config-data\") pod \"cinder-db-sync-xgvlf\" (UID: \"f256e75d-5ff4-4804-bbe6-058ef24fab04\") " pod="openstack/cinder-db-sync-xgvlf" Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 
03:09:55.811514 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/121927b8-f52a-4b01-89c1-85b1e694906a-kube-api-access-r658v" (OuterVolumeSpecName: "kube-api-access-r658v") pod "121927b8-f52a-4b01-89c1-85b1e694906a" (UID: "121927b8-f52a-4b01-89c1-85b1e694906a"). InnerVolumeSpecName "kube-api-access-r658v". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.862193 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/121927b8-f52a-4b01-89c1-85b1e694906a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "121927b8-f52a-4b01-89c1-85b1e694906a" (UID: "121927b8-f52a-4b01-89c1-85b1e694906a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.866134 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/121927b8-f52a-4b01-89c1-85b1e694906a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "121927b8-f52a-4b01-89c1-85b1e694906a" (UID: "121927b8-f52a-4b01-89c1-85b1e694906a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.872362 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/121927b8-f52a-4b01-89c1-85b1e694906a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "121927b8-f52a-4b01-89c1-85b1e694906a" (UID: "121927b8-f52a-4b01-89c1-85b1e694906a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.876813 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/121927b8-f52a-4b01-89c1-85b1e694906a-config" (OuterVolumeSpecName: "config") pod "121927b8-f52a-4b01-89c1-85b1e694906a" (UID: "121927b8-f52a-4b01-89c1-85b1e694906a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.904122 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f256e75d-5ff4-4804-bbe6-058ef24fab04-db-sync-config-data\") pod \"cinder-db-sync-xgvlf\" (UID: \"f256e75d-5ff4-4804-bbe6-058ef24fab04\") " pod="openstack/cinder-db-sync-xgvlf" Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.904181 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jglwr\" (UniqueName: \"kubernetes.io/projected/ccb0a287-d346-47ce-9f23-64d1190b5516-kube-api-access-jglwr\") pod \"barbican-db-sync-g8fzk\" (UID: \"ccb0a287-d346-47ce-9f23-64d1190b5516\") " pod="openstack/barbican-db-sync-g8fzk" Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.904210 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f256e75d-5ff4-4804-bbe6-058ef24fab04-etc-machine-id\") pod \"cinder-db-sync-xgvlf\" (UID: \"f256e75d-5ff4-4804-bbe6-058ef24fab04\") " pod="openstack/cinder-db-sync-xgvlf" Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.904259 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f256e75d-5ff4-4804-bbe6-058ef24fab04-combined-ca-bundle\") pod \"cinder-db-sync-xgvlf\" (UID: \"f256e75d-5ff4-4804-bbe6-058ef24fab04\") " pod="openstack/cinder-db-sync-xgvlf" Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.904280 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q57hk\" (UniqueName: \"kubernetes.io/projected/f256e75d-5ff4-4804-bbe6-058ef24fab04-kube-api-access-q57hk\") pod \"cinder-db-sync-xgvlf\" (UID: \"f256e75d-5ff4-4804-bbe6-058ef24fab04\") " pod="openstack/cinder-db-sync-xgvlf" Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.904300 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f256e75d-5ff4-4804-bbe6-058ef24fab04-scripts\") pod \"cinder-db-sync-xgvlf\" (UID: \"f256e75d-5ff4-4804-bbe6-058ef24fab04\") " pod="openstack/cinder-db-sync-xgvlf" Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.904318 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ccb0a287-d346-47ce-9f23-64d1190b5516-db-sync-config-data\") pod \"barbican-db-sync-g8fzk\" (UID: \"ccb0a287-d346-47ce-9f23-64d1190b5516\") " pod="openstack/barbican-db-sync-g8fzk" Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.904336 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccb0a287-d346-47ce-9f23-64d1190b5516-combined-ca-bundle\") pod \"barbican-db-sync-g8fzk\" (UID: \"ccb0a287-d346-47ce-9f23-64d1190b5516\") " pod="openstack/barbican-db-sync-g8fzk" Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.904362 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f256e75d-5ff4-4804-bbe6-058ef24fab04-config-data\") pod \"cinder-db-sync-xgvlf\" (UID: \"f256e75d-5ff4-4804-bbe6-058ef24fab04\") " pod="openstack/cinder-db-sync-xgvlf" Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.904425 4922 reconciler_common.go:293] "Volume 
detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/121927b8-f52a-4b01-89c1-85b1e694906a-config\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.904435 4922 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/121927b8-f52a-4b01-89c1-85b1e694906a-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.904444 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r658v\" (UniqueName: \"kubernetes.io/projected/121927b8-f52a-4b01-89c1-85b1e694906a-kube-api-access-r658v\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.904455 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/121927b8-f52a-4b01-89c1-85b1e694906a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.904463 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/121927b8-f52a-4b01-89c1-85b1e694906a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.905263 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f256e75d-5ff4-4804-bbe6-058ef24fab04-etc-machine-id\") pod \"cinder-db-sync-xgvlf\" (UID: \"f256e75d-5ff4-4804-bbe6-058ef24fab04\") " pod="openstack/cinder-db-sync-xgvlf" Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.913776 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ccb0a287-d346-47ce-9f23-64d1190b5516-db-sync-config-data\") pod \"barbican-db-sync-g8fzk\" (UID: \"ccb0a287-d346-47ce-9f23-64d1190b5516\") " pod="openstack/barbican-db-sync-g8fzk" Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.913816 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f256e75d-5ff4-4804-bbe6-058ef24fab04-combined-ca-bundle\") pod \"cinder-db-sync-xgvlf\" (UID: \"f256e75d-5ff4-4804-bbe6-058ef24fab04\") " pod="openstack/cinder-db-sync-xgvlf" Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.913891 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccb0a287-d346-47ce-9f23-64d1190b5516-combined-ca-bundle\") pod \"barbican-db-sync-g8fzk\" (UID: \"ccb0a287-d346-47ce-9f23-64d1190b5516\") " pod="openstack/barbican-db-sync-g8fzk" Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.914139 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f256e75d-5ff4-4804-bbe6-058ef24fab04-db-sync-config-data\") pod \"cinder-db-sync-xgvlf\" (UID: \"f256e75d-5ff4-4804-bbe6-058ef24fab04\") " pod="openstack/cinder-db-sync-xgvlf" Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.916888 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f256e75d-5ff4-4804-bbe6-058ef24fab04-scripts\") pod \"cinder-db-sync-xgvlf\" (UID: \"f256e75d-5ff4-4804-bbe6-058ef24fab04\") " pod="openstack/cinder-db-sync-xgvlf" Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.920068 4922 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f256e75d-5ff4-4804-bbe6-058ef24fab04-config-data\") pod \"cinder-db-sync-xgvlf\" (UID: \"f256e75d-5ff4-4804-bbe6-058ef24fab04\") " pod="openstack/cinder-db-sync-xgvlf" Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.932017 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q57hk\" (UniqueName: \"kubernetes.io/projected/f256e75d-5ff4-4804-bbe6-058ef24fab04-kube-api-access-q57hk\") pod \"cinder-db-sync-xgvlf\" (UID: \"f256e75d-5ff4-4804-bbe6-058ef24fab04\") " pod="openstack/cinder-db-sync-xgvlf" Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.937040 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jglwr\" (UniqueName: \"kubernetes.io/projected/ccb0a287-d346-47ce-9f23-64d1190b5516-kube-api-access-jglwr\") pod \"barbican-db-sync-g8fzk\" (UID: \"ccb0a287-d346-47ce-9f23-64d1190b5516\") " pod="openstack/barbican-db-sync-g8fzk" Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.947694 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-n6frp"] Nov 22 03:09:55 crc kubenswrapper[4922]: E1122 03:09:55.948304 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="121927b8-f52a-4b01-89c1-85b1e694906a" containerName="init" Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.948331 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="121927b8-f52a-4b01-89c1-85b1e694906a" containerName="init" Nov 22 03:09:55 crc kubenswrapper[4922]: E1122 03:09:55.948350 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="121927b8-f52a-4b01-89c1-85b1e694906a" containerName="dnsmasq-dns" Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.948360 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="121927b8-f52a-4b01-89c1-85b1e694906a" containerName="dnsmasq-dns" Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.948618 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="121927b8-f52a-4b01-89c1-85b1e694906a" containerName="dnsmasq-dns" Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.949262 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-n6frp" Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.951687 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.951797 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.952231 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-nztrr" Nov 22 03:09:55 crc kubenswrapper[4922]: I1122 03:09:55.963612 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-n6frp"] Nov 22 03:09:56 crc kubenswrapper[4922]: I1122 03:09:56.005365 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdzpr\" (UniqueName: \"kubernetes.io/projected/06c56a5a-f992-43bb-bcd0-15f23d824242-kube-api-access-pdzpr\") pod \"neutron-db-sync-n6frp\" (UID: \"06c56a5a-f992-43bb-bcd0-15f23d824242\") " pod="openstack/neutron-db-sync-n6frp" Nov 22 03:09:56 crc kubenswrapper[4922]: I1122 03:09:56.005464 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06c56a5a-f992-43bb-bcd0-15f23d824242-combined-ca-bundle\") pod \"neutron-db-sync-n6frp\" (UID: \"06c56a5a-f992-43bb-bcd0-15f23d824242\") " pod="openstack/neutron-db-sync-n6frp" Nov 22 03:09:56 crc kubenswrapper[4922]: I1122 03:09:56.005490 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/06c56a5a-f992-43bb-bcd0-15f23d824242-config\") pod \"neutron-db-sync-n6frp\" (UID: \"06c56a5a-f992-43bb-bcd0-15f23d824242\") " pod="openstack/neutron-db-sync-n6frp" Nov 22 03:09:56 crc kubenswrapper[4922]: I1122 03:09:56.069014 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-xgvlf" Nov 22 03:09:56 crc kubenswrapper[4922]: I1122 03:09:56.086626 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-g8fzk" Nov 22 03:09:56 crc kubenswrapper[4922]: I1122 03:09:56.107116 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdzpr\" (UniqueName: \"kubernetes.io/projected/06c56a5a-f992-43bb-bcd0-15f23d824242-kube-api-access-pdzpr\") pod \"neutron-db-sync-n6frp\" (UID: \"06c56a5a-f992-43bb-bcd0-15f23d824242\") " pod="openstack/neutron-db-sync-n6frp" Nov 22 03:09:56 crc kubenswrapper[4922]: I1122 03:09:56.107207 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06c56a5a-f992-43bb-bcd0-15f23d824242-combined-ca-bundle\") pod \"neutron-db-sync-n6frp\" (UID: \"06c56a5a-f992-43bb-bcd0-15f23d824242\") " pod="openstack/neutron-db-sync-n6frp" Nov 22 03:09:56 crc kubenswrapper[4922]: I1122 03:09:56.107239 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/06c56a5a-f992-43bb-bcd0-15f23d824242-config\") pod \"neutron-db-sync-n6frp\" (UID: \"06c56a5a-f992-43bb-bcd0-15f23d824242\") " pod="openstack/neutron-db-sync-n6frp" Nov 22 03:09:56 crc kubenswrapper[4922]: I1122 03:09:56.111625 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06c56a5a-f992-43bb-bcd0-15f23d824242-combined-ca-bundle\") pod \"neutron-db-sync-n6frp\" (UID: \"06c56a5a-f992-43bb-bcd0-15f23d824242\") " pod="openstack/neutron-db-sync-n6frp" Nov 22 03:09:56 crc kubenswrapper[4922]: I1122 03:09:56.116785 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/06c56a5a-f992-43bb-bcd0-15f23d824242-config\") pod \"neutron-db-sync-n6frp\" (UID: \"06c56a5a-f992-43bb-bcd0-15f23d824242\") " pod="openstack/neutron-db-sync-n6frp" Nov 22 03:09:56 crc kubenswrapper[4922]: I1122 03:09:56.134813 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdzpr\" (UniqueName: \"kubernetes.io/projected/06c56a5a-f992-43bb-bcd0-15f23d824242-kube-api-access-pdzpr\") pod \"neutron-db-sync-n6frp\" (UID: \"06c56a5a-f992-43bb-bcd0-15f23d824242\") " pod="openstack/neutron-db-sync-n6frp" Nov 22 03:09:56 crc kubenswrapper[4922]: I1122 03:09:56.232920 4922 generic.go:334] "Generic (PLEG): container finished" podID="121927b8-f52a-4b01-89c1-85b1e694906a" containerID="38ae327a6653dd7e3f571e10328cae3893315b5aa08c4541c1d98eca93ff60d3" exitCode=0 Nov 22 03:09:56 crc kubenswrapper[4922]: I1122 03:09:56.232958 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-l6qxq" Nov 22 03:09:56 crc kubenswrapper[4922]: I1122 03:09:56.232997 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-l6qxq" event={"ID":"121927b8-f52a-4b01-89c1-85b1e694906a","Type":"ContainerDied","Data":"38ae327a6653dd7e3f571e10328cae3893315b5aa08c4541c1d98eca93ff60d3"} Nov 22 03:09:56 crc kubenswrapper[4922]: I1122 03:09:56.233334 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-l6qxq" event={"ID":"121927b8-f52a-4b01-89c1-85b1e694906a","Type":"ContainerDied","Data":"f6b79a5b818e825ea321f33f4d4273f9a88cdaf6cf47902035654aa7e7570d23"} Nov 22 03:09:56 crc kubenswrapper[4922]: I1122 03:09:56.233354 4922 scope.go:117] "RemoveContainer" containerID="38ae327a6653dd7e3f571e10328cae3893315b5aa08c4541c1d98eca93ff60d3" Nov 22 03:09:56 crc kubenswrapper[4922]: I1122 03:09:56.266028 4922 scope.go:117] "RemoveContainer" containerID="00e35e396a23745e5d2859db0f3fcc59a9c900a40c0c46edb4847e18519c29c5" Nov 22 03:09:56 crc kubenswrapper[4922]: I1122 03:09:56.266729 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-n6frp" Nov 22 03:09:56 crc kubenswrapper[4922]: I1122 03:09:56.268128 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-l6qxq"] Nov 22 03:09:56 crc kubenswrapper[4922]: I1122 03:09:56.278458 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-l6qxq"] Nov 22 03:09:56 crc kubenswrapper[4922]: I1122 03:09:56.316987 4922 scope.go:117] "RemoveContainer" containerID="38ae327a6653dd7e3f571e10328cae3893315b5aa08c4541c1d98eca93ff60d3" Nov 22 03:09:56 crc kubenswrapper[4922]: E1122 03:09:56.319614 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38ae327a6653dd7e3f571e10328cae3893315b5aa08c4541c1d98eca93ff60d3\": container with ID starting with 38ae327a6653dd7e3f571e10328cae3893315b5aa08c4541c1d98eca93ff60d3 not found: ID does not exist" containerID="38ae327a6653dd7e3f571e10328cae3893315b5aa08c4541c1d98eca93ff60d3" Nov 22 03:09:56 crc kubenswrapper[4922]: I1122 03:09:56.319650 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38ae327a6653dd7e3f571e10328cae3893315b5aa08c4541c1d98eca93ff60d3"} err="failed to get container status \"38ae327a6653dd7e3f571e10328cae3893315b5aa08c4541c1d98eca93ff60d3\": rpc error: code = NotFound desc = could not find container \"38ae327a6653dd7e3f571e10328cae3893315b5aa08c4541c1d98eca93ff60d3\": container with ID starting with 38ae327a6653dd7e3f571e10328cae3893315b5aa08c4541c1d98eca93ff60d3 not found: ID does not exist" Nov 22 03:09:56 crc kubenswrapper[4922]: I1122 03:09:56.319670 4922 scope.go:117] "RemoveContainer" containerID="00e35e396a23745e5d2859db0f3fcc59a9c900a40c0c46edb4847e18519c29c5" Nov 22 03:09:56 crc kubenswrapper[4922]: E1122 03:09:56.321118 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00e35e396a23745e5d2859db0f3fcc59a9c900a40c0c46edb4847e18519c29c5\": container with ID starting with 00e35e396a23745e5d2859db0f3fcc59a9c900a40c0c46edb4847e18519c29c5 not found: ID does not exist" containerID="00e35e396a23745e5d2859db0f3fcc59a9c900a40c0c46edb4847e18519c29c5" Nov 22 03:09:56 crc kubenswrapper[4922]: I1122 03:09:56.321142 4922 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"00e35e396a23745e5d2859db0f3fcc59a9c900a40c0c46edb4847e18519c29c5"} err="failed to get container status \"00e35e396a23745e5d2859db0f3fcc59a9c900a40c0c46edb4847e18519c29c5\": rpc error: code = NotFound desc = could not find container \"00e35e396a23745e5d2859db0f3fcc59a9c900a40c0c46edb4847e18519c29c5\": container with ID starting with 00e35e396a23745e5d2859db0f3fcc59a9c900a40c0c46edb4847e18519c29c5 not found: ID does not exist" Nov 22 03:09:57 crc kubenswrapper[4922]: I1122 03:09:56.535305 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-xgvlf"] Nov 22 03:09:57 crc kubenswrapper[4922]: I1122 03:09:56.595386 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-g8fzk"] Nov 22 03:09:57 crc kubenswrapper[4922]: I1122 03:09:57.249164 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-xgvlf" event={"ID":"f256e75d-5ff4-4804-bbe6-058ef24fab04","Type":"ContainerStarted","Data":"3caf19fbb4e921f7dd1e71cbf07c058f7115311e7cbb950748606bb489e773be"} Nov 22 03:09:57 crc kubenswrapper[4922]: I1122 03:09:57.251985 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-g8fzk" event={"ID":"ccb0a287-d346-47ce-9f23-64d1190b5516","Type":"ContainerStarted","Data":"9ba55d02f29ff85d8430e49e82d36cc5beac3792ca8c36f3fdf1ee421754df64"} Nov 22 03:09:57 crc kubenswrapper[4922]: I1122 03:09:57.311655 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="121927b8-f52a-4b01-89c1-85b1e694906a" path="/var/lib/kubelet/pods/121927b8-f52a-4b01-89c1-85b1e694906a/volumes" Nov 22 03:09:57 crc kubenswrapper[4922]: I1122 03:09:57.418352 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-n6frp"] Nov 22 03:09:58 crc kubenswrapper[4922]: I1122 03:09:58.262425 4922 generic.go:334] "Generic (PLEG): container finished" podID="61948bb8-1797-44d7-946a-906a010895b6" containerID="093fccbff9f62bce9938b23840a13dbd1808acd49654b8622bf6ceaf157a04f5" exitCode=0 Nov 22 03:09:58 crc kubenswrapper[4922]: I1122 03:09:58.262491 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jg47c" event={"ID":"61948bb8-1797-44d7-946a-906a010895b6","Type":"ContainerDied","Data":"093fccbff9f62bce9938b23840a13dbd1808acd49654b8622bf6ceaf157a04f5"} Nov 22 03:09:58 crc kubenswrapper[4922]: W1122 03:09:58.393442 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06c56a5a_f992_43bb_bcd0_15f23d824242.slice/crio-1c0f1eb3efd10daf68cf9542ad9b19f15cf40a7fddaa8f1be85c511768c46140 WatchSource:0}: Error finding container 1c0f1eb3efd10daf68cf9542ad9b19f15cf40a7fddaa8f1be85c511768c46140: Status 404 returned error can't find the container with id 1c0f1eb3efd10daf68cf9542ad9b19f15cf40a7fddaa8f1be85c511768c46140 Nov 22 03:09:59 crc kubenswrapper[4922]: I1122 03:09:59.271927 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-n6frp" event={"ID":"06c56a5a-f992-43bb-bcd0-15f23d824242","Type":"ContainerStarted","Data":"1c0f1eb3efd10daf68cf9542ad9b19f15cf40a7fddaa8f1be85c511768c46140"} Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.052984 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-whfxs" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.068972 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-jg47c" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.194969 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnnlp\" (UniqueName: \"kubernetes.io/projected/aeeab218-54fe-4892-b3f8-60b166ad72e2-kube-api-access-qnnlp\") pod \"aeeab218-54fe-4892-b3f8-60b166ad72e2\" (UID: \"aeeab218-54fe-4892-b3f8-60b166ad72e2\") " Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.195018 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/61948bb8-1797-44d7-946a-906a010895b6-credential-keys\") pod \"61948bb8-1797-44d7-946a-906a010895b6\" (UID: \"61948bb8-1797-44d7-946a-906a010895b6\") " Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.195062 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeeab218-54fe-4892-b3f8-60b166ad72e2-combined-ca-bundle\") pod \"aeeab218-54fe-4892-b3f8-60b166ad72e2\" (UID: \"aeeab218-54fe-4892-b3f8-60b166ad72e2\") " Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.195111 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/61948bb8-1797-44d7-946a-906a010895b6-fernet-keys\") pod \"61948bb8-1797-44d7-946a-906a010895b6\" (UID: \"61948bb8-1797-44d7-946a-906a010895b6\") " Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.195221 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aeeab218-54fe-4892-b3f8-60b166ad72e2-logs\") pod \"aeeab218-54fe-4892-b3f8-60b166ad72e2\" (UID: \"aeeab218-54fe-4892-b3f8-60b166ad72e2\") " Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.195265 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61948bb8-1797-44d7-946a-906a010895b6-config-data\") pod \"61948bb8-1797-44d7-946a-906a010895b6\" (UID: \"61948bb8-1797-44d7-946a-906a010895b6\") " Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.195296 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeeab218-54fe-4892-b3f8-60b166ad72e2-config-data\") pod \"aeeab218-54fe-4892-b3f8-60b166ad72e2\" (UID: \"aeeab218-54fe-4892-b3f8-60b166ad72e2\") " Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.195356 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmskl\" (UniqueName: \"kubernetes.io/projected/61948bb8-1797-44d7-946a-906a010895b6-kube-api-access-cmskl\") pod \"61948bb8-1797-44d7-946a-906a010895b6\" (UID: \"61948bb8-1797-44d7-946a-906a010895b6\") " Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.195376 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61948bb8-1797-44d7-946a-906a010895b6-combined-ca-bundle\") pod \"61948bb8-1797-44d7-946a-906a010895b6\" (UID: \"61948bb8-1797-44d7-946a-906a010895b6\") " Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.195401 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aeeab218-54fe-4892-b3f8-60b166ad72e2-scripts\") pod \"aeeab218-54fe-4892-b3f8-60b166ad72e2\" (UID: 
\"aeeab218-54fe-4892-b3f8-60b166ad72e2\") " Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.195423 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61948bb8-1797-44d7-946a-906a010895b6-scripts\") pod \"61948bb8-1797-44d7-946a-906a010895b6\" (UID: \"61948bb8-1797-44d7-946a-906a010895b6\") " Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.196561 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aeeab218-54fe-4892-b3f8-60b166ad72e2-logs" (OuterVolumeSpecName: "logs") pod "aeeab218-54fe-4892-b3f8-60b166ad72e2" (UID: "aeeab218-54fe-4892-b3f8-60b166ad72e2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.200031 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61948bb8-1797-44d7-946a-906a010895b6-scripts" (OuterVolumeSpecName: "scripts") pod "61948bb8-1797-44d7-946a-906a010895b6" (UID: "61948bb8-1797-44d7-946a-906a010895b6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.201341 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61948bb8-1797-44d7-946a-906a010895b6-kube-api-access-cmskl" (OuterVolumeSpecName: "kube-api-access-cmskl") pod "61948bb8-1797-44d7-946a-906a010895b6" (UID: "61948bb8-1797-44d7-946a-906a010895b6"). InnerVolumeSpecName "kube-api-access-cmskl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.203868 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61948bb8-1797-44d7-946a-906a010895b6-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "61948bb8-1797-44d7-946a-906a010895b6" (UID: "61948bb8-1797-44d7-946a-906a010895b6"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.204395 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61948bb8-1797-44d7-946a-906a010895b6-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "61948bb8-1797-44d7-946a-906a010895b6" (UID: "61948bb8-1797-44d7-946a-906a010895b6"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.206781 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeeab218-54fe-4892-b3f8-60b166ad72e2-scripts" (OuterVolumeSpecName: "scripts") pod "aeeab218-54fe-4892-b3f8-60b166ad72e2" (UID: "aeeab218-54fe-4892-b3f8-60b166ad72e2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.211088 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aeeab218-54fe-4892-b3f8-60b166ad72e2-kube-api-access-qnnlp" (OuterVolumeSpecName: "kube-api-access-qnnlp") pod "aeeab218-54fe-4892-b3f8-60b166ad72e2" (UID: "aeeab218-54fe-4892-b3f8-60b166ad72e2"). InnerVolumeSpecName "kube-api-access-qnnlp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.228298 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61948bb8-1797-44d7-946a-906a010895b6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "61948bb8-1797-44d7-946a-906a010895b6" (UID: "61948bb8-1797-44d7-946a-906a010895b6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.235401 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeeab218-54fe-4892-b3f8-60b166ad72e2-config-data" (OuterVolumeSpecName: "config-data") pod "aeeab218-54fe-4892-b3f8-60b166ad72e2" (UID: "aeeab218-54fe-4892-b3f8-60b166ad72e2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.239047 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61948bb8-1797-44d7-946a-906a010895b6-config-data" (OuterVolumeSpecName: "config-data") pod "61948bb8-1797-44d7-946a-906a010895b6" (UID: "61948bb8-1797-44d7-946a-906a010895b6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.240735 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeeab218-54fe-4892-b3f8-60b166ad72e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aeeab218-54fe-4892-b3f8-60b166ad72e2" (UID: "aeeab218-54fe-4892-b3f8-60b166ad72e2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.282464 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jg47c" event={"ID":"61948bb8-1797-44d7-946a-906a010895b6","Type":"ContainerDied","Data":"9c59e490386d202212bfec22ec8fb1f1defcf7d562820532d8b248896fff4153"} Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.283217 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c59e490386d202212bfec22ec8fb1f1defcf7d562820532d8b248896fff4153" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.283191 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-jg47c" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.288129 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-whfxs" event={"ID":"aeeab218-54fe-4892-b3f8-60b166ad72e2","Type":"ContainerDied","Data":"a6965cc16c80b4bea051c177cfd66dbd2ebe5a52e45092dd58794f1596534ec8"} Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.288209 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6965cc16c80b4bea051c177cfd66dbd2ebe5a52e45092dd58794f1596534ec8" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.288209 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-whfxs" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.290067 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-n6frp" event={"ID":"06c56a5a-f992-43bb-bcd0-15f23d824242","Type":"ContainerStarted","Data":"0503a79cf0c11dd129c952e2c01f8d01603f051e8525895fa57b3995fef8239a"} Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.297565 4922 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/61948bb8-1797-44d7-946a-906a010895b6-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.297599 4922 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aeeab218-54fe-4892-b3f8-60b166ad72e2-logs\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.297611 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61948bb8-1797-44d7-946a-906a010895b6-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.297623 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeeab218-54fe-4892-b3f8-60b166ad72e2-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.297636 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmskl\" (UniqueName: \"kubernetes.io/projected/61948bb8-1797-44d7-946a-906a010895b6-kube-api-access-cmskl\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.297650 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61948bb8-1797-44d7-946a-906a010895b6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.297661 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aeeab218-54fe-4892-b3f8-60b166ad72e2-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.297673 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61948bb8-1797-44d7-946a-906a010895b6-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.297683 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnnlp\" (UniqueName: \"kubernetes.io/projected/aeeab218-54fe-4892-b3f8-60b166ad72e2-kube-api-access-qnnlp\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.297694 4922 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/61948bb8-1797-44d7-946a-906a010895b6-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.297706 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeeab218-54fe-4892-b3f8-60b166ad72e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.384383 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-d6fddd8cd-k2kd6"] Nov 22 03:10:00 crc kubenswrapper[4922]: E1122 03:10:00.384788 4922 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="61948bb8-1797-44d7-946a-906a010895b6" containerName="keystone-bootstrap" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.384806 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="61948bb8-1797-44d7-946a-906a010895b6" containerName="keystone-bootstrap" Nov 22 03:10:00 crc kubenswrapper[4922]: E1122 03:10:00.384862 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeeab218-54fe-4892-b3f8-60b166ad72e2" containerName="placement-db-sync" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.384873 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeeab218-54fe-4892-b3f8-60b166ad72e2" containerName="placement-db-sync" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.385099 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="61948bb8-1797-44d7-946a-906a010895b6" containerName="keystone-bootstrap" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.385129 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="aeeab218-54fe-4892-b3f8-60b166ad72e2" containerName="placement-db-sync" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.385784 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d6fddd8cd-k2kd6" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.387478 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.393507 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.393508 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.393634 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.393888 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-cbjdr" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.394559 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.408460 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-d6fddd8cd-k2kd6"] Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.500650 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/629884e5-288f-4eda-a710-d6935610a2ad-config-data\") pod \"keystone-d6fddd8cd-k2kd6\" (UID: \"629884e5-288f-4eda-a710-d6935610a2ad\") " pod="openstack/keystone-d6fddd8cd-k2kd6" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.500900 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwmfl\" (UniqueName: \"kubernetes.io/projected/629884e5-288f-4eda-a710-d6935610a2ad-kube-api-access-cwmfl\") pod \"keystone-d6fddd8cd-k2kd6\" (UID: \"629884e5-288f-4eda-a710-d6935610a2ad\") " pod="openstack/keystone-d6fddd8cd-k2kd6" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.500945 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/629884e5-288f-4eda-a710-d6935610a2ad-scripts\") pod \"keystone-d6fddd8cd-k2kd6\" (UID: 
\"629884e5-288f-4eda-a710-d6935610a2ad\") " pod="openstack/keystone-d6fddd8cd-k2kd6" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.501134 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/629884e5-288f-4eda-a710-d6935610a2ad-combined-ca-bundle\") pod \"keystone-d6fddd8cd-k2kd6\" (UID: \"629884e5-288f-4eda-a710-d6935610a2ad\") " pod="openstack/keystone-d6fddd8cd-k2kd6" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.501184 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/629884e5-288f-4eda-a710-d6935610a2ad-public-tls-certs\") pod \"keystone-d6fddd8cd-k2kd6\" (UID: \"629884e5-288f-4eda-a710-d6935610a2ad\") " pod="openstack/keystone-d6fddd8cd-k2kd6" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.501246 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/629884e5-288f-4eda-a710-d6935610a2ad-fernet-keys\") pod \"keystone-d6fddd8cd-k2kd6\" (UID: \"629884e5-288f-4eda-a710-d6935610a2ad\") " pod="openstack/keystone-d6fddd8cd-k2kd6" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.501315 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/629884e5-288f-4eda-a710-d6935610a2ad-internal-tls-certs\") pod \"keystone-d6fddd8cd-k2kd6\" (UID: \"629884e5-288f-4eda-a710-d6935610a2ad\") " pod="openstack/keystone-d6fddd8cd-k2kd6" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.501426 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/629884e5-288f-4eda-a710-d6935610a2ad-credential-keys\") pod \"keystone-d6fddd8cd-k2kd6\" (UID: \"629884e5-288f-4eda-a710-d6935610a2ad\") " pod="openstack/keystone-d6fddd8cd-k2kd6" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.603253 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/629884e5-288f-4eda-a710-d6935610a2ad-fernet-keys\") pod \"keystone-d6fddd8cd-k2kd6\" (UID: \"629884e5-288f-4eda-a710-d6935610a2ad\") " pod="openstack/keystone-d6fddd8cd-k2kd6" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.603306 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/629884e5-288f-4eda-a710-d6935610a2ad-internal-tls-certs\") pod \"keystone-d6fddd8cd-k2kd6\" (UID: \"629884e5-288f-4eda-a710-d6935610a2ad\") " pod="openstack/keystone-d6fddd8cd-k2kd6" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.603358 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/629884e5-288f-4eda-a710-d6935610a2ad-credential-keys\") pod \"keystone-d6fddd8cd-k2kd6\" (UID: \"629884e5-288f-4eda-a710-d6935610a2ad\") " pod="openstack/keystone-d6fddd8cd-k2kd6" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.605192 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/629884e5-288f-4eda-a710-d6935610a2ad-config-data\") pod \"keystone-d6fddd8cd-k2kd6\" (UID: \"629884e5-288f-4eda-a710-d6935610a2ad\") " 
pod="openstack/keystone-d6fddd8cd-k2kd6" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.605347 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwmfl\" (UniqueName: \"kubernetes.io/projected/629884e5-288f-4eda-a710-d6935610a2ad-kube-api-access-cwmfl\") pod \"keystone-d6fddd8cd-k2kd6\" (UID: \"629884e5-288f-4eda-a710-d6935610a2ad\") " pod="openstack/keystone-d6fddd8cd-k2kd6" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.605379 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/629884e5-288f-4eda-a710-d6935610a2ad-scripts\") pod \"keystone-d6fddd8cd-k2kd6\" (UID: \"629884e5-288f-4eda-a710-d6935610a2ad\") " pod="openstack/keystone-d6fddd8cd-k2kd6" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.605446 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/629884e5-288f-4eda-a710-d6935610a2ad-combined-ca-bundle\") pod \"keystone-d6fddd8cd-k2kd6\" (UID: \"629884e5-288f-4eda-a710-d6935610a2ad\") " pod="openstack/keystone-d6fddd8cd-k2kd6" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.605470 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/629884e5-288f-4eda-a710-d6935610a2ad-public-tls-certs\") pod \"keystone-d6fddd8cd-k2kd6\" (UID: \"629884e5-288f-4eda-a710-d6935610a2ad\") " pod="openstack/keystone-d6fddd8cd-k2kd6" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.607489 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/629884e5-288f-4eda-a710-d6935610a2ad-credential-keys\") pod \"keystone-d6fddd8cd-k2kd6\" (UID: \"629884e5-288f-4eda-a710-d6935610a2ad\") " pod="openstack/keystone-d6fddd8cd-k2kd6" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.607791 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/629884e5-288f-4eda-a710-d6935610a2ad-internal-tls-certs\") pod \"keystone-d6fddd8cd-k2kd6\" (UID: \"629884e5-288f-4eda-a710-d6935610a2ad\") " pod="openstack/keystone-d6fddd8cd-k2kd6" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.607799 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/629884e5-288f-4eda-a710-d6935610a2ad-fernet-keys\") pod \"keystone-d6fddd8cd-k2kd6\" (UID: \"629884e5-288f-4eda-a710-d6935610a2ad\") " pod="openstack/keystone-d6fddd8cd-k2kd6" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.609343 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/629884e5-288f-4eda-a710-d6935610a2ad-combined-ca-bundle\") pod \"keystone-d6fddd8cd-k2kd6\" (UID: \"629884e5-288f-4eda-a710-d6935610a2ad\") " pod="openstack/keystone-d6fddd8cd-k2kd6" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.609894 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/629884e5-288f-4eda-a710-d6935610a2ad-config-data\") pod \"keystone-d6fddd8cd-k2kd6\" (UID: \"629884e5-288f-4eda-a710-d6935610a2ad\") " pod="openstack/keystone-d6fddd8cd-k2kd6" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.611107 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/629884e5-288f-4eda-a710-d6935610a2ad-scripts\") pod \"keystone-d6fddd8cd-k2kd6\" (UID: \"629884e5-288f-4eda-a710-d6935610a2ad\") " pod="openstack/keystone-d6fddd8cd-k2kd6" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.613022 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/629884e5-288f-4eda-a710-d6935610a2ad-public-tls-certs\") pod \"keystone-d6fddd8cd-k2kd6\" (UID: \"629884e5-288f-4eda-a710-d6935610a2ad\") " pod="openstack/keystone-d6fddd8cd-k2kd6" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.624927 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwmfl\" (UniqueName: \"kubernetes.io/projected/629884e5-288f-4eda-a710-d6935610a2ad-kube-api-access-cwmfl\") pod \"keystone-d6fddd8cd-k2kd6\" (UID: \"629884e5-288f-4eda-a710-d6935610a2ad\") " pod="openstack/keystone-d6fddd8cd-k2kd6" Nov 22 03:10:00 crc kubenswrapper[4922]: I1122 03:10:00.703757 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d6fddd8cd-k2kd6" Nov 22 03:10:01 crc kubenswrapper[4922]: I1122 03:10:01.133023 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-d6fddd8cd-k2kd6"] Nov 22 03:10:01 crc kubenswrapper[4922]: W1122 03:10:01.157193 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod629884e5_288f_4eda_a710_d6935610a2ad.slice/crio-0742678ec5cda0e873c9b668b903bdc7a83e1cc5a87e5c554a7b60d711a77f7f WatchSource:0}: Error finding container 0742678ec5cda0e873c9b668b903bdc7a83e1cc5a87e5c554a7b60d711a77f7f: Status 404 returned error can't find the container with id 0742678ec5cda0e873c9b668b903bdc7a83e1cc5a87e5c554a7b60d711a77f7f Nov 22 03:10:01 crc kubenswrapper[4922]: I1122 03:10:01.247211 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-56c6fc5546-zz2lj"] Nov 22 03:10:01 crc kubenswrapper[4922]: I1122 03:10:01.248637 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-56c6fc5546-zz2lj" Nov 22 03:10:01 crc kubenswrapper[4922]: I1122 03:10:01.253071 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Nov 22 03:10:01 crc kubenswrapper[4922]: I1122 03:10:01.253227 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Nov 22 03:10:01 crc kubenswrapper[4922]: I1122 03:10:01.253397 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Nov 22 03:10:01 crc kubenswrapper[4922]: I1122 03:10:01.253445 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Nov 22 03:10:01 crc kubenswrapper[4922]: I1122 03:10:01.253696 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-m5xtt" Nov 22 03:10:01 crc kubenswrapper[4922]: I1122 03:10:01.254352 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-56c6fc5546-zz2lj"] Nov 22 03:10:01 crc kubenswrapper[4922]: I1122 03:10:01.317804 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d6fddd8cd-k2kd6" event={"ID":"629884e5-288f-4eda-a710-d6935610a2ad","Type":"ContainerStarted","Data":"0742678ec5cda0e873c9b668b903bdc7a83e1cc5a87e5c554a7b60d711a77f7f"} Nov 22 03:10:01 crc kubenswrapper[4922]: I1122 03:10:01.317855 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e99d940a-73cb-41bb-b5a9-7003544b002a","Type":"ContainerStarted","Data":"d3b67e53cdecd8e19f9ced1fd26bd1a5fffae6c39f64a2d219f9e9f80ac0cd4a"} Nov 22 03:10:01 crc kubenswrapper[4922]: I1122 03:10:01.318663 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3db3f190-2b55-424f-bbe9-d52042f900ef-scripts\") pod \"placement-56c6fc5546-zz2lj\" (UID: \"3db3f190-2b55-424f-bbe9-d52042f900ef\") " pod="openstack/placement-56c6fc5546-zz2lj" Nov 22 03:10:01 crc kubenswrapper[4922]: I1122 03:10:01.318687 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pjm4\" (UniqueName: \"kubernetes.io/projected/3db3f190-2b55-424f-bbe9-d52042f900ef-kube-api-access-8pjm4\") pod \"placement-56c6fc5546-zz2lj\" (UID: \"3db3f190-2b55-424f-bbe9-d52042f900ef\") " pod="openstack/placement-56c6fc5546-zz2lj" Nov 22 03:10:01 crc kubenswrapper[4922]: I1122 03:10:01.318710 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3db3f190-2b55-424f-bbe9-d52042f900ef-logs\") pod \"placement-56c6fc5546-zz2lj\" (UID: \"3db3f190-2b55-424f-bbe9-d52042f900ef\") " pod="openstack/placement-56c6fc5546-zz2lj" Nov 22 03:10:01 crc kubenswrapper[4922]: I1122 03:10:01.318760 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3db3f190-2b55-424f-bbe9-d52042f900ef-config-data\") pod \"placement-56c6fc5546-zz2lj\" (UID: \"3db3f190-2b55-424f-bbe9-d52042f900ef\") " pod="openstack/placement-56c6fc5546-zz2lj" Nov 22 03:10:01 crc kubenswrapper[4922]: I1122 03:10:01.318920 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3db3f190-2b55-424f-bbe9-d52042f900ef-combined-ca-bundle\") 
pod \"placement-56c6fc5546-zz2lj\" (UID: \"3db3f190-2b55-424f-bbe9-d52042f900ef\") " pod="openstack/placement-56c6fc5546-zz2lj" Nov 22 03:10:01 crc kubenswrapper[4922]: I1122 03:10:01.318989 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3db3f190-2b55-424f-bbe9-d52042f900ef-internal-tls-certs\") pod \"placement-56c6fc5546-zz2lj\" (UID: \"3db3f190-2b55-424f-bbe9-d52042f900ef\") " pod="openstack/placement-56c6fc5546-zz2lj" Nov 22 03:10:01 crc kubenswrapper[4922]: I1122 03:10:01.319098 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3db3f190-2b55-424f-bbe9-d52042f900ef-public-tls-certs\") pod \"placement-56c6fc5546-zz2lj\" (UID: \"3db3f190-2b55-424f-bbe9-d52042f900ef\") " pod="openstack/placement-56c6fc5546-zz2lj" Nov 22 03:10:01 crc kubenswrapper[4922]: I1122 03:10:01.333331 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-n6frp" podStartSLOduration=6.333297975 podStartE2EDuration="6.333297975s" podCreationTimestamp="2025-11-22 03:09:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:10:01.327356353 +0000 UTC m=+1037.365878245" watchObservedRunningTime="2025-11-22 03:10:01.333297975 +0000 UTC m=+1037.371819867" Nov 22 03:10:01 crc kubenswrapper[4922]: I1122 03:10:01.420166 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3db3f190-2b55-424f-bbe9-d52042f900ef-config-data\") pod \"placement-56c6fc5546-zz2lj\" (UID: \"3db3f190-2b55-424f-bbe9-d52042f900ef\") " pod="openstack/placement-56c6fc5546-zz2lj" Nov 22 03:10:01 crc kubenswrapper[4922]: I1122 03:10:01.420574 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3db3f190-2b55-424f-bbe9-d52042f900ef-combined-ca-bundle\") pod \"placement-56c6fc5546-zz2lj\" (UID: \"3db3f190-2b55-424f-bbe9-d52042f900ef\") " pod="openstack/placement-56c6fc5546-zz2lj" Nov 22 03:10:01 crc kubenswrapper[4922]: I1122 03:10:01.420605 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3db3f190-2b55-424f-bbe9-d52042f900ef-internal-tls-certs\") pod \"placement-56c6fc5546-zz2lj\" (UID: \"3db3f190-2b55-424f-bbe9-d52042f900ef\") " pod="openstack/placement-56c6fc5546-zz2lj" Nov 22 03:10:01 crc kubenswrapper[4922]: I1122 03:10:01.420654 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3db3f190-2b55-424f-bbe9-d52042f900ef-public-tls-certs\") pod \"placement-56c6fc5546-zz2lj\" (UID: \"3db3f190-2b55-424f-bbe9-d52042f900ef\") " pod="openstack/placement-56c6fc5546-zz2lj" Nov 22 03:10:01 crc kubenswrapper[4922]: I1122 03:10:01.420707 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3db3f190-2b55-424f-bbe9-d52042f900ef-scripts\") pod \"placement-56c6fc5546-zz2lj\" (UID: \"3db3f190-2b55-424f-bbe9-d52042f900ef\") " pod="openstack/placement-56c6fc5546-zz2lj" Nov 22 03:10:01 crc kubenswrapper[4922]: I1122 03:10:01.420729 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-8pjm4\" (UniqueName: \"kubernetes.io/projected/3db3f190-2b55-424f-bbe9-d52042f900ef-kube-api-access-8pjm4\") pod \"placement-56c6fc5546-zz2lj\" (UID: \"3db3f190-2b55-424f-bbe9-d52042f900ef\") " pod="openstack/placement-56c6fc5546-zz2lj" Nov 22 03:10:01 crc kubenswrapper[4922]: I1122 03:10:01.420754 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3db3f190-2b55-424f-bbe9-d52042f900ef-logs\") pod \"placement-56c6fc5546-zz2lj\" (UID: \"3db3f190-2b55-424f-bbe9-d52042f900ef\") " pod="openstack/placement-56c6fc5546-zz2lj" Nov 22 03:10:01 crc kubenswrapper[4922]: I1122 03:10:01.421154 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3db3f190-2b55-424f-bbe9-d52042f900ef-logs\") pod \"placement-56c6fc5546-zz2lj\" (UID: \"3db3f190-2b55-424f-bbe9-d52042f900ef\") " pod="openstack/placement-56c6fc5546-zz2lj" Nov 22 03:10:01 crc kubenswrapper[4922]: I1122 03:10:01.424943 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3db3f190-2b55-424f-bbe9-d52042f900ef-config-data\") pod \"placement-56c6fc5546-zz2lj\" (UID: \"3db3f190-2b55-424f-bbe9-d52042f900ef\") " pod="openstack/placement-56c6fc5546-zz2lj" Nov 22 03:10:01 crc kubenswrapper[4922]: I1122 03:10:01.425050 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3db3f190-2b55-424f-bbe9-d52042f900ef-combined-ca-bundle\") pod \"placement-56c6fc5546-zz2lj\" (UID: \"3db3f190-2b55-424f-bbe9-d52042f900ef\") " pod="openstack/placement-56c6fc5546-zz2lj" Nov 22 03:10:01 crc kubenswrapper[4922]: I1122 03:10:01.425811 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3db3f190-2b55-424f-bbe9-d52042f900ef-internal-tls-certs\") pod \"placement-56c6fc5546-zz2lj\" (UID: \"3db3f190-2b55-424f-bbe9-d52042f900ef\") " pod="openstack/placement-56c6fc5546-zz2lj" Nov 22 03:10:01 crc kubenswrapper[4922]: I1122 03:10:01.426183 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3db3f190-2b55-424f-bbe9-d52042f900ef-public-tls-certs\") pod \"placement-56c6fc5546-zz2lj\" (UID: \"3db3f190-2b55-424f-bbe9-d52042f900ef\") " pod="openstack/placement-56c6fc5546-zz2lj" Nov 22 03:10:01 crc kubenswrapper[4922]: I1122 03:10:01.426370 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3db3f190-2b55-424f-bbe9-d52042f900ef-scripts\") pod \"placement-56c6fc5546-zz2lj\" (UID: \"3db3f190-2b55-424f-bbe9-d52042f900ef\") " pod="openstack/placement-56c6fc5546-zz2lj" Nov 22 03:10:01 crc kubenswrapper[4922]: I1122 03:10:01.438952 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pjm4\" (UniqueName: \"kubernetes.io/projected/3db3f190-2b55-424f-bbe9-d52042f900ef-kube-api-access-8pjm4\") pod \"placement-56c6fc5546-zz2lj\" (UID: \"3db3f190-2b55-424f-bbe9-d52042f900ef\") " pod="openstack/placement-56c6fc5546-zz2lj" Nov 22 03:10:01 crc kubenswrapper[4922]: I1122 03:10:01.596939 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-56c6fc5546-zz2lj" Nov 22 03:10:02 crc kubenswrapper[4922]: I1122 03:10:02.045267 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-56c6fc5546-zz2lj"] Nov 22 03:10:02 crc kubenswrapper[4922]: W1122 03:10:02.054993 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3db3f190_2b55_424f_bbe9_d52042f900ef.slice/crio-6956bd34fc8ee0e346b614e89760dac6955206ba2cfc182038ca1092c5616c96 WatchSource:0}: Error finding container 6956bd34fc8ee0e346b614e89760dac6955206ba2cfc182038ca1092c5616c96: Status 404 returned error can't find the container with id 6956bd34fc8ee0e346b614e89760dac6955206ba2cfc182038ca1092c5616c96 Nov 22 03:10:02 crc kubenswrapper[4922]: I1122 03:10:02.325586 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d6fddd8cd-k2kd6" event={"ID":"629884e5-288f-4eda-a710-d6935610a2ad","Type":"ContainerStarted","Data":"cc90bdd2c8edb489f86363e916d8b833a8ab48010e7724c08fac6d2551814967"} Nov 22 03:10:02 crc kubenswrapper[4922]: I1122 03:10:02.325754 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-d6fddd8cd-k2kd6" Nov 22 03:10:02 crc kubenswrapper[4922]: I1122 03:10:02.327154 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-56c6fc5546-zz2lj" event={"ID":"3db3f190-2b55-424f-bbe9-d52042f900ef","Type":"ContainerStarted","Data":"6956bd34fc8ee0e346b614e89760dac6955206ba2cfc182038ca1092c5616c96"} Nov 22 03:10:02 crc kubenswrapper[4922]: I1122 03:10:02.354683 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-d6fddd8cd-k2kd6" podStartSLOduration=2.354663119 podStartE2EDuration="2.354663119s" podCreationTimestamp="2025-11-22 03:10:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:10:02.348343837 +0000 UTC m=+1038.386865739" watchObservedRunningTime="2025-11-22 03:10:02.354663119 +0000 UTC m=+1038.393185011" Nov 22 03:10:26 crc kubenswrapper[4922]: E1122 03:10:26.528270 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/ubi9/httpd-24:latest" Nov 22 03:10:26 crc kubenswrapper[4922]: E1122 03:10:26.528962 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:proxy-httpd,Image:registry.redhat.io/ubi9/httpd-24:latest,Command:[/usr/sbin/httpd],Args:[-DFOREGROUND],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:proxy-httpd,HostPort:0,ContainerPort:3000,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf/httpd.conf,SubPath:httpd.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf.d/ssl.conf,SubPath:ssl.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:run-httpd,ReadOnly:false,MountPath:/run/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:log-httpd,ReadOnly:false,MountPath:/var/log/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4ltr8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(e99d940a-73cb-41bb-b5a9-7003544b002a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 22 03:10:26 crc kubenswrapper[4922]: E1122 03:10:26.530519 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"proxy-httpd\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="e99d940a-73cb-41bb-b5a9-7003544b002a" Nov 22 03:10:26 crc kubenswrapper[4922]: I1122 03:10:26.558420 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e99d940a-73cb-41bb-b5a9-7003544b002a" containerName="ceilometer-central-agent" containerID="cri-o://b1a0b9314347d3079e787b0ae400eaa8b91893dd85f653e98b7d90fb8cc6ac59" gracePeriod=30 Nov 22 03:10:26 crc kubenswrapper[4922]: I1122 03:10:26.559020 4922 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e99d940a-73cb-41bb-b5a9-7003544b002a" containerName="sg-core" containerID="cri-o://d3b67e53cdecd8e19f9ced1fd26bd1a5fffae6c39f64a2d219f9e9f80ac0cd4a" gracePeriod=30 Nov 22 03:10:26 crc kubenswrapper[4922]: I1122 03:10:26.559192 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e99d940a-73cb-41bb-b5a9-7003544b002a" containerName="ceilometer-notification-agent" containerID="cri-o://57a13d045747ad4e088b7844837786eff94bb58defba29150f3e49534cbda42b" gracePeriod=30 Nov 22 03:10:27 crc kubenswrapper[4922]: I1122 03:10:27.571063 4922 generic.go:334] "Generic (PLEG): container finished" podID="e99d940a-73cb-41bb-b5a9-7003544b002a" containerID="d3b67e53cdecd8e19f9ced1fd26bd1a5fffae6c39f64a2d219f9e9f80ac0cd4a" exitCode=2 Nov 22 03:10:27 crc kubenswrapper[4922]: I1122 03:10:27.571115 4922 generic.go:334] "Generic (PLEG): container finished" podID="e99d940a-73cb-41bb-b5a9-7003544b002a" containerID="b1a0b9314347d3079e787b0ae400eaa8b91893dd85f653e98b7d90fb8cc6ac59" exitCode=0 Nov 22 03:10:27 crc kubenswrapper[4922]: I1122 03:10:27.571140 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e99d940a-73cb-41bb-b5a9-7003544b002a","Type":"ContainerDied","Data":"d3b67e53cdecd8e19f9ced1fd26bd1a5fffae6c39f64a2d219f9e9f80ac0cd4a"} Nov 22 03:10:27 crc kubenswrapper[4922]: I1122 03:10:27.571170 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e99d940a-73cb-41bb-b5a9-7003544b002a","Type":"ContainerDied","Data":"b1a0b9314347d3079e787b0ae400eaa8b91893dd85f653e98b7d90fb8cc6ac59"} Nov 22 03:10:27 crc kubenswrapper[4922]: E1122 03:10:27.588235 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Nov 22 03:10:27 crc kubenswrapper[4922]: E1122 03:10:27.588458 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q57hk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-xgvlf_openstack(f256e75d-5ff4-4804-bbe6-058ef24fab04): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 22 03:10:27 crc kubenswrapper[4922]: E1122 03:10:27.589757 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-xgvlf" podUID="f256e75d-5ff4-4804-bbe6-058ef24fab04" Nov 22 03:10:28 crc kubenswrapper[4922]: I1122 03:10:28.585474 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-g8fzk" event={"ID":"ccb0a287-d346-47ce-9f23-64d1190b5516","Type":"ContainerStarted","Data":"743cbaf1ce9cefac103aa8037e0e5f75e0d4e54b1c337c3acd037c4accf1f912"} Nov 22 03:10:28 crc kubenswrapper[4922]: I1122 03:10:28.588456 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-56c6fc5546-zz2lj" event={"ID":"3db3f190-2b55-424f-bbe9-d52042f900ef","Type":"ContainerStarted","Data":"aa0b339d31a473f39f2baaa633e98ce3c7c46f82c208de03e303489f0d2aabb8"} Nov 22 03:10:28 crc kubenswrapper[4922]: I1122 03:10:28.588488 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-56c6fc5546-zz2lj" 
event={"ID":"3db3f190-2b55-424f-bbe9-d52042f900ef","Type":"ContainerStarted","Data":"dee764c10d168a9979f6a303432d5cc3f5a6f9972d6eaa748f29447f6b76be8a"} Nov 22 03:10:28 crc kubenswrapper[4922]: I1122 03:10:28.588621 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-56c6fc5546-zz2lj" Nov 22 03:10:28 crc kubenswrapper[4922]: I1122 03:10:28.588652 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-56c6fc5546-zz2lj" Nov 22 03:10:28 crc kubenswrapper[4922]: E1122 03:10:28.591094 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-xgvlf" podUID="f256e75d-5ff4-4804-bbe6-058ef24fab04" Nov 22 03:10:28 crc kubenswrapper[4922]: I1122 03:10:28.619706 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-g8fzk" podStartSLOduration=2.6446468960000002 podStartE2EDuration="33.619647273s" podCreationTimestamp="2025-11-22 03:09:55 +0000 UTC" firstStartedPulling="2025-11-22 03:09:56.607285869 +0000 UTC m=+1032.645807761" lastFinishedPulling="2025-11-22 03:10:27.582286246 +0000 UTC m=+1063.620808138" observedRunningTime="2025-11-22 03:10:28.61161223 +0000 UTC m=+1064.650134132" watchObservedRunningTime="2025-11-22 03:10:28.619647273 +0000 UTC m=+1064.658169205" Nov 22 03:10:28 crc kubenswrapper[4922]: I1122 03:10:28.683586 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-56c6fc5546-zz2lj" podStartSLOduration=27.683566714 podStartE2EDuration="27.683566714s" podCreationTimestamp="2025-11-22 03:10:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:10:28.680153143 +0000 UTC m=+1064.718675075" watchObservedRunningTime="2025-11-22 03:10:28.683566714 +0000 UTC m=+1064.722088616" Nov 22 03:10:31 crc kubenswrapper[4922]: I1122 03:10:31.457322 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 22 03:10:31 crc kubenswrapper[4922]: I1122 03:10:31.613251 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ltr8\" (UniqueName: \"kubernetes.io/projected/e99d940a-73cb-41bb-b5a9-7003544b002a-kube-api-access-4ltr8\") pod \"e99d940a-73cb-41bb-b5a9-7003544b002a\" (UID: \"e99d940a-73cb-41bb-b5a9-7003544b002a\") " Nov 22 03:10:31 crc kubenswrapper[4922]: I1122 03:10:31.613355 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e99d940a-73cb-41bb-b5a9-7003544b002a-config-data\") pod \"e99d940a-73cb-41bb-b5a9-7003544b002a\" (UID: \"e99d940a-73cb-41bb-b5a9-7003544b002a\") " Nov 22 03:10:31 crc kubenswrapper[4922]: I1122 03:10:31.613396 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e99d940a-73cb-41bb-b5a9-7003544b002a-combined-ca-bundle\") pod \"e99d940a-73cb-41bb-b5a9-7003544b002a\" (UID: \"e99d940a-73cb-41bb-b5a9-7003544b002a\") " Nov 22 03:10:31 crc kubenswrapper[4922]: I1122 03:10:31.613583 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e99d940a-73cb-41bb-b5a9-7003544b002a-sg-core-conf-yaml\") pod \"e99d940a-73cb-41bb-b5a9-7003544b002a\" (UID: \"e99d940a-73cb-41bb-b5a9-7003544b002a\") " Nov 22 03:10:31 crc kubenswrapper[4922]: I1122 03:10:31.613633 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e99d940a-73cb-41bb-b5a9-7003544b002a-scripts\") pod \"e99d940a-73cb-41bb-b5a9-7003544b002a\" (UID: \"e99d940a-73cb-41bb-b5a9-7003544b002a\") " Nov 22 03:10:31 crc kubenswrapper[4922]: I1122 03:10:31.613787 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e99d940a-73cb-41bb-b5a9-7003544b002a-run-httpd\") pod \"e99d940a-73cb-41bb-b5a9-7003544b002a\" (UID: \"e99d940a-73cb-41bb-b5a9-7003544b002a\") " Nov 22 03:10:31 crc kubenswrapper[4922]: I1122 03:10:31.613819 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e99d940a-73cb-41bb-b5a9-7003544b002a-log-httpd\") pod \"e99d940a-73cb-41bb-b5a9-7003544b002a\" (UID: \"e99d940a-73cb-41bb-b5a9-7003544b002a\") " Nov 22 03:10:31 crc kubenswrapper[4922]: I1122 03:10:31.614762 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e99d940a-73cb-41bb-b5a9-7003544b002a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e99d940a-73cb-41bb-b5a9-7003544b002a" (UID: "e99d940a-73cb-41bb-b5a9-7003544b002a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:10:31 crc kubenswrapper[4922]: I1122 03:10:31.614799 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e99d940a-73cb-41bb-b5a9-7003544b002a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e99d940a-73cb-41bb-b5a9-7003544b002a" (UID: "e99d940a-73cb-41bb-b5a9-7003544b002a"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:10:31 crc kubenswrapper[4922]: I1122 03:10:31.632445 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e99d940a-73cb-41bb-b5a9-7003544b002a-scripts" (OuterVolumeSpecName: "scripts") pod "e99d940a-73cb-41bb-b5a9-7003544b002a" (UID: "e99d940a-73cb-41bb-b5a9-7003544b002a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:10:31 crc kubenswrapper[4922]: I1122 03:10:31.644607 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e99d940a-73cb-41bb-b5a9-7003544b002a-kube-api-access-4ltr8" (OuterVolumeSpecName: "kube-api-access-4ltr8") pod "e99d940a-73cb-41bb-b5a9-7003544b002a" (UID: "e99d940a-73cb-41bb-b5a9-7003544b002a"). InnerVolumeSpecName "kube-api-access-4ltr8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:10:31 crc kubenswrapper[4922]: I1122 03:10:31.656367 4922 generic.go:334] "Generic (PLEG): container finished" podID="e99d940a-73cb-41bb-b5a9-7003544b002a" containerID="57a13d045747ad4e088b7844837786eff94bb58defba29150f3e49534cbda42b" exitCode=0 Nov 22 03:10:31 crc kubenswrapper[4922]: I1122 03:10:31.656433 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e99d940a-73cb-41bb-b5a9-7003544b002a","Type":"ContainerDied","Data":"57a13d045747ad4e088b7844837786eff94bb58defba29150f3e49534cbda42b"} Nov 22 03:10:31 crc kubenswrapper[4922]: I1122 03:10:31.656540 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e99d940a-73cb-41bb-b5a9-7003544b002a","Type":"ContainerDied","Data":"f4b55afc81873f7cf24d4621dba6aab220be2ea33006f7a2e9ba4ffbc2fc6d2b"} Nov 22 03:10:31 crc kubenswrapper[4922]: I1122 03:10:31.656570 4922 scope.go:117] "RemoveContainer" containerID="d3b67e53cdecd8e19f9ced1fd26bd1a5fffae6c39f64a2d219f9e9f80ac0cd4a" Nov 22 03:10:31 crc kubenswrapper[4922]: I1122 03:10:31.656744 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 03:10:31 crc kubenswrapper[4922]: I1122 03:10:31.663800 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e99d940a-73cb-41bb-b5a9-7003544b002a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e99d940a-73cb-41bb-b5a9-7003544b002a" (UID: "e99d940a-73cb-41bb-b5a9-7003544b002a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:10:31 crc kubenswrapper[4922]: I1122 03:10:31.673582 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e99d940a-73cb-41bb-b5a9-7003544b002a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e99d940a-73cb-41bb-b5a9-7003544b002a" (UID: "e99d940a-73cb-41bb-b5a9-7003544b002a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:10:31 crc kubenswrapper[4922]: I1122 03:10:31.673832 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e99d940a-73cb-41bb-b5a9-7003544b002a-config-data" (OuterVolumeSpecName: "config-data") pod "e99d940a-73cb-41bb-b5a9-7003544b002a" (UID: "e99d940a-73cb-41bb-b5a9-7003544b002a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:10:31 crc kubenswrapper[4922]: I1122 03:10:31.716814 4922 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e99d940a-73cb-41bb-b5a9-7003544b002a-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:31 crc kubenswrapper[4922]: I1122 03:10:31.716895 4922 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e99d940a-73cb-41bb-b5a9-7003544b002a-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:31 crc kubenswrapper[4922]: I1122 03:10:31.716949 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ltr8\" (UniqueName: \"kubernetes.io/projected/e99d940a-73cb-41bb-b5a9-7003544b002a-kube-api-access-4ltr8\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:31 crc kubenswrapper[4922]: I1122 03:10:31.716974 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e99d940a-73cb-41bb-b5a9-7003544b002a-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:31 crc kubenswrapper[4922]: I1122 03:10:31.716993 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e99d940a-73cb-41bb-b5a9-7003544b002a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:31 crc kubenswrapper[4922]: I1122 03:10:31.717012 4922 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e99d940a-73cb-41bb-b5a9-7003544b002a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:31 crc kubenswrapper[4922]: I1122 03:10:31.717030 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e99d940a-73cb-41bb-b5a9-7003544b002a-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:31 crc kubenswrapper[4922]: I1122 03:10:31.735980 4922 scope.go:117] "RemoveContainer" containerID="57a13d045747ad4e088b7844837786eff94bb58defba29150f3e49534cbda42b" Nov 22 03:10:31 crc kubenswrapper[4922]: I1122 03:10:31.769780 4922 scope.go:117] "RemoveContainer" containerID="b1a0b9314347d3079e787b0ae400eaa8b91893dd85f653e98b7d90fb8cc6ac59" Nov 22 03:10:31 crc kubenswrapper[4922]: I1122 03:10:31.800310 4922 scope.go:117] "RemoveContainer" containerID="d3b67e53cdecd8e19f9ced1fd26bd1a5fffae6c39f64a2d219f9e9f80ac0cd4a" Nov 22 03:10:31 crc kubenswrapper[4922]: E1122 03:10:31.800919 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3b67e53cdecd8e19f9ced1fd26bd1a5fffae6c39f64a2d219f9e9f80ac0cd4a\": container with ID starting with d3b67e53cdecd8e19f9ced1fd26bd1a5fffae6c39f64a2d219f9e9f80ac0cd4a not found: ID does not exist" containerID="d3b67e53cdecd8e19f9ced1fd26bd1a5fffae6c39f64a2d219f9e9f80ac0cd4a" Nov 22 03:10:31 crc kubenswrapper[4922]: I1122 03:10:31.800962 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3b67e53cdecd8e19f9ced1fd26bd1a5fffae6c39f64a2d219f9e9f80ac0cd4a"} err="failed to get container status \"d3b67e53cdecd8e19f9ced1fd26bd1a5fffae6c39f64a2d219f9e9f80ac0cd4a\": rpc error: code = NotFound desc = could not find container \"d3b67e53cdecd8e19f9ced1fd26bd1a5fffae6c39f64a2d219f9e9f80ac0cd4a\": container with ID starting with d3b67e53cdecd8e19f9ced1fd26bd1a5fffae6c39f64a2d219f9e9f80ac0cd4a not found: ID does not exist" Nov 22 03:10:31 crc kubenswrapper[4922]: I1122 
03:10:31.800989 4922 scope.go:117] "RemoveContainer" containerID="57a13d045747ad4e088b7844837786eff94bb58defba29150f3e49534cbda42b" Nov 22 03:10:31 crc kubenswrapper[4922]: E1122 03:10:31.801349 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57a13d045747ad4e088b7844837786eff94bb58defba29150f3e49534cbda42b\": container with ID starting with 57a13d045747ad4e088b7844837786eff94bb58defba29150f3e49534cbda42b not found: ID does not exist" containerID="57a13d045747ad4e088b7844837786eff94bb58defba29150f3e49534cbda42b" Nov 22 03:10:31 crc kubenswrapper[4922]: I1122 03:10:31.801389 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57a13d045747ad4e088b7844837786eff94bb58defba29150f3e49534cbda42b"} err="failed to get container status \"57a13d045747ad4e088b7844837786eff94bb58defba29150f3e49534cbda42b\": rpc error: code = NotFound desc = could not find container \"57a13d045747ad4e088b7844837786eff94bb58defba29150f3e49534cbda42b\": container with ID starting with 57a13d045747ad4e088b7844837786eff94bb58defba29150f3e49534cbda42b not found: ID does not exist" Nov 22 03:10:31 crc kubenswrapper[4922]: I1122 03:10:31.801415 4922 scope.go:117] "RemoveContainer" containerID="b1a0b9314347d3079e787b0ae400eaa8b91893dd85f653e98b7d90fb8cc6ac59" Nov 22 03:10:31 crc kubenswrapper[4922]: E1122 03:10:31.801694 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1a0b9314347d3079e787b0ae400eaa8b91893dd85f653e98b7d90fb8cc6ac59\": container with ID starting with b1a0b9314347d3079e787b0ae400eaa8b91893dd85f653e98b7d90fb8cc6ac59 not found: ID does not exist" containerID="b1a0b9314347d3079e787b0ae400eaa8b91893dd85f653e98b7d90fb8cc6ac59" Nov 22 03:10:31 crc kubenswrapper[4922]: I1122 03:10:31.801730 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1a0b9314347d3079e787b0ae400eaa8b91893dd85f653e98b7d90fb8cc6ac59"} err="failed to get container status \"b1a0b9314347d3079e787b0ae400eaa8b91893dd85f653e98b7d90fb8cc6ac59\": rpc error: code = NotFound desc = could not find container \"b1a0b9314347d3079e787b0ae400eaa8b91893dd85f653e98b7d90fb8cc6ac59\": container with ID starting with b1a0b9314347d3079e787b0ae400eaa8b91893dd85f653e98b7d90fb8cc6ac59 not found: ID does not exist" Nov 22 03:10:32 crc kubenswrapper[4922]: I1122 03:10:32.033402 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:10:32 crc kubenswrapper[4922]: I1122 03:10:32.041027 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:10:32 crc kubenswrapper[4922]: I1122 03:10:32.066686 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:10:32 crc kubenswrapper[4922]: E1122 03:10:32.067109 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e99d940a-73cb-41bb-b5a9-7003544b002a" containerName="sg-core" Nov 22 03:10:32 crc kubenswrapper[4922]: I1122 03:10:32.067129 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="e99d940a-73cb-41bb-b5a9-7003544b002a" containerName="sg-core" Nov 22 03:10:32 crc kubenswrapper[4922]: E1122 03:10:32.067161 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e99d940a-73cb-41bb-b5a9-7003544b002a" containerName="ceilometer-notification-agent" Nov 22 03:10:32 crc kubenswrapper[4922]: I1122 03:10:32.067171 4922 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="e99d940a-73cb-41bb-b5a9-7003544b002a" containerName="ceilometer-notification-agent" Nov 22 03:10:32 crc kubenswrapper[4922]: E1122 03:10:32.067194 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e99d940a-73cb-41bb-b5a9-7003544b002a" containerName="ceilometer-central-agent" Nov 22 03:10:32 crc kubenswrapper[4922]: I1122 03:10:32.067203 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="e99d940a-73cb-41bb-b5a9-7003544b002a" containerName="ceilometer-central-agent" Nov 22 03:10:32 crc kubenswrapper[4922]: I1122 03:10:32.067400 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="e99d940a-73cb-41bb-b5a9-7003544b002a" containerName="ceilometer-central-agent" Nov 22 03:10:32 crc kubenswrapper[4922]: I1122 03:10:32.067426 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="e99d940a-73cb-41bb-b5a9-7003544b002a" containerName="sg-core" Nov 22 03:10:32 crc kubenswrapper[4922]: I1122 03:10:32.067451 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="e99d940a-73cb-41bb-b5a9-7003544b002a" containerName="ceilometer-notification-agent" Nov 22 03:10:32 crc kubenswrapper[4922]: I1122 03:10:32.069304 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 03:10:32 crc kubenswrapper[4922]: I1122 03:10:32.073036 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 22 03:10:32 crc kubenswrapper[4922]: I1122 03:10:32.075762 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 22 03:10:32 crc kubenswrapper[4922]: I1122 03:10:32.097519 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:10:32 crc kubenswrapper[4922]: I1122 03:10:32.225693 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4169244c-c975-4716-a7f1-53bf5c0dafe6-config-data\") pod \"ceilometer-0\" (UID: \"4169244c-c975-4716-a7f1-53bf5c0dafe6\") " pod="openstack/ceilometer-0" Nov 22 03:10:32 crc kubenswrapper[4922]: I1122 03:10:32.225787 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9kdf\" (UniqueName: \"kubernetes.io/projected/4169244c-c975-4716-a7f1-53bf5c0dafe6-kube-api-access-f9kdf\") pod \"ceilometer-0\" (UID: \"4169244c-c975-4716-a7f1-53bf5c0dafe6\") " pod="openstack/ceilometer-0" Nov 22 03:10:32 crc kubenswrapper[4922]: I1122 03:10:32.225822 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4169244c-c975-4716-a7f1-53bf5c0dafe6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4169244c-c975-4716-a7f1-53bf5c0dafe6\") " pod="openstack/ceilometer-0" Nov 22 03:10:32 crc kubenswrapper[4922]: I1122 03:10:32.225908 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4169244c-c975-4716-a7f1-53bf5c0dafe6-log-httpd\") pod \"ceilometer-0\" (UID: \"4169244c-c975-4716-a7f1-53bf5c0dafe6\") " pod="openstack/ceilometer-0" Nov 22 03:10:32 crc kubenswrapper[4922]: I1122 03:10:32.225948 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/4169244c-c975-4716-a7f1-53bf5c0dafe6-run-httpd\") pod \"ceilometer-0\" (UID: \"4169244c-c975-4716-a7f1-53bf5c0dafe6\") " pod="openstack/ceilometer-0" Nov 22 03:10:32 crc kubenswrapper[4922]: I1122 03:10:32.225969 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4169244c-c975-4716-a7f1-53bf5c0dafe6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4169244c-c975-4716-a7f1-53bf5c0dafe6\") " pod="openstack/ceilometer-0" Nov 22 03:10:32 crc kubenswrapper[4922]: I1122 03:10:32.226030 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4169244c-c975-4716-a7f1-53bf5c0dafe6-scripts\") pod \"ceilometer-0\" (UID: \"4169244c-c975-4716-a7f1-53bf5c0dafe6\") " pod="openstack/ceilometer-0" Nov 22 03:10:32 crc kubenswrapper[4922]: I1122 03:10:32.266213 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-d6fddd8cd-k2kd6" Nov 22 03:10:32 crc kubenswrapper[4922]: I1122 03:10:32.328377 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4169244c-c975-4716-a7f1-53bf5c0dafe6-scripts\") pod \"ceilometer-0\" (UID: \"4169244c-c975-4716-a7f1-53bf5c0dafe6\") " pod="openstack/ceilometer-0" Nov 22 03:10:32 crc kubenswrapper[4922]: I1122 03:10:32.328523 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4169244c-c975-4716-a7f1-53bf5c0dafe6-config-data\") pod \"ceilometer-0\" (UID: \"4169244c-c975-4716-a7f1-53bf5c0dafe6\") " pod="openstack/ceilometer-0" Nov 22 03:10:32 crc kubenswrapper[4922]: I1122 03:10:32.328572 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9kdf\" (UniqueName: \"kubernetes.io/projected/4169244c-c975-4716-a7f1-53bf5c0dafe6-kube-api-access-f9kdf\") pod \"ceilometer-0\" (UID: \"4169244c-c975-4716-a7f1-53bf5c0dafe6\") " pod="openstack/ceilometer-0" Nov 22 03:10:32 crc kubenswrapper[4922]: I1122 03:10:32.328631 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4169244c-c975-4716-a7f1-53bf5c0dafe6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4169244c-c975-4716-a7f1-53bf5c0dafe6\") " pod="openstack/ceilometer-0" Nov 22 03:10:32 crc kubenswrapper[4922]: I1122 03:10:32.328705 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4169244c-c975-4716-a7f1-53bf5c0dafe6-log-httpd\") pod \"ceilometer-0\" (UID: \"4169244c-c975-4716-a7f1-53bf5c0dafe6\") " pod="openstack/ceilometer-0" Nov 22 03:10:32 crc kubenswrapper[4922]: I1122 03:10:32.328734 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4169244c-c975-4716-a7f1-53bf5c0dafe6-run-httpd\") pod \"ceilometer-0\" (UID: \"4169244c-c975-4716-a7f1-53bf5c0dafe6\") " pod="openstack/ceilometer-0" Nov 22 03:10:32 crc kubenswrapper[4922]: I1122 03:10:32.328775 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4169244c-c975-4716-a7f1-53bf5c0dafe6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4169244c-c975-4716-a7f1-53bf5c0dafe6\") " 
pod="openstack/ceilometer-0" Nov 22 03:10:32 crc kubenswrapper[4922]: I1122 03:10:32.329908 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4169244c-c975-4716-a7f1-53bf5c0dafe6-run-httpd\") pod \"ceilometer-0\" (UID: \"4169244c-c975-4716-a7f1-53bf5c0dafe6\") " pod="openstack/ceilometer-0" Nov 22 03:10:32 crc kubenswrapper[4922]: I1122 03:10:32.330212 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4169244c-c975-4716-a7f1-53bf5c0dafe6-log-httpd\") pod \"ceilometer-0\" (UID: \"4169244c-c975-4716-a7f1-53bf5c0dafe6\") " pod="openstack/ceilometer-0" Nov 22 03:10:32 crc kubenswrapper[4922]: I1122 03:10:32.334283 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4169244c-c975-4716-a7f1-53bf5c0dafe6-scripts\") pod \"ceilometer-0\" (UID: \"4169244c-c975-4716-a7f1-53bf5c0dafe6\") " pod="openstack/ceilometer-0" Nov 22 03:10:32 crc kubenswrapper[4922]: I1122 03:10:32.335056 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4169244c-c975-4716-a7f1-53bf5c0dafe6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4169244c-c975-4716-a7f1-53bf5c0dafe6\") " pod="openstack/ceilometer-0" Nov 22 03:10:32 crc kubenswrapper[4922]: I1122 03:10:32.335472 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4169244c-c975-4716-a7f1-53bf5c0dafe6-config-data\") pod \"ceilometer-0\" (UID: \"4169244c-c975-4716-a7f1-53bf5c0dafe6\") " pod="openstack/ceilometer-0" Nov 22 03:10:32 crc kubenswrapper[4922]: I1122 03:10:32.335759 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4169244c-c975-4716-a7f1-53bf5c0dafe6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4169244c-c975-4716-a7f1-53bf5c0dafe6\") " pod="openstack/ceilometer-0" Nov 22 03:10:32 crc kubenswrapper[4922]: I1122 03:10:32.349299 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9kdf\" (UniqueName: \"kubernetes.io/projected/4169244c-c975-4716-a7f1-53bf5c0dafe6-kube-api-access-f9kdf\") pod \"ceilometer-0\" (UID: \"4169244c-c975-4716-a7f1-53bf5c0dafe6\") " pod="openstack/ceilometer-0" Nov 22 03:10:32 crc kubenswrapper[4922]: I1122 03:10:32.391366 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 22 03:10:32 crc kubenswrapper[4922]: I1122 03:10:32.837085 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:10:32 crc kubenswrapper[4922]: W1122 03:10:32.846213 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4169244c_c975_4716_a7f1_53bf5c0dafe6.slice/crio-6c994719b564d129987dd61b1e16323b935997aa174fe1f8e38682a176d3b849 WatchSource:0}: Error finding container 6c994719b564d129987dd61b1e16323b935997aa174fe1f8e38682a176d3b849: Status 404 returned error can't find the container with id 6c994719b564d129987dd61b1e16323b935997aa174fe1f8e38682a176d3b849 Nov 22 03:10:33 crc kubenswrapper[4922]: I1122 03:10:33.313638 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e99d940a-73cb-41bb-b5a9-7003544b002a" path="/var/lib/kubelet/pods/e99d940a-73cb-41bb-b5a9-7003544b002a/volumes" Nov 22 03:10:33 crc kubenswrapper[4922]: I1122 03:10:33.686919 4922 generic.go:334] "Generic (PLEG): container finished" podID="ccb0a287-d346-47ce-9f23-64d1190b5516" containerID="743cbaf1ce9cefac103aa8037e0e5f75e0d4e54b1c337c3acd037c4accf1f912" exitCode=0 Nov 22 03:10:33 crc kubenswrapper[4922]: I1122 03:10:33.687295 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-g8fzk" event={"ID":"ccb0a287-d346-47ce-9f23-64d1190b5516","Type":"ContainerDied","Data":"743cbaf1ce9cefac103aa8037e0e5f75e0d4e54b1c337c3acd037c4accf1f912"} Nov 22 03:10:33 crc kubenswrapper[4922]: I1122 03:10:33.692566 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4169244c-c975-4716-a7f1-53bf5c0dafe6","Type":"ContainerStarted","Data":"8cdbee1e9de52dd5ff6b7cae597cee93d0f78826a24525c8ba7d48eaf51dbf56"} Nov 22 03:10:33 crc kubenswrapper[4922]: I1122 03:10:33.692616 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4169244c-c975-4716-a7f1-53bf5c0dafe6","Type":"ContainerStarted","Data":"6c994719b564d129987dd61b1e16323b935997aa174fe1f8e38682a176d3b849"} Nov 22 03:10:35 crc kubenswrapper[4922]: I1122 03:10:35.272781 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4169244c-c975-4716-a7f1-53bf5c0dafe6","Type":"ContainerStarted","Data":"e678fba9c22764baad2f0b3dab1c3f45fe6a8987c0b86bb04c111f90c493dda9"} Nov 22 03:10:35 crc kubenswrapper[4922]: I1122 03:10:35.380893 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Nov 22 03:10:35 crc kubenswrapper[4922]: I1122 03:10:35.382171 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Nov 22 03:10:35 crc kubenswrapper[4922]: I1122 03:10:35.384478 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-pbsdp" Nov 22 03:10:35 crc kubenswrapper[4922]: I1122 03:10:35.392689 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Nov 22 03:10:35 crc kubenswrapper[4922]: I1122 03:10:35.392690 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Nov 22 03:10:35 crc kubenswrapper[4922]: I1122 03:10:35.397764 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 22 03:10:35 crc kubenswrapper[4922]: I1122 03:10:35.505658 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lqh8\" (UniqueName: \"kubernetes.io/projected/8d23126d-97a4-4ed0-a589-0ef607e832ed-kube-api-access-7lqh8\") pod \"openstackclient\" (UID: \"8d23126d-97a4-4ed0-a589-0ef607e832ed\") " pod="openstack/openstackclient" Nov 22 03:10:35 crc kubenswrapper[4922]: I1122 03:10:35.505742 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8d23126d-97a4-4ed0-a589-0ef607e832ed-openstack-config-secret\") pod \"openstackclient\" (UID: \"8d23126d-97a4-4ed0-a589-0ef607e832ed\") " pod="openstack/openstackclient" Nov 22 03:10:35 crc kubenswrapper[4922]: I1122 03:10:35.505830 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d23126d-97a4-4ed0-a589-0ef607e832ed-combined-ca-bundle\") pod \"openstackclient\" (UID: \"8d23126d-97a4-4ed0-a589-0ef607e832ed\") " pod="openstack/openstackclient" Nov 22 03:10:35 crc kubenswrapper[4922]: I1122 03:10:35.505906 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8d23126d-97a4-4ed0-a589-0ef607e832ed-openstack-config\") pod \"openstackclient\" (UID: \"8d23126d-97a4-4ed0-a589-0ef607e832ed\") " pod="openstack/openstackclient" Nov 22 03:10:35 crc kubenswrapper[4922]: I1122 03:10:35.608084 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lqh8\" (UniqueName: \"kubernetes.io/projected/8d23126d-97a4-4ed0-a589-0ef607e832ed-kube-api-access-7lqh8\") pod \"openstackclient\" (UID: \"8d23126d-97a4-4ed0-a589-0ef607e832ed\") " pod="openstack/openstackclient" Nov 22 03:10:35 crc kubenswrapper[4922]: I1122 03:10:35.608463 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8d23126d-97a4-4ed0-a589-0ef607e832ed-openstack-config-secret\") pod \"openstackclient\" (UID: \"8d23126d-97a4-4ed0-a589-0ef607e832ed\") " pod="openstack/openstackclient" Nov 22 03:10:35 crc kubenswrapper[4922]: I1122 03:10:35.608526 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d23126d-97a4-4ed0-a589-0ef607e832ed-combined-ca-bundle\") pod \"openstackclient\" (UID: \"8d23126d-97a4-4ed0-a589-0ef607e832ed\") " pod="openstack/openstackclient" Nov 22 03:10:35 crc kubenswrapper[4922]: I1122 03:10:35.608586 4922 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8d23126d-97a4-4ed0-a589-0ef607e832ed-openstack-config\") pod \"openstackclient\" (UID: \"8d23126d-97a4-4ed0-a589-0ef607e832ed\") " pod="openstack/openstackclient" Nov 22 03:10:35 crc kubenswrapper[4922]: I1122 03:10:35.609503 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8d23126d-97a4-4ed0-a589-0ef607e832ed-openstack-config\") pod \"openstackclient\" (UID: \"8d23126d-97a4-4ed0-a589-0ef607e832ed\") " pod="openstack/openstackclient" Nov 22 03:10:35 crc kubenswrapper[4922]: I1122 03:10:35.614537 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8d23126d-97a4-4ed0-a589-0ef607e832ed-openstack-config-secret\") pod \"openstackclient\" (UID: \"8d23126d-97a4-4ed0-a589-0ef607e832ed\") " pod="openstack/openstackclient" Nov 22 03:10:35 crc kubenswrapper[4922]: I1122 03:10:35.614544 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d23126d-97a4-4ed0-a589-0ef607e832ed-combined-ca-bundle\") pod \"openstackclient\" (UID: \"8d23126d-97a4-4ed0-a589-0ef607e832ed\") " pod="openstack/openstackclient" Nov 22 03:10:35 crc kubenswrapper[4922]: I1122 03:10:35.625538 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lqh8\" (UniqueName: \"kubernetes.io/projected/8d23126d-97a4-4ed0-a589-0ef607e832ed-kube-api-access-7lqh8\") pod \"openstackclient\" (UID: \"8d23126d-97a4-4ed0-a589-0ef607e832ed\") " pod="openstack/openstackclient" Nov 22 03:10:35 crc kubenswrapper[4922]: I1122 03:10:35.753933 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-g8fzk" Nov 22 03:10:35 crc kubenswrapper[4922]: I1122 03:10:35.766308 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Nov 22 03:10:35 crc kubenswrapper[4922]: I1122 03:10:35.811469 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccb0a287-d346-47ce-9f23-64d1190b5516-combined-ca-bundle\") pod \"ccb0a287-d346-47ce-9f23-64d1190b5516\" (UID: \"ccb0a287-d346-47ce-9f23-64d1190b5516\") " Nov 22 03:10:35 crc kubenswrapper[4922]: I1122 03:10:35.811552 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jglwr\" (UniqueName: \"kubernetes.io/projected/ccb0a287-d346-47ce-9f23-64d1190b5516-kube-api-access-jglwr\") pod \"ccb0a287-d346-47ce-9f23-64d1190b5516\" (UID: \"ccb0a287-d346-47ce-9f23-64d1190b5516\") " Nov 22 03:10:35 crc kubenswrapper[4922]: I1122 03:10:35.811764 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ccb0a287-d346-47ce-9f23-64d1190b5516-db-sync-config-data\") pod \"ccb0a287-d346-47ce-9f23-64d1190b5516\" (UID: \"ccb0a287-d346-47ce-9f23-64d1190b5516\") " Nov 22 03:10:35 crc kubenswrapper[4922]: I1122 03:10:35.816976 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccb0a287-d346-47ce-9f23-64d1190b5516-kube-api-access-jglwr" (OuterVolumeSpecName: "kube-api-access-jglwr") pod "ccb0a287-d346-47ce-9f23-64d1190b5516" (UID: "ccb0a287-d346-47ce-9f23-64d1190b5516"). InnerVolumeSpecName "kube-api-access-jglwr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:10:35 crc kubenswrapper[4922]: I1122 03:10:35.826034 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccb0a287-d346-47ce-9f23-64d1190b5516-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ccb0a287-d346-47ce-9f23-64d1190b5516" (UID: "ccb0a287-d346-47ce-9f23-64d1190b5516"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:10:35 crc kubenswrapper[4922]: I1122 03:10:35.849277 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccb0a287-d346-47ce-9f23-64d1190b5516-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ccb0a287-d346-47ce-9f23-64d1190b5516" (UID: "ccb0a287-d346-47ce-9f23-64d1190b5516"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:10:35 crc kubenswrapper[4922]: I1122 03:10:35.914175 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jglwr\" (UniqueName: \"kubernetes.io/projected/ccb0a287-d346-47ce-9f23-64d1190b5516-kube-api-access-jglwr\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:35 crc kubenswrapper[4922]: I1122 03:10:35.914585 4922 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ccb0a287-d346-47ce-9f23-64d1190b5516-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:35 crc kubenswrapper[4922]: I1122 03:10:35.914595 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccb0a287-d346-47ce-9f23-64d1190b5516-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.224766 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 22 03:10:36 crc kubenswrapper[4922]: W1122 03:10:36.228483 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d23126d_97a4_4ed0_a589_0ef607e832ed.slice/crio-f867d5745ac5f56d9ff4dfb6ad626ca5dba68c0c66b1eada11fc2ba45a46e316 WatchSource:0}: Error finding container f867d5745ac5f56d9ff4dfb6ad626ca5dba68c0c66b1eada11fc2ba45a46e316: Status 404 returned error can't find the container with id f867d5745ac5f56d9ff4dfb6ad626ca5dba68c0c66b1eada11fc2ba45a46e316 Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.282034 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"8d23126d-97a4-4ed0-a589-0ef607e832ed","Type":"ContainerStarted","Data":"f867d5745ac5f56d9ff4dfb6ad626ca5dba68c0c66b1eada11fc2ba45a46e316"} Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.285439 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4169244c-c975-4716-a7f1-53bf5c0dafe6","Type":"ContainerStarted","Data":"27aea154443df043d91fded0cd22b21faee6a9cd9ede92d80ad1e5366d272bfc"} Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.286790 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-g8fzk" event={"ID":"ccb0a287-d346-47ce-9f23-64d1190b5516","Type":"ContainerDied","Data":"9ba55d02f29ff85d8430e49e82d36cc5beac3792ca8c36f3fdf1ee421754df64"} Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.286817 4922 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="9ba55d02f29ff85d8430e49e82d36cc5beac3792ca8c36f3fdf1ee421754df64" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.286883 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-g8fzk" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.574603 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-cf5c7c74c-qqjc7"] Nov 22 03:10:36 crc kubenswrapper[4922]: E1122 03:10:36.574939 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccb0a287-d346-47ce-9f23-64d1190b5516" containerName="barbican-db-sync" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.574953 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccb0a287-d346-47ce-9f23-64d1190b5516" containerName="barbican-db-sync" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.575143 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccb0a287-d346-47ce-9f23-64d1190b5516" containerName="barbican-db-sync" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.575909 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-cf5c7c74c-qqjc7" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.595120 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.596980 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-6b7d698c78-tqjxq"] Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.601384 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6b7d698c78-tqjxq" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.613920 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.613988 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-t7zfj" Nov 22 03:10:36 crc kubenswrapper[4922]: W1122 03:10:36.614738 4922 reflector.go:561] object-"openstack"/"barbican-keystone-listener-config-data": failed to list *v1.Secret: secrets "barbican-keystone-listener-config-data" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Nov 22 03:10:36 crc kubenswrapper[4922]: E1122 03:10:36.614779 4922 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"barbican-keystone-listener-config-data\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"barbican-keystone-listener-config-data\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.642182 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-cf5c7c74c-qqjc7"] Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.665524 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6b7d698c78-tqjxq"] Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.725653 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-699df9757c-6xcfl"] Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.727340 
4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-699df9757c-6xcfl" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.740364 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-699df9757c-6xcfl"] Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.749212 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87ad9c28-b442-49bb-b474-005766f23004-combined-ca-bundle\") pod \"barbican-keystone-listener-6b7d698c78-tqjxq\" (UID: \"87ad9c28-b442-49bb-b474-005766f23004\") " pod="openstack/barbican-keystone-listener-6b7d698c78-tqjxq" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.749247 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87ad9c28-b442-49bb-b474-005766f23004-logs\") pod \"barbican-keystone-listener-6b7d698c78-tqjxq\" (UID: \"87ad9c28-b442-49bb-b474-005766f23004\") " pod="openstack/barbican-keystone-listener-6b7d698c78-tqjxq" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.749268 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87ad9c28-b442-49bb-b474-005766f23004-config-data\") pod \"barbican-keystone-listener-6b7d698c78-tqjxq\" (UID: \"87ad9c28-b442-49bb-b474-005766f23004\") " pod="openstack/barbican-keystone-listener-6b7d698c78-tqjxq" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.749324 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/87ad9c28-b442-49bb-b474-005766f23004-config-data-custom\") pod \"barbican-keystone-listener-6b7d698c78-tqjxq\" (UID: \"87ad9c28-b442-49bb-b474-005766f23004\") " pod="openstack/barbican-keystone-listener-6b7d698c78-tqjxq" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.749342 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec603ae2-8cba-4c61-8733-b448f538780a-combined-ca-bundle\") pod \"barbican-worker-cf5c7c74c-qqjc7\" (UID: \"ec603ae2-8cba-4c61-8733-b448f538780a\") " pod="openstack/barbican-worker-cf5c7c74c-qqjc7" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.749357 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ec603ae2-8cba-4c61-8733-b448f538780a-config-data-custom\") pod \"barbican-worker-cf5c7c74c-qqjc7\" (UID: \"ec603ae2-8cba-4c61-8733-b448f538780a\") " pod="openstack/barbican-worker-cf5c7c74c-qqjc7" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.749377 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec603ae2-8cba-4c61-8733-b448f538780a-logs\") pod \"barbican-worker-cf5c7c74c-qqjc7\" (UID: \"ec603ae2-8cba-4c61-8733-b448f538780a\") " pod="openstack/barbican-worker-cf5c7c74c-qqjc7" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.749417 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec603ae2-8cba-4c61-8733-b448f538780a-config-data\") pod 
\"barbican-worker-cf5c7c74c-qqjc7\" (UID: \"ec603ae2-8cba-4c61-8733-b448f538780a\") " pod="openstack/barbican-worker-cf5c7c74c-qqjc7" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.749449 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6d688\" (UniqueName: \"kubernetes.io/projected/ec603ae2-8cba-4c61-8733-b448f538780a-kube-api-access-6d688\") pod \"barbican-worker-cf5c7c74c-qqjc7\" (UID: \"ec603ae2-8cba-4c61-8733-b448f538780a\") " pod="openstack/barbican-worker-cf5c7c74c-qqjc7" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.749466 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbljp\" (UniqueName: \"kubernetes.io/projected/87ad9c28-b442-49bb-b474-005766f23004-kube-api-access-wbljp\") pod \"barbican-keystone-listener-6b7d698c78-tqjxq\" (UID: \"87ad9c28-b442-49bb-b474-005766f23004\") " pod="openstack/barbican-keystone-listener-6b7d698c78-tqjxq" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.810991 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-d4854c55d-lxcp4"] Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.813028 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-d4854c55d-lxcp4" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.816533 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.825333 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-d4854c55d-lxcp4"] Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.851119 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/87ad9c28-b442-49bb-b474-005766f23004-config-data-custom\") pod \"barbican-keystone-listener-6b7d698c78-tqjxq\" (UID: \"87ad9c28-b442-49bb-b474-005766f23004\") " pod="openstack/barbican-keystone-listener-6b7d698c78-tqjxq" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.851185 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec603ae2-8cba-4c61-8733-b448f538780a-combined-ca-bundle\") pod \"barbican-worker-cf5c7c74c-qqjc7\" (UID: \"ec603ae2-8cba-4c61-8733-b448f538780a\") " pod="openstack/barbican-worker-cf5c7c74c-qqjc7" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.851212 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ec603ae2-8cba-4c61-8733-b448f538780a-config-data-custom\") pod \"barbican-worker-cf5c7c74c-qqjc7\" (UID: \"ec603ae2-8cba-4c61-8733-b448f538780a\") " pod="openstack/barbican-worker-cf5c7c74c-qqjc7" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.851249 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec603ae2-8cba-4c61-8733-b448f538780a-logs\") pod \"barbican-worker-cf5c7c74c-qqjc7\" (UID: \"ec603ae2-8cba-4c61-8733-b448f538780a\") " pod="openstack/barbican-worker-cf5c7c74c-qqjc7" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.851314 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961-ovsdbserver-sb\") pod \"dnsmasq-dns-699df9757c-6xcfl\" (UID: \"29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961\") " pod="openstack/dnsmasq-dns-699df9757c-6xcfl" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.851345 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec603ae2-8cba-4c61-8733-b448f538780a-config-data\") pod \"barbican-worker-cf5c7c74c-qqjc7\" (UID: \"ec603ae2-8cba-4c61-8733-b448f538780a\") " pod="openstack/barbican-worker-cf5c7c74c-qqjc7" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.851394 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6d688\" (UniqueName: \"kubernetes.io/projected/ec603ae2-8cba-4c61-8733-b448f538780a-kube-api-access-6d688\") pod \"barbican-worker-cf5c7c74c-qqjc7\" (UID: \"ec603ae2-8cba-4c61-8733-b448f538780a\") " pod="openstack/barbican-worker-cf5c7c74c-qqjc7" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.851420 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbljp\" (UniqueName: \"kubernetes.io/projected/87ad9c28-b442-49bb-b474-005766f23004-kube-api-access-wbljp\") pod \"barbican-keystone-listener-6b7d698c78-tqjxq\" (UID: \"87ad9c28-b442-49bb-b474-005766f23004\") " pod="openstack/barbican-keystone-listener-6b7d698c78-tqjxq" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.851464 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87ad9c28-b442-49bb-b474-005766f23004-combined-ca-bundle\") pod \"barbican-keystone-listener-6b7d698c78-tqjxq\" (UID: \"87ad9c28-b442-49bb-b474-005766f23004\") " pod="openstack/barbican-keystone-listener-6b7d698c78-tqjxq" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.851481 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87ad9c28-b442-49bb-b474-005766f23004-logs\") pod \"barbican-keystone-listener-6b7d698c78-tqjxq\" (UID: \"87ad9c28-b442-49bb-b474-005766f23004\") " pod="openstack/barbican-keystone-listener-6b7d698c78-tqjxq" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.851500 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87ad9c28-b442-49bb-b474-005766f23004-config-data\") pod \"barbican-keystone-listener-6b7d698c78-tqjxq\" (UID: \"87ad9c28-b442-49bb-b474-005766f23004\") " pod="openstack/barbican-keystone-listener-6b7d698c78-tqjxq" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.851532 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961-config\") pod \"dnsmasq-dns-699df9757c-6xcfl\" (UID: \"29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961\") " pod="openstack/dnsmasq-dns-699df9757c-6xcfl" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.851590 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961-dns-svc\") pod \"dnsmasq-dns-699df9757c-6xcfl\" (UID: \"29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961\") " pod="openstack/dnsmasq-dns-699df9757c-6xcfl" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.851612 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jltpw\" (UniqueName: \"kubernetes.io/projected/29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961-kube-api-access-jltpw\") pod \"dnsmasq-dns-699df9757c-6xcfl\" (UID: \"29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961\") " pod="openstack/dnsmasq-dns-699df9757c-6xcfl" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.851636 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961-ovsdbserver-nb\") pod \"dnsmasq-dns-699df9757c-6xcfl\" (UID: \"29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961\") " pod="openstack/dnsmasq-dns-699df9757c-6xcfl" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.852991 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec603ae2-8cba-4c61-8733-b448f538780a-logs\") pod \"barbican-worker-cf5c7c74c-qqjc7\" (UID: \"ec603ae2-8cba-4c61-8733-b448f538780a\") " pod="openstack/barbican-worker-cf5c7c74c-qqjc7" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.857148 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec603ae2-8cba-4c61-8733-b448f538780a-combined-ca-bundle\") pod \"barbican-worker-cf5c7c74c-qqjc7\" (UID: \"ec603ae2-8cba-4c61-8733-b448f538780a\") " pod="openstack/barbican-worker-cf5c7c74c-qqjc7" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.857364 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87ad9c28-b442-49bb-b474-005766f23004-config-data\") pod \"barbican-keystone-listener-6b7d698c78-tqjxq\" (UID: \"87ad9c28-b442-49bb-b474-005766f23004\") " pod="openstack/barbican-keystone-listener-6b7d698c78-tqjxq" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.857870 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87ad9c28-b442-49bb-b474-005766f23004-logs\") pod \"barbican-keystone-listener-6b7d698c78-tqjxq\" (UID: \"87ad9c28-b442-49bb-b474-005766f23004\") " pod="openstack/barbican-keystone-listener-6b7d698c78-tqjxq" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.857903 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87ad9c28-b442-49bb-b474-005766f23004-combined-ca-bundle\") pod \"barbican-keystone-listener-6b7d698c78-tqjxq\" (UID: \"87ad9c28-b442-49bb-b474-005766f23004\") " pod="openstack/barbican-keystone-listener-6b7d698c78-tqjxq" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.863441 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ec603ae2-8cba-4c61-8733-b448f538780a-config-data-custom\") pod \"barbican-worker-cf5c7c74c-qqjc7\" (UID: \"ec603ae2-8cba-4c61-8733-b448f538780a\") " pod="openstack/barbican-worker-cf5c7c74c-qqjc7" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.867270 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec603ae2-8cba-4c61-8733-b448f538780a-config-data\") pod \"barbican-worker-cf5c7c74c-qqjc7\" (UID: \"ec603ae2-8cba-4c61-8733-b448f538780a\") " pod="openstack/barbican-worker-cf5c7c74c-qqjc7" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.879682 4922 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbljp\" (UniqueName: \"kubernetes.io/projected/87ad9c28-b442-49bb-b474-005766f23004-kube-api-access-wbljp\") pod \"barbican-keystone-listener-6b7d698c78-tqjxq\" (UID: \"87ad9c28-b442-49bb-b474-005766f23004\") " pod="openstack/barbican-keystone-listener-6b7d698c78-tqjxq" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.895066 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6d688\" (UniqueName: \"kubernetes.io/projected/ec603ae2-8cba-4c61-8733-b448f538780a-kube-api-access-6d688\") pod \"barbican-worker-cf5c7c74c-qqjc7\" (UID: \"ec603ae2-8cba-4c61-8733-b448f538780a\") " pod="openstack/barbican-worker-cf5c7c74c-qqjc7" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.926273 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-cf5c7c74c-qqjc7" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.953635 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961-dns-svc\") pod \"dnsmasq-dns-699df9757c-6xcfl\" (UID: \"29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961\") " pod="openstack/dnsmasq-dns-699df9757c-6xcfl" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.953677 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jltpw\" (UniqueName: \"kubernetes.io/projected/29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961-kube-api-access-jltpw\") pod \"dnsmasq-dns-699df9757c-6xcfl\" (UID: \"29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961\") " pod="openstack/dnsmasq-dns-699df9757c-6xcfl" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.953698 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961-ovsdbserver-nb\") pod \"dnsmasq-dns-699df9757c-6xcfl\" (UID: \"29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961\") " pod="openstack/dnsmasq-dns-699df9757c-6xcfl" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.953752 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldrth\" (UniqueName: \"kubernetes.io/projected/12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457-kube-api-access-ldrth\") pod \"barbican-api-d4854c55d-lxcp4\" (UID: \"12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457\") " pod="openstack/barbican-api-d4854c55d-lxcp4" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.953775 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457-combined-ca-bundle\") pod \"barbican-api-d4854c55d-lxcp4\" (UID: \"12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457\") " pod="openstack/barbican-api-d4854c55d-lxcp4" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.953822 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961-ovsdbserver-sb\") pod \"dnsmasq-dns-699df9757c-6xcfl\" (UID: \"29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961\") " pod="openstack/dnsmasq-dns-699df9757c-6xcfl" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.953863 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457-logs\") pod \"barbican-api-d4854c55d-lxcp4\" (UID: \"12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457\") " pod="openstack/barbican-api-d4854c55d-lxcp4" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.953881 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457-config-data\") pod \"barbican-api-d4854c55d-lxcp4\" (UID: \"12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457\") " pod="openstack/barbican-api-d4854c55d-lxcp4" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.953913 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457-config-data-custom\") pod \"barbican-api-d4854c55d-lxcp4\" (UID: \"12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457\") " pod="openstack/barbican-api-d4854c55d-lxcp4" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.953957 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961-config\") pod \"dnsmasq-dns-699df9757c-6xcfl\" (UID: \"29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961\") " pod="openstack/dnsmasq-dns-699df9757c-6xcfl" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.954707 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961-config\") pod \"dnsmasq-dns-699df9757c-6xcfl\" (UID: \"29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961\") " pod="openstack/dnsmasq-dns-699df9757c-6xcfl" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.955449 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961-dns-svc\") pod \"dnsmasq-dns-699df9757c-6xcfl\" (UID: \"29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961\") " pod="openstack/dnsmasq-dns-699df9757c-6xcfl" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.956077 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961-ovsdbserver-sb\") pod \"dnsmasq-dns-699df9757c-6xcfl\" (UID: \"29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961\") " pod="openstack/dnsmasq-dns-699df9757c-6xcfl" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.956830 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961-ovsdbserver-nb\") pod \"dnsmasq-dns-699df9757c-6xcfl\" (UID: \"29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961\") " pod="openstack/dnsmasq-dns-699df9757c-6xcfl" Nov 22 03:10:36 crc kubenswrapper[4922]: I1122 03:10:36.976070 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jltpw\" (UniqueName: \"kubernetes.io/projected/29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961-kube-api-access-jltpw\") pod \"dnsmasq-dns-699df9757c-6xcfl\" (UID: \"29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961\") " pod="openstack/dnsmasq-dns-699df9757c-6xcfl" Nov 22 03:10:37 crc kubenswrapper[4922]: I1122 03:10:37.046612 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-699df9757c-6xcfl" Nov 22 03:10:37 crc kubenswrapper[4922]: I1122 03:10:37.055978 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldrth\" (UniqueName: \"kubernetes.io/projected/12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457-kube-api-access-ldrth\") pod \"barbican-api-d4854c55d-lxcp4\" (UID: \"12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457\") " pod="openstack/barbican-api-d4854c55d-lxcp4" Nov 22 03:10:37 crc kubenswrapper[4922]: I1122 03:10:37.056018 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457-combined-ca-bundle\") pod \"barbican-api-d4854c55d-lxcp4\" (UID: \"12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457\") " pod="openstack/barbican-api-d4854c55d-lxcp4" Nov 22 03:10:37 crc kubenswrapper[4922]: I1122 03:10:37.056069 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457-logs\") pod \"barbican-api-d4854c55d-lxcp4\" (UID: \"12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457\") " pod="openstack/barbican-api-d4854c55d-lxcp4" Nov 22 03:10:37 crc kubenswrapper[4922]: I1122 03:10:37.056087 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457-config-data\") pod \"barbican-api-d4854c55d-lxcp4\" (UID: \"12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457\") " pod="openstack/barbican-api-d4854c55d-lxcp4" Nov 22 03:10:37 crc kubenswrapper[4922]: I1122 03:10:37.056113 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457-config-data-custom\") pod \"barbican-api-d4854c55d-lxcp4\" (UID: \"12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457\") " pod="openstack/barbican-api-d4854c55d-lxcp4" Nov 22 03:10:37 crc kubenswrapper[4922]: I1122 03:10:37.056881 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457-logs\") pod \"barbican-api-d4854c55d-lxcp4\" (UID: \"12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457\") " pod="openstack/barbican-api-d4854c55d-lxcp4" Nov 22 03:10:37 crc kubenswrapper[4922]: I1122 03:10:37.062014 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457-combined-ca-bundle\") pod \"barbican-api-d4854c55d-lxcp4\" (UID: \"12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457\") " pod="openstack/barbican-api-d4854c55d-lxcp4" Nov 22 03:10:37 crc kubenswrapper[4922]: I1122 03:10:37.063583 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457-config-data\") pod \"barbican-api-d4854c55d-lxcp4\" (UID: \"12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457\") " pod="openstack/barbican-api-d4854c55d-lxcp4" Nov 22 03:10:37 crc kubenswrapper[4922]: I1122 03:10:37.065504 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457-config-data-custom\") pod \"barbican-api-d4854c55d-lxcp4\" (UID: \"12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457\") " pod="openstack/barbican-api-d4854c55d-lxcp4" Nov 22 03:10:37 crc 
kubenswrapper[4922]: I1122 03:10:37.078107 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldrth\" (UniqueName: \"kubernetes.io/projected/12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457-kube-api-access-ldrth\") pod \"barbican-api-d4854c55d-lxcp4\" (UID: \"12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457\") " pod="openstack/barbican-api-d4854c55d-lxcp4" Nov 22 03:10:37 crc kubenswrapper[4922]: I1122 03:10:37.323890 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4169244c-c975-4716-a7f1-53bf5c0dafe6","Type":"ContainerStarted","Data":"4e3bfc32852ccd954b6ac2c11c64a61deac8d35642aa3d5587b1f0c4d6a396a7"} Nov 22 03:10:37 crc kubenswrapper[4922]: I1122 03:10:37.324211 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 22 03:10:37 crc kubenswrapper[4922]: I1122 03:10:37.332046 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-d4854c55d-lxcp4" Nov 22 03:10:37 crc kubenswrapper[4922]: I1122 03:10:37.347905 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.183286826 podStartE2EDuration="5.347890455s" podCreationTimestamp="2025-11-22 03:10:32 +0000 UTC" firstStartedPulling="2025-11-22 03:10:32.848666835 +0000 UTC m=+1068.887188737" lastFinishedPulling="2025-11-22 03:10:37.013270484 +0000 UTC m=+1073.051792366" observedRunningTime="2025-11-22 03:10:37.346401789 +0000 UTC m=+1073.384923691" watchObservedRunningTime="2025-11-22 03:10:37.347890455 +0000 UTC m=+1073.386412337"
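
Note: the "Observed pod startup duration" entry above is self-consistent. podStartSLOduration appears to be the end-to-end startup time minus the image-pull window (the kubelet's startup SLI excludes pull time), which the monotonic m=+ offsets bear out:

    package main

    import "fmt"

    func main() {
        // Values copied from the pod_startup_latency_tracker entry above.
        e2e := 5.347890455                      // podStartE2EDuration, in seconds
        pull := 1073.051792366 - 1068.887188737 // lastFinishedPulling - firstStartedPulling
        fmt.Printf("podStartSLOduration ~= %.9f s\n", e2e-pull) // 1.183286826, exactly as logged
    }

Nov 22 03:10:37 crc kubenswrapper[4922]: I1122 03:10:37.457187 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Nov 22 03:10:37 crc kubenswrapper[4922]: I1122 03:10:37.477435 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/87ad9c28-b442-49bb-b474-005766f23004-config-data-custom\") pod \"barbican-keystone-listener-6b7d698c78-tqjxq\" (UID: \"87ad9c28-b442-49bb-b474-005766f23004\") " pod="openstack/barbican-keystone-listener-6b7d698c78-tqjxq" Nov 22 03:10:37 crc kubenswrapper[4922]: I1122 03:10:37.488357 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-cf5c7c74c-qqjc7"] Nov 22 03:10:37 crc kubenswrapper[4922]: W1122 03:10:37.499861 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec603ae2_8cba_4c61_8733_b448f538780a.slice/crio-92ee70c0b5592104f3384ca1927fc14d984fbfabfc79ec6424a0a0008a86876e WatchSource:0}: Error finding container 92ee70c0b5592104f3384ca1927fc14d984fbfabfc79ec6424a0a0008a86876e: Status 404 returned error can't find the container with id 92ee70c0b5592104f3384ca1927fc14d984fbfabfc79ec6424a0a0008a86876e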
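
Note: the "Failed to process watch event ... Status 404" warning just above (the same pattern appears at 03:10:32.846, 03:10:36.228, 03:10:37.891 and 03:10:39.968) looks like the familiar benign race inside cadvisor: its inotify watch on the cgroup hierarchy sees the new crio-<id> cgroup before CRI-O has finished registering the container, so the lookup by id briefly 404s; each affected container reports ContainerStarted moments later. An illustrative sketch of that watch mechanism, assuming the fsnotify library and a /sys/fs/cgroup mount point (both assumptions; cadvisor's real implementation differs in detail):

    package main

    import (
        "fmt"

        "github.com/fsnotify/fsnotify"
    )

    func main() {
        w, err := fsnotify.NewWatcher()
        if err != nil {
            panic(err)
        }
        defer w.Close()
        // Watch the slice seen in the log: new crio-<id> directories appear
        // the instant the kernel creates the cgroup, which can be before the
        // runtime will admit the container id exists -- hence the 404s.
        if err := w.Add("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice"); err != nil {
            panic(err)
        }
        for ev := range w.Events {
            if ev.Op&fsnotify.Create != 0 {
                fmt.Println("new cgroup:", ev.Name)
            }
        }
    }

Nov 22 03:10:37 crc kubenswrapper[4922]: I1122 03:10:37.579960 4922 util.go:30] "No sandbox for pod can be found. 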
Need to start a new one" pod="openstack/barbican-keystone-listener-6b7d698c78-tqjxq" Nov 22 03:10:37 crc kubenswrapper[4922]: I1122 03:10:37.585497 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-699df9757c-6xcfl"] Nov 22 03:10:37 crc kubenswrapper[4922]: I1122 03:10:37.852649 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6b7d698c78-tqjxq"] Nov 22 03:10:37 crc kubenswrapper[4922]: I1122 03:10:37.871119 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-d4854c55d-lxcp4"] Nov 22 03:10:37 crc kubenswrapper[4922]: W1122 03:10:37.891727 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12f8129a_2a3c_4ee6_bd5f_e5f93cdd5457.slice/crio-6365742e1efb29d761d096e90058397968d03a52db4ce9b89ec0e86b3f7234ce WatchSource:0}: Error finding container 6365742e1efb29d761d096e90058397968d03a52db4ce9b89ec0e86b3f7234ce: Status 404 returned error can't find the container with id 6365742e1efb29d761d096e90058397968d03a52db4ce9b89ec0e86b3f7234ce Nov 22 03:10:38 crc kubenswrapper[4922]: I1122 03:10:38.334375 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d4854c55d-lxcp4" event={"ID":"12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457","Type":"ContainerStarted","Data":"4c4164c690afcf9e5dce1ffaf67b92dba080606952e9c0f6d32358a7785a7bae"} Nov 22 03:10:38 crc kubenswrapper[4922]: I1122 03:10:38.334445 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d4854c55d-lxcp4" event={"ID":"12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457","Type":"ContainerStarted","Data":"e6314bd26575a50c0553d8e9a0840ed56152bbda9246770f86463bdc8984421c"} Nov 22 03:10:38 crc kubenswrapper[4922]: I1122 03:10:38.334460 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d4854c55d-lxcp4" event={"ID":"12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457","Type":"ContainerStarted","Data":"6365742e1efb29d761d096e90058397968d03a52db4ce9b89ec0e86b3f7234ce"} Nov 22 03:10:38 crc kubenswrapper[4922]: I1122 03:10:38.334658 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-d4854c55d-lxcp4" Nov 22 03:10:38 crc kubenswrapper[4922]: I1122 03:10:38.334695 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-d4854c55d-lxcp4" Nov 22 03:10:38 crc kubenswrapper[4922]: I1122 03:10:38.336039 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6b7d698c78-tqjxq" event={"ID":"87ad9c28-b442-49bb-b474-005766f23004","Type":"ContainerStarted","Data":"73798ec74df229921e45b7bd4f5d01e046e5869480334af087576a9efc5521ee"} Nov 22 03:10:38 crc kubenswrapper[4922]: I1122 03:10:38.338759 4922 generic.go:334] "Generic (PLEG): container finished" podID="29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961" containerID="48dca4276c26144a3b1cecec137bb554c8dd9a4bcd59f9881e9623a6b4690e51" exitCode=0 Nov 22 03:10:38 crc kubenswrapper[4922]: I1122 03:10:38.338800 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-699df9757c-6xcfl" event={"ID":"29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961","Type":"ContainerDied","Data":"48dca4276c26144a3b1cecec137bb554c8dd9a4bcd59f9881e9623a6b4690e51"} Nov 22 03:10:38 crc kubenswrapper[4922]: I1122 03:10:38.338835 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-699df9757c-6xcfl" 
event={"ID":"29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961","Type":"ContainerStarted","Data":"94b5dbd97f083f48d1c4b75f7271373f2e80371c98a3d69df264fee8afcd9e73"} Nov 22 03:10:38 crc kubenswrapper[4922]: I1122 03:10:38.341227 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-cf5c7c74c-qqjc7" event={"ID":"ec603ae2-8cba-4c61-8733-b448f538780a","Type":"ContainerStarted","Data":"92ee70c0b5592104f3384ca1927fc14d984fbfabfc79ec6424a0a0008a86876e"} Nov 22 03:10:38 crc kubenswrapper[4922]: I1122 03:10:38.356597 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-d4854c55d-lxcp4" podStartSLOduration=2.356575845 podStartE2EDuration="2.356575845s" podCreationTimestamp="2025-11-22 03:10:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:10:38.349007853 +0000 UTC m=+1074.387529745" watchObservedRunningTime="2025-11-22 03:10:38.356575845 +0000 UTC m=+1074.395097737" Nov 22 03:10:39 crc kubenswrapper[4922]: I1122 03:10:39.116821 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-65c4db8978-gcb6d"] Nov 22 03:10:39 crc kubenswrapper[4922]: I1122 03:10:39.118381 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-65c4db8978-gcb6d" Nov 22 03:10:39 crc kubenswrapper[4922]: I1122 03:10:39.120611 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Nov 22 03:10:39 crc kubenswrapper[4922]: I1122 03:10:39.120612 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Nov 22 03:10:39 crc kubenswrapper[4922]: I1122 03:10:39.137245 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-65c4db8978-gcb6d"] Nov 22 03:10:39 crc kubenswrapper[4922]: I1122 03:10:39.198107 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78c94acf-643a-4c78-8d2d-525a0d9432cc-logs\") pod \"barbican-api-65c4db8978-gcb6d\" (UID: \"78c94acf-643a-4c78-8d2d-525a0d9432cc\") " pod="openstack/barbican-api-65c4db8978-gcb6d" Nov 22 03:10:39 crc kubenswrapper[4922]: I1122 03:10:39.198375 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrqfl\" (UniqueName: \"kubernetes.io/projected/78c94acf-643a-4c78-8d2d-525a0d9432cc-kube-api-access-xrqfl\") pod \"barbican-api-65c4db8978-gcb6d\" (UID: \"78c94acf-643a-4c78-8d2d-525a0d9432cc\") " pod="openstack/barbican-api-65c4db8978-gcb6d" Nov 22 03:10:39 crc kubenswrapper[4922]: I1122 03:10:39.198456 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78c94acf-643a-4c78-8d2d-525a0d9432cc-config-data\") pod \"barbican-api-65c4db8978-gcb6d\" (UID: \"78c94acf-643a-4c78-8d2d-525a0d9432cc\") " pod="openstack/barbican-api-65c4db8978-gcb6d" Nov 22 03:10:39 crc kubenswrapper[4922]: I1122 03:10:39.198476 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/78c94acf-643a-4c78-8d2d-525a0d9432cc-config-data-custom\") pod \"barbican-api-65c4db8978-gcb6d\" (UID: \"78c94acf-643a-4c78-8d2d-525a0d9432cc\") " pod="openstack/barbican-api-65c4db8978-gcb6d" Nov 22 03:10:39 crc 
kubenswrapper[4922]: I1122 03:10:39.198795 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/78c94acf-643a-4c78-8d2d-525a0d9432cc-public-tls-certs\") pod \"barbican-api-65c4db8978-gcb6d\" (UID: \"78c94acf-643a-4c78-8d2d-525a0d9432cc\") " pod="openstack/barbican-api-65c4db8978-gcb6d" Nov 22 03:10:39 crc kubenswrapper[4922]: I1122 03:10:39.198864 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/78c94acf-643a-4c78-8d2d-525a0d9432cc-internal-tls-certs\") pod \"barbican-api-65c4db8978-gcb6d\" (UID: \"78c94acf-643a-4c78-8d2d-525a0d9432cc\") " pod="openstack/barbican-api-65c4db8978-gcb6d" Nov 22 03:10:39 crc kubenswrapper[4922]: I1122 03:10:39.198972 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78c94acf-643a-4c78-8d2d-525a0d9432cc-combined-ca-bundle\") pod \"barbican-api-65c4db8978-gcb6d\" (UID: \"78c94acf-643a-4c78-8d2d-525a0d9432cc\") " pod="openstack/barbican-api-65c4db8978-gcb6d" Nov 22 03:10:39 crc kubenswrapper[4922]: I1122 03:10:39.300030 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/78c94acf-643a-4c78-8d2d-525a0d9432cc-internal-tls-certs\") pod \"barbican-api-65c4db8978-gcb6d\" (UID: \"78c94acf-643a-4c78-8d2d-525a0d9432cc\") " pod="openstack/barbican-api-65c4db8978-gcb6d" Nov 22 03:10:39 crc kubenswrapper[4922]: I1122 03:10:39.300113 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78c94acf-643a-4c78-8d2d-525a0d9432cc-combined-ca-bundle\") pod \"barbican-api-65c4db8978-gcb6d\" (UID: \"78c94acf-643a-4c78-8d2d-525a0d9432cc\") " pod="openstack/barbican-api-65c4db8978-gcb6d" Nov 22 03:10:39 crc kubenswrapper[4922]: I1122 03:10:39.300175 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78c94acf-643a-4c78-8d2d-525a0d9432cc-logs\") pod \"barbican-api-65c4db8978-gcb6d\" (UID: \"78c94acf-643a-4c78-8d2d-525a0d9432cc\") " pod="openstack/barbican-api-65c4db8978-gcb6d" Nov 22 03:10:39 crc kubenswrapper[4922]: I1122 03:10:39.300200 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrqfl\" (UniqueName: \"kubernetes.io/projected/78c94acf-643a-4c78-8d2d-525a0d9432cc-kube-api-access-xrqfl\") pod \"barbican-api-65c4db8978-gcb6d\" (UID: \"78c94acf-643a-4c78-8d2d-525a0d9432cc\") " pod="openstack/barbican-api-65c4db8978-gcb6d" Nov 22 03:10:39 crc kubenswrapper[4922]: I1122 03:10:39.300285 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78c94acf-643a-4c78-8d2d-525a0d9432cc-config-data\") pod \"barbican-api-65c4db8978-gcb6d\" (UID: \"78c94acf-643a-4c78-8d2d-525a0d9432cc\") " pod="openstack/barbican-api-65c4db8978-gcb6d" Nov 22 03:10:39 crc kubenswrapper[4922]: I1122 03:10:39.300315 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/78c94acf-643a-4c78-8d2d-525a0d9432cc-config-data-custom\") pod \"barbican-api-65c4db8978-gcb6d\" (UID: \"78c94acf-643a-4c78-8d2d-525a0d9432cc\") " 
pod="openstack/barbican-api-65c4db8978-gcb6d" Nov 22 03:10:39 crc kubenswrapper[4922]: I1122 03:10:39.300364 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/78c94acf-643a-4c78-8d2d-525a0d9432cc-public-tls-certs\") pod \"barbican-api-65c4db8978-gcb6d\" (UID: \"78c94acf-643a-4c78-8d2d-525a0d9432cc\") " pod="openstack/barbican-api-65c4db8978-gcb6d" Nov 22 03:10:39 crc kubenswrapper[4922]: I1122 03:10:39.303631 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78c94acf-643a-4c78-8d2d-525a0d9432cc-logs\") pod \"barbican-api-65c4db8978-gcb6d\" (UID: \"78c94acf-643a-4c78-8d2d-525a0d9432cc\") " pod="openstack/barbican-api-65c4db8978-gcb6d" Nov 22 03:10:39 crc kubenswrapper[4922]: I1122 03:10:39.308930 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/78c94acf-643a-4c78-8d2d-525a0d9432cc-public-tls-certs\") pod \"barbican-api-65c4db8978-gcb6d\" (UID: \"78c94acf-643a-4c78-8d2d-525a0d9432cc\") " pod="openstack/barbican-api-65c4db8978-gcb6d" Nov 22 03:10:39 crc kubenswrapper[4922]: I1122 03:10:39.314164 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/78c94acf-643a-4c78-8d2d-525a0d9432cc-internal-tls-certs\") pod \"barbican-api-65c4db8978-gcb6d\" (UID: \"78c94acf-643a-4c78-8d2d-525a0d9432cc\") " pod="openstack/barbican-api-65c4db8978-gcb6d" Nov 22 03:10:39 crc kubenswrapper[4922]: I1122 03:10:39.315607 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/78c94acf-643a-4c78-8d2d-525a0d9432cc-config-data-custom\") pod \"barbican-api-65c4db8978-gcb6d\" (UID: \"78c94acf-643a-4c78-8d2d-525a0d9432cc\") " pod="openstack/barbican-api-65c4db8978-gcb6d" Nov 22 03:10:39 crc kubenswrapper[4922]: I1122 03:10:39.316818 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78c94acf-643a-4c78-8d2d-525a0d9432cc-config-data\") pod \"barbican-api-65c4db8978-gcb6d\" (UID: \"78c94acf-643a-4c78-8d2d-525a0d9432cc\") " pod="openstack/barbican-api-65c4db8978-gcb6d" Nov 22 03:10:39 crc kubenswrapper[4922]: I1122 03:10:39.319510 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrqfl\" (UniqueName: \"kubernetes.io/projected/78c94acf-643a-4c78-8d2d-525a0d9432cc-kube-api-access-xrqfl\") pod \"barbican-api-65c4db8978-gcb6d\" (UID: \"78c94acf-643a-4c78-8d2d-525a0d9432cc\") " pod="openstack/barbican-api-65c4db8978-gcb6d" Nov 22 03:10:39 crc kubenswrapper[4922]: I1122 03:10:39.333472 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78c94acf-643a-4c78-8d2d-525a0d9432cc-combined-ca-bundle\") pod \"barbican-api-65c4db8978-gcb6d\" (UID: \"78c94acf-643a-4c78-8d2d-525a0d9432cc\") " pod="openstack/barbican-api-65c4db8978-gcb6d" Nov 22 03:10:39 crc kubenswrapper[4922]: I1122 03:10:39.378372 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-699df9757c-6xcfl" event={"ID":"29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961","Type":"ContainerStarted","Data":"1f2242e19de8d8a903b5b8ee00f2961be591707a1c4188b803f073e34e517863"} Nov 22 03:10:39 crc kubenswrapper[4922]: I1122 03:10:39.378417 4922 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/dnsmasq-dns-699df9757c-6xcfl" Nov 22 03:10:39 crc kubenswrapper[4922]: I1122 03:10:39.400645 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-699df9757c-6xcfl" podStartSLOduration=3.400628361 podStartE2EDuration="3.400628361s" podCreationTimestamp="2025-11-22 03:10:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:10:39.392436165 +0000 UTC m=+1075.430958057" watchObservedRunningTime="2025-11-22 03:10:39.400628361 +0000 UTC m=+1075.439150253" Nov 22 03:10:39 crc kubenswrapper[4922]: I1122 03:10:39.439212 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-65c4db8978-gcb6d" Nov 22 03:10:39 crc kubenswrapper[4922]: I1122 03:10:39.949926 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-65c4db8978-gcb6d"] Nov 22 03:10:39 crc kubenswrapper[4922]: W1122 03:10:39.968186 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78c94acf_643a_4c78_8d2d_525a0d9432cc.slice/crio-3a116b22da998ec81fc818a8788c021c40905371afa26a2f800ff68217c946cf WatchSource:0}: Error finding container 3a116b22da998ec81fc818a8788c021c40905371afa26a2f800ff68217c946cf: Status 404 returned error can't find the container with id 3a116b22da998ec81fc818a8788c021c40905371afa26a2f800ff68217c946cf Nov 22 03:10:40 crc kubenswrapper[4922]: I1122 03:10:40.389638 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6b7d698c78-tqjxq" event={"ID":"87ad9c28-b442-49bb-b474-005766f23004","Type":"ContainerStarted","Data":"c12cc3550de4407ead10da713a777457c88aac2761d7e17b87470b6901be2c40"} Nov 22 03:10:40 crc kubenswrapper[4922]: I1122 03:10:40.389678 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6b7d698c78-tqjxq" event={"ID":"87ad9c28-b442-49bb-b474-005766f23004","Type":"ContainerStarted","Data":"e4ecba567b6ec94695f1b7ca4297dde0d9c4dcdeda9f1c2ca267cfeb4a2a31a2"} Nov 22 03:10:40 crc kubenswrapper[4922]: I1122 03:10:40.391711 4922 generic.go:334] "Generic (PLEG): container finished" podID="06c56a5a-f992-43bb-bcd0-15f23d824242" containerID="0503a79cf0c11dd129c952e2c01f8d01603f051e8525895fa57b3995fef8239a" exitCode=0 Nov 22 03:10:40 crc kubenswrapper[4922]: I1122 03:10:40.391752 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-n6frp" event={"ID":"06c56a5a-f992-43bb-bcd0-15f23d824242","Type":"ContainerDied","Data":"0503a79cf0c11dd129c952e2c01f8d01603f051e8525895fa57b3995fef8239a"} Nov 22 03:10:40 crc kubenswrapper[4922]: I1122 03:10:40.394441 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-cf5c7c74c-qqjc7" event={"ID":"ec603ae2-8cba-4c61-8733-b448f538780a","Type":"ContainerStarted","Data":"66e6564fbc45a06f4ad90231c18b24542ec2ab9f03b1ae7ad1ce0c1da38b446c"} Nov 22 03:10:40 crc kubenswrapper[4922]: I1122 03:10:40.394465 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-cf5c7c74c-qqjc7" event={"ID":"ec603ae2-8cba-4c61-8733-b448f538780a","Type":"ContainerStarted","Data":"11ef8def523c6d7392f00058160573e8a9c9fdfd04f2188a37b27272f61d1d9d"} Nov 22 03:10:40 crc kubenswrapper[4922]: I1122 03:10:40.398201 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-65c4db8978-gcb6d" 
event={"ID":"78c94acf-643a-4c78-8d2d-525a0d9432cc","Type":"ContainerStarted","Data":"3fd21cb0c890055f0fcec941efa0f88d0ffd4e604968ce1e56061d1d54b76b03"} Nov 22 03:10:40 crc kubenswrapper[4922]: I1122 03:10:40.398222 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-65c4db8978-gcb6d" event={"ID":"78c94acf-643a-4c78-8d2d-525a0d9432cc","Type":"ContainerStarted","Data":"f65f2ca11108134c69ffeb80a63f214cf6617f979967d0ddb0b6090f0659c90a"} Nov 22 03:10:40 crc kubenswrapper[4922]: I1122 03:10:40.398231 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-65c4db8978-gcb6d" event={"ID":"78c94acf-643a-4c78-8d2d-525a0d9432cc","Type":"ContainerStarted","Data":"3a116b22da998ec81fc818a8788c021c40905371afa26a2f800ff68217c946cf"} Nov 22 03:10:40 crc kubenswrapper[4922]: I1122 03:10:40.398405 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-65c4db8978-gcb6d" Nov 22 03:10:40 crc kubenswrapper[4922]: I1122 03:10:40.398472 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-65c4db8978-gcb6d" Nov 22 03:10:40 crc kubenswrapper[4922]: I1122 03:10:40.426236 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-6b7d698c78-tqjxq" podStartSLOduration=2.847406159 podStartE2EDuration="4.426216765s" podCreationTimestamp="2025-11-22 03:10:36 +0000 UTC" firstStartedPulling="2025-11-22 03:10:37.885198455 +0000 UTC m=+1073.923720347" lastFinishedPulling="2025-11-22 03:10:39.464009061 +0000 UTC m=+1075.502530953" observedRunningTime="2025-11-22 03:10:40.405805636 +0000 UTC m=+1076.444327528" watchObservedRunningTime="2025-11-22 03:10:40.426216765 +0000 UTC m=+1076.464738647" Nov 22 03:10:40 crc kubenswrapper[4922]: I1122 03:10:40.445416 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-cf5c7c74c-qqjc7" podStartSLOduration=2.48941003 podStartE2EDuration="4.445398416s" podCreationTimestamp="2025-11-22 03:10:36 +0000 UTC" firstStartedPulling="2025-11-22 03:10:37.507045261 +0000 UTC m=+1073.545567153" lastFinishedPulling="2025-11-22 03:10:39.463033647 +0000 UTC m=+1075.501555539" observedRunningTime="2025-11-22 03:10:40.443740045 +0000 UTC m=+1076.482261937" watchObservedRunningTime="2025-11-22 03:10:40.445398416 +0000 UTC m=+1076.483920308" Nov 22 03:10:41 crc kubenswrapper[4922]: I1122 03:10:41.109189 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 03:10:41 crc kubenswrapper[4922]: I1122 03:10:41.109574 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 03:10:42 crc kubenswrapper[4922]: I1122 03:10:42.316403 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-65c4db8978-gcb6d" podStartSLOduration=3.316384754 podStartE2EDuration="3.316384754s" podCreationTimestamp="2025-11-22 03:10:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2025-11-22 03:10:40.46436788 +0000 UTC m=+1076.502889772" watchObservedRunningTime="2025-11-22 03:10:42.316384754 +0000 UTC m=+1078.354906646" Nov 22 03:10:44 crc kubenswrapper[4922]: I1122 03:10:44.917229 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-r2hv4"] Nov 22 03:10:44 crc kubenswrapper[4922]: I1122 03:10:44.919861 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-r2hv4" Nov 22 03:10:44 crc kubenswrapper[4922]: I1122 03:10:44.962451 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-r2hv4"] Nov 22 03:10:45 crc kubenswrapper[4922]: I1122 03:10:45.004537 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5v648\" (UniqueName: \"kubernetes.io/projected/48b2290b-756b-4cbb-b1c1-19f32bdd6358-kube-api-access-5v648\") pod \"nova-api-db-create-r2hv4\" (UID: \"48b2290b-756b-4cbb-b1c1-19f32bdd6358\") " pod="openstack/nova-api-db-create-r2hv4" Nov 22 03:10:45 crc kubenswrapper[4922]: I1122 03:10:45.009495 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-n89zb"] Nov 22 03:10:45 crc kubenswrapper[4922]: I1122 03:10:45.019930 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-n89zb"] Nov 22 03:10:45 crc kubenswrapper[4922]: I1122 03:10:45.020034 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-n89zb" Nov 22 03:10:45 crc kubenswrapper[4922]: I1122 03:10:45.106579 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99wbx\" (UniqueName: \"kubernetes.io/projected/f1f5c234-6ae0-451b-bdca-3d6ba5f3a003-kube-api-access-99wbx\") pod \"nova-cell0-db-create-n89zb\" (UID: \"f1f5c234-6ae0-451b-bdca-3d6ba5f3a003\") " pod="openstack/nova-cell0-db-create-n89zb" Nov 22 03:10:45 crc kubenswrapper[4922]: I1122 03:10:45.106701 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5v648\" (UniqueName: \"kubernetes.io/projected/48b2290b-756b-4cbb-b1c1-19f32bdd6358-kube-api-access-5v648\") pod \"nova-api-db-create-r2hv4\" (UID: \"48b2290b-756b-4cbb-b1c1-19f32bdd6358\") " pod="openstack/nova-api-db-create-r2hv4" Nov 22 03:10:45 crc kubenswrapper[4922]: I1122 03:10:45.118760 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-8xvxh"] Nov 22 03:10:45 crc kubenswrapper[4922]: I1122 03:10:45.119962 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-8xvxh" Nov 22 03:10:45 crc kubenswrapper[4922]: I1122 03:10:45.127126 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-8xvxh"] Nov 22 03:10:45 crc kubenswrapper[4922]: I1122 03:10:45.141017 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5v648\" (UniqueName: \"kubernetes.io/projected/48b2290b-756b-4cbb-b1c1-19f32bdd6358-kube-api-access-5v648\") pod \"nova-api-db-create-r2hv4\" (UID: \"48b2290b-756b-4cbb-b1c1-19f32bdd6358\") " pod="openstack/nova-api-db-create-r2hv4" Nov 22 03:10:45 crc kubenswrapper[4922]: I1122 03:10:45.208621 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99wbx\" (UniqueName: \"kubernetes.io/projected/f1f5c234-6ae0-451b-bdca-3d6ba5f3a003-kube-api-access-99wbx\") pod \"nova-cell0-db-create-n89zb\" (UID: \"f1f5c234-6ae0-451b-bdca-3d6ba5f3a003\") " pod="openstack/nova-cell0-db-create-n89zb" Nov 22 03:10:45 crc kubenswrapper[4922]: I1122 03:10:45.208759 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvxc8\" (UniqueName: \"kubernetes.io/projected/9d9bda50-78a2-45bf-a92a-f1085209b972-kube-api-access-cvxc8\") pod \"nova-cell1-db-create-8xvxh\" (UID: \"9d9bda50-78a2-45bf-a92a-f1085209b972\") " pod="openstack/nova-cell1-db-create-8xvxh" Nov 22 03:10:45 crc kubenswrapper[4922]: I1122 03:10:45.229996 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99wbx\" (UniqueName: \"kubernetes.io/projected/f1f5c234-6ae0-451b-bdca-3d6ba5f3a003-kube-api-access-99wbx\") pod \"nova-cell0-db-create-n89zb\" (UID: \"f1f5c234-6ae0-451b-bdca-3d6ba5f3a003\") " pod="openstack/nova-cell0-db-create-n89zb" Nov 22 03:10:45 crc kubenswrapper[4922]: I1122 03:10:45.256990 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-r2hv4" Nov 22 03:10:45 crc kubenswrapper[4922]: I1122 03:10:45.315059 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvxc8\" (UniqueName: \"kubernetes.io/projected/9d9bda50-78a2-45bf-a92a-f1085209b972-kube-api-access-cvxc8\") pod \"nova-cell1-db-create-8xvxh\" (UID: \"9d9bda50-78a2-45bf-a92a-f1085209b972\") " pod="openstack/nova-cell1-db-create-8xvxh" Nov 22 03:10:45 crc kubenswrapper[4922]: I1122 03:10:45.340681 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvxc8\" (UniqueName: \"kubernetes.io/projected/9d9bda50-78a2-45bf-a92a-f1085209b972-kube-api-access-cvxc8\") pod \"nova-cell1-db-create-8xvxh\" (UID: \"9d9bda50-78a2-45bf-a92a-f1085209b972\") " pod="openstack/nova-cell1-db-create-8xvxh" Nov 22 03:10:45 crc kubenswrapper[4922]: I1122 03:10:45.344299 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-n89zb" Nov 22 03:10:45 crc kubenswrapper[4922]: I1122 03:10:45.482996 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-8xvxh" Nov 22 03:10:47 crc kubenswrapper[4922]: I1122 03:10:47.048030 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-699df9757c-6xcfl" Nov 22 03:10:47 crc kubenswrapper[4922]: I1122 03:10:47.106566 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-ml4m5"] Nov 22 03:10:47 crc kubenswrapper[4922]: I1122 03:10:47.106810 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7987f74bbc-ml4m5" podUID="9257c931-7544-4f45-9589-e735e55bcca2" containerName="dnsmasq-dns" containerID="cri-o://e409962e9a0bec937cd7dcaf7f7f301719dfeb171b5ae9dea8c6f1369628d036" gracePeriod=10 Nov 22 03:10:47 crc kubenswrapper[4922]: I1122 03:10:47.477309 4922 generic.go:334] "Generic (PLEG): container finished" podID="9257c931-7544-4f45-9589-e735e55bcca2" containerID="e409962e9a0bec937cd7dcaf7f7f301719dfeb171b5ae9dea8c6f1369628d036" exitCode=0 Nov 22 03:10:47 crc kubenswrapper[4922]: I1122 03:10:47.477354 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-ml4m5" event={"ID":"9257c931-7544-4f45-9589-e735e55bcca2","Type":"ContainerDied","Data":"e409962e9a0bec937cd7dcaf7f7f301719dfeb171b5ae9dea8c6f1369628d036"} Nov 22 03:10:47 crc kubenswrapper[4922]: I1122 03:10:47.839097 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-n6frp" Nov 22 03:10:47 crc kubenswrapper[4922]: I1122 03:10:47.974621 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06c56a5a-f992-43bb-bcd0-15f23d824242-combined-ca-bundle\") pod \"06c56a5a-f992-43bb-bcd0-15f23d824242\" (UID: \"06c56a5a-f992-43bb-bcd0-15f23d824242\") " Nov 22 03:10:47 crc kubenswrapper[4922]: I1122 03:10:47.974696 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdzpr\" (UniqueName: \"kubernetes.io/projected/06c56a5a-f992-43bb-bcd0-15f23d824242-kube-api-access-pdzpr\") pod \"06c56a5a-f992-43bb-bcd0-15f23d824242\" (UID: \"06c56a5a-f992-43bb-bcd0-15f23d824242\") " Nov 22 03:10:47 crc kubenswrapper[4922]: I1122 03:10:47.974793 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/06c56a5a-f992-43bb-bcd0-15f23d824242-config\") pod \"06c56a5a-f992-43bb-bcd0-15f23d824242\" (UID: \"06c56a5a-f992-43bb-bcd0-15f23d824242\") " Nov 22 03:10:47 crc kubenswrapper[4922]: I1122 03:10:47.996132 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06c56a5a-f992-43bb-bcd0-15f23d824242-kube-api-access-pdzpr" (OuterVolumeSpecName: "kube-api-access-pdzpr") pod "06c56a5a-f992-43bb-bcd0-15f23d824242" (UID: "06c56a5a-f992-43bb-bcd0-15f23d824242"). InnerVolumeSpecName "kube-api-access-pdzpr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:10:48 crc kubenswrapper[4922]: I1122 03:10:48.034593 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06c56a5a-f992-43bb-bcd0-15f23d824242-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "06c56a5a-f992-43bb-bcd0-15f23d824242" (UID: "06c56a5a-f992-43bb-bcd0-15f23d824242"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:10:48 crc kubenswrapper[4922]: I1122 03:10:48.035756 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06c56a5a-f992-43bb-bcd0-15f23d824242-config" (OuterVolumeSpecName: "config") pod "06c56a5a-f992-43bb-bcd0-15f23d824242" (UID: "06c56a5a-f992-43bb-bcd0-15f23d824242"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:10:48 crc kubenswrapper[4922]: I1122 03:10:48.076431 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/06c56a5a-f992-43bb-bcd0-15f23d824242-config\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:48 crc kubenswrapper[4922]: I1122 03:10:48.076472 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06c56a5a-f992-43bb-bcd0-15f23d824242-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:48 crc kubenswrapper[4922]: I1122 03:10:48.076487 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdzpr\" (UniqueName: \"kubernetes.io/projected/06c56a5a-f992-43bb-bcd0-15f23d824242-kube-api-access-pdzpr\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:48 crc kubenswrapper[4922]: I1122 03:10:48.193506 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-ml4m5" Nov 22 03:10:48 crc kubenswrapper[4922]: I1122 03:10:48.279241 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9257c931-7544-4f45-9589-e735e55bcca2-config\") pod \"9257c931-7544-4f45-9589-e735e55bcca2\" (UID: \"9257c931-7544-4f45-9589-e735e55bcca2\") " Nov 22 03:10:48 crc kubenswrapper[4922]: I1122 03:10:48.279345 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfw7j\" (UniqueName: \"kubernetes.io/projected/9257c931-7544-4f45-9589-e735e55bcca2-kube-api-access-gfw7j\") pod \"9257c931-7544-4f45-9589-e735e55bcca2\" (UID: \"9257c931-7544-4f45-9589-e735e55bcca2\") " Nov 22 03:10:48 crc kubenswrapper[4922]: I1122 03:10:48.279397 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9257c931-7544-4f45-9589-e735e55bcca2-ovsdbserver-nb\") pod \"9257c931-7544-4f45-9589-e735e55bcca2\" (UID: \"9257c931-7544-4f45-9589-e735e55bcca2\") " Nov 22 03:10:48 crc kubenswrapper[4922]: I1122 03:10:48.279422 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9257c931-7544-4f45-9589-e735e55bcca2-ovsdbserver-sb\") pod \"9257c931-7544-4f45-9589-e735e55bcca2\" (UID: \"9257c931-7544-4f45-9589-e735e55bcca2\") " Nov 22 03:10:48 crc kubenswrapper[4922]: I1122 03:10:48.279486 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9257c931-7544-4f45-9589-e735e55bcca2-dns-svc\") pod \"9257c931-7544-4f45-9589-e735e55bcca2\" (UID: \"9257c931-7544-4f45-9589-e735e55bcca2\") " Nov 22 03:10:48 crc kubenswrapper[4922]: I1122 03:10:48.289262 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9257c931-7544-4f45-9589-e735e55bcca2-kube-api-access-gfw7j" (OuterVolumeSpecName: "kube-api-access-gfw7j") pod "9257c931-7544-4f45-9589-e735e55bcca2" (UID: 
"9257c931-7544-4f45-9589-e735e55bcca2"). InnerVolumeSpecName "kube-api-access-gfw7j". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:10:48 crc kubenswrapper[4922]: I1122 03:10:48.364370 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:10:48 crc kubenswrapper[4922]: I1122 03:10:48.364869 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4169244c-c975-4716-a7f1-53bf5c0dafe6" containerName="ceilometer-central-agent" containerID="cri-o://8cdbee1e9de52dd5ff6b7cae597cee93d0f78826a24525c8ba7d48eaf51dbf56" gracePeriod=30 Nov 22 03:10:48 crc kubenswrapper[4922]: I1122 03:10:48.365491 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4169244c-c975-4716-a7f1-53bf5c0dafe6" containerName="proxy-httpd" containerID="cri-o://4e3bfc32852ccd954b6ac2c11c64a61deac8d35642aa3d5587b1f0c4d6a396a7" gracePeriod=30 Nov 22 03:10:48 crc kubenswrapper[4922]: I1122 03:10:48.365547 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4169244c-c975-4716-a7f1-53bf5c0dafe6" containerName="sg-core" containerID="cri-o://27aea154443df043d91fded0cd22b21faee6a9cd9ede92d80ad1e5366d272bfc" gracePeriod=30 Nov 22 03:10:48 crc kubenswrapper[4922]: I1122 03:10:48.365579 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4169244c-c975-4716-a7f1-53bf5c0dafe6" containerName="ceilometer-notification-agent" containerID="cri-o://e678fba9c22764baad2f0b3dab1c3f45fe6a8987c0b86bb04c111f90c493dda9" gracePeriod=30 Nov 22 03:10:48 crc kubenswrapper[4922]: I1122 03:10:48.368397 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9257c931-7544-4f45-9589-e735e55bcca2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9257c931-7544-4f45-9589-e735e55bcca2" (UID: "9257c931-7544-4f45-9589-e735e55bcca2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:10:48 crc kubenswrapper[4922]: I1122 03:10:48.372642 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9257c931-7544-4f45-9589-e735e55bcca2-config" (OuterVolumeSpecName: "config") pod "9257c931-7544-4f45-9589-e735e55bcca2" (UID: "9257c931-7544-4f45-9589-e735e55bcca2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:10:48 crc kubenswrapper[4922]: I1122 03:10:48.375801 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9257c931-7544-4f45-9589-e735e55bcca2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9257c931-7544-4f45-9589-e735e55bcca2" (UID: "9257c931-7544-4f45-9589-e735e55bcca2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:10:48 crc kubenswrapper[4922]: I1122 03:10:48.376381 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9257c931-7544-4f45-9589-e735e55bcca2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9257c931-7544-4f45-9589-e735e55bcca2" (UID: "9257c931-7544-4f45-9589-e735e55bcca2"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:10:48 crc kubenswrapper[4922]: I1122 03:10:48.378153 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Nov 22 03:10:48 crc kubenswrapper[4922]: I1122 03:10:48.386545 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9257c931-7544-4f45-9589-e735e55bcca2-config\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:48 crc kubenswrapper[4922]: I1122 03:10:48.386746 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfw7j\" (UniqueName: \"kubernetes.io/projected/9257c931-7544-4f45-9589-e735e55bcca2-kube-api-access-gfw7j\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:48 crc kubenswrapper[4922]: I1122 03:10:48.386835 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9257c931-7544-4f45-9589-e735e55bcca2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:48 crc kubenswrapper[4922]: I1122 03:10:48.386910 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9257c931-7544-4f45-9589-e735e55bcca2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:48 crc kubenswrapper[4922]: I1122 03:10:48.386965 4922 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9257c931-7544-4f45-9589-e735e55bcca2-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:48 crc kubenswrapper[4922]: I1122 03:10:48.462652 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-r2hv4"] Nov 22 03:10:48 crc kubenswrapper[4922]: I1122 03:10:48.472505 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-8xvxh"] Nov 22 03:10:48 crc kubenswrapper[4922]: W1122 03:10:48.488202 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1f5c234_6ae0_451b_bdca_3d6ba5f3a003.slice/crio-d3ef86a4f0edd456ab863a75ffa6482ba262f7ccb6d889c7d9a46ecc88528d5f WatchSource:0}: Error finding container d3ef86a4f0edd456ab863a75ffa6482ba262f7ccb6d889c7d9a46ecc88528d5f: Status 404 returned error can't find the container with id d3ef86a4f0edd456ab863a75ffa6482ba262f7ccb6d889c7d9a46ecc88528d5f Nov 22 03:10:48 crc kubenswrapper[4922]: I1122 03:10:48.494937 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-n89zb"] Nov 22 03:10:48 crc kubenswrapper[4922]: I1122 03:10:48.499646 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"8d23126d-97a4-4ed0-a589-0ef607e832ed","Type":"ContainerStarted","Data":"5f5fd2ab7465c1ab4c0333029c5f6474f7c2e1122107eba551ec969a42428414"} Nov 22 03:10:48 crc kubenswrapper[4922]: I1122 03:10:48.511702 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-n6frp" event={"ID":"06c56a5a-f992-43bb-bcd0-15f23d824242","Type":"ContainerDied","Data":"1c0f1eb3efd10daf68cf9542ad9b19f15cf40a7fddaa8f1be85c511768c46140"} Nov 22 03:10:48 crc kubenswrapper[4922]: I1122 03:10:48.511973 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c0f1eb3efd10daf68cf9542ad9b19f15cf40a7fddaa8f1be85c511768c46140" Nov 22 03:10:48 crc kubenswrapper[4922]: I1122 03:10:48.512041 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-n6frp" Nov 22 03:10:48 crc kubenswrapper[4922]: I1122 03:10:48.519859 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.872844196 podStartE2EDuration="13.519824405s" podCreationTimestamp="2025-11-22 03:10:35 +0000 UTC" firstStartedPulling="2025-11-22 03:10:36.230237984 +0000 UTC m=+1072.268759886" lastFinishedPulling="2025-11-22 03:10:47.877218213 +0000 UTC m=+1083.915740095" observedRunningTime="2025-11-22 03:10:48.517166643 +0000 UTC m=+1084.555688535" watchObservedRunningTime="2025-11-22 03:10:48.519824405 +0000 UTC m=+1084.558346297" Nov 22 03:10:48 crc kubenswrapper[4922]: I1122 03:10:48.537563 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-ml4m5" event={"ID":"9257c931-7544-4f45-9589-e735e55bcca2","Type":"ContainerDied","Data":"2f8e55a16367ecd0b3b86635bc6ceb18369593821adf8bc6ffe259febb45bfe2"} Nov 22 03:10:48 crc kubenswrapper[4922]: I1122 03:10:48.537648 4922 scope.go:117] "RemoveContainer" containerID="e409962e9a0bec937cd7dcaf7f7f301719dfeb171b5ae9dea8c6f1369628d036" Nov 22 03:10:48 crc kubenswrapper[4922]: I1122 03:10:48.537828 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-ml4m5" Nov 22 03:10:48 crc kubenswrapper[4922]: I1122 03:10:48.552713 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-r2hv4" event={"ID":"48b2290b-756b-4cbb-b1c1-19f32bdd6358","Type":"ContainerStarted","Data":"848a5c652a2980bcbaa9650284b1904d0dbb46df4c83f8074b6be13e2686f344"} Nov 22 03:10:48 crc kubenswrapper[4922]: I1122 03:10:48.605274 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-ml4m5"] Nov 22 03:10:48 crc kubenswrapper[4922]: I1122 03:10:48.609708 4922 scope.go:117] "RemoveContainer" containerID="b87b0217af83e867eaf23471484bd500ff4aad293958784ad10e7484611da195" Nov 22 03:10:48 crc kubenswrapper[4922]: I1122 03:10:48.621329 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-ml4m5"] Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.146432 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-j5v97"] Nov 22 03:10:49 crc kubenswrapper[4922]: E1122 03:10:49.147016 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9257c931-7544-4f45-9589-e735e55bcca2" containerName="init" Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.147030 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="9257c931-7544-4f45-9589-e735e55bcca2" containerName="init" Nov 22 03:10:49 crc kubenswrapper[4922]: E1122 03:10:49.147055 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9257c931-7544-4f45-9589-e735e55bcca2" containerName="dnsmasq-dns" Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.147061 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="9257c931-7544-4f45-9589-e735e55bcca2" containerName="dnsmasq-dns" Nov 22 03:10:49 crc kubenswrapper[4922]: E1122 03:10:49.147081 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06c56a5a-f992-43bb-bcd0-15f23d824242" containerName="neutron-db-sync" Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.147088 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="06c56a5a-f992-43bb-bcd0-15f23d824242" containerName="neutron-db-sync" Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 
03:10:49.147228 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="06c56a5a-f992-43bb-bcd0-15f23d824242" containerName="neutron-db-sync" Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.147241 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="9257c931-7544-4f45-9589-e735e55bcca2" containerName="dnsmasq-dns" Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.155590 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-d4854c55d-lxcp4" Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.156330 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-j5v97" Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.170433 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-j5v97"] Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.206497 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-f7d8bcbcb-r2n98"] Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.212680 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f7d8bcbcb-r2n98" Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.216015 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.216071 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.216211 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-nztrr" Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.216256 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.229862 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f7d8bcbcb-r2n98"] Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.259482 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d1e5299-5db6-4251-ad4a-c5a1a137035a-config\") pod \"dnsmasq-dns-6bb684768f-j5v97\" (UID: \"0d1e5299-5db6-4251-ad4a-c5a1a137035a\") " pod="openstack/dnsmasq-dns-6bb684768f-j5v97" Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.259527 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d1e5299-5db6-4251-ad4a-c5a1a137035a-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb684768f-j5v97\" (UID: \"0d1e5299-5db6-4251-ad4a-c5a1a137035a\") " pod="openstack/dnsmasq-dns-6bb684768f-j5v97" Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.259554 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d1e5299-5db6-4251-ad4a-c5a1a137035a-dns-svc\") pod \"dnsmasq-dns-6bb684768f-j5v97\" (UID: \"0d1e5299-5db6-4251-ad4a-c5a1a137035a\") " pod="openstack/dnsmasq-dns-6bb684768f-j5v97" Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.259611 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwv8q\" (UniqueName: \"kubernetes.io/projected/0d1e5299-5db6-4251-ad4a-c5a1a137035a-kube-api-access-pwv8q\") pod 
\"dnsmasq-dns-6bb684768f-j5v97\" (UID: \"0d1e5299-5db6-4251-ad4a-c5a1a137035a\") " pod="openstack/dnsmasq-dns-6bb684768f-j5v97" Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.259751 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d1e5299-5db6-4251-ad4a-c5a1a137035a-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb684768f-j5v97\" (UID: \"0d1e5299-5db6-4251-ad4a-c5a1a137035a\") " pod="openstack/dnsmasq-dns-6bb684768f-j5v97" Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.319799 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9257c931-7544-4f45-9589-e735e55bcca2" path="/var/lib/kubelet/pods/9257c931-7544-4f45-9589-e735e55bcca2/volumes" Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.360980 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d1e5299-5db6-4251-ad4a-c5a1a137035a-config\") pod \"dnsmasq-dns-6bb684768f-j5v97\" (UID: \"0d1e5299-5db6-4251-ad4a-c5a1a137035a\") " pod="openstack/dnsmasq-dns-6bb684768f-j5v97" Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.361044 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d1e5299-5db6-4251-ad4a-c5a1a137035a-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb684768f-j5v97\" (UID: \"0d1e5299-5db6-4251-ad4a-c5a1a137035a\") " pod="openstack/dnsmasq-dns-6bb684768f-j5v97" Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.361248 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d1e5299-5db6-4251-ad4a-c5a1a137035a-dns-svc\") pod \"dnsmasq-dns-6bb684768f-j5v97\" (UID: \"0d1e5299-5db6-4251-ad4a-c5a1a137035a\") " pod="openstack/dnsmasq-dns-6bb684768f-j5v97" Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.361288 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36feb788-6145-4d72-b9b2-93a3557704b4-combined-ca-bundle\") pod \"neutron-f7d8bcbcb-r2n98\" (UID: \"36feb788-6145-4d72-b9b2-93a3557704b4\") " pod="openstack/neutron-f7d8bcbcb-r2n98" Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.361978 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d1e5299-5db6-4251-ad4a-c5a1a137035a-config\") pod \"dnsmasq-dns-6bb684768f-j5v97\" (UID: \"0d1e5299-5db6-4251-ad4a-c5a1a137035a\") " pod="openstack/dnsmasq-dns-6bb684768f-j5v97" Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.362113 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d1e5299-5db6-4251-ad4a-c5a1a137035a-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb684768f-j5v97\" (UID: \"0d1e5299-5db6-4251-ad4a-c5a1a137035a\") " pod="openstack/dnsmasq-dns-6bb684768f-j5v97" Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.362189 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/36feb788-6145-4d72-b9b2-93a3557704b4-config\") pod \"neutron-f7d8bcbcb-r2n98\" (UID: \"36feb788-6145-4d72-b9b2-93a3557704b4\") " pod="openstack/neutron-f7d8bcbcb-r2n98" Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.362497 4922 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d1e5299-5db6-4251-ad4a-c5a1a137035a-dns-svc\") pod \"dnsmasq-dns-6bb684768f-j5v97\" (UID: \"0d1e5299-5db6-4251-ad4a-c5a1a137035a\") " pod="openstack/dnsmasq-dns-6bb684768f-j5v97" Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.362226 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwv8q\" (UniqueName: \"kubernetes.io/projected/0d1e5299-5db6-4251-ad4a-c5a1a137035a-kube-api-access-pwv8q\") pod \"dnsmasq-dns-6bb684768f-j5v97\" (UID: \"0d1e5299-5db6-4251-ad4a-c5a1a137035a\") " pod="openstack/dnsmasq-dns-6bb684768f-j5v97" Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.362779 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/36feb788-6145-4d72-b9b2-93a3557704b4-ovndb-tls-certs\") pod \"neutron-f7d8bcbcb-r2n98\" (UID: \"36feb788-6145-4d72-b9b2-93a3557704b4\") " pod="openstack/neutron-f7d8bcbcb-r2n98" Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.362905 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/36feb788-6145-4d72-b9b2-93a3557704b4-httpd-config\") pod \"neutron-f7d8bcbcb-r2n98\" (UID: \"36feb788-6145-4d72-b9b2-93a3557704b4\") " pod="openstack/neutron-f7d8bcbcb-r2n98" Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.362954 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d1e5299-5db6-4251-ad4a-c5a1a137035a-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb684768f-j5v97\" (UID: \"0d1e5299-5db6-4251-ad4a-c5a1a137035a\") " pod="openstack/dnsmasq-dns-6bb684768f-j5v97" Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.363603 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcs28\" (UniqueName: \"kubernetes.io/projected/36feb788-6145-4d72-b9b2-93a3557704b4-kube-api-access-bcs28\") pod \"neutron-f7d8bcbcb-r2n98\" (UID: \"36feb788-6145-4d72-b9b2-93a3557704b4\") " pod="openstack/neutron-f7d8bcbcb-r2n98" Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.363554 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d1e5299-5db6-4251-ad4a-c5a1a137035a-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb684768f-j5v97\" (UID: \"0d1e5299-5db6-4251-ad4a-c5a1a137035a\") " pod="openstack/dnsmasq-dns-6bb684768f-j5v97" Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.381128 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwv8q\" (UniqueName: \"kubernetes.io/projected/0d1e5299-5db6-4251-ad4a-c5a1a137035a-kube-api-access-pwv8q\") pod \"dnsmasq-dns-6bb684768f-j5v97\" (UID: \"0d1e5299-5db6-4251-ad4a-c5a1a137035a\") " pod="openstack/dnsmasq-dns-6bb684768f-j5v97" Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.464572 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36feb788-6145-4d72-b9b2-93a3557704b4-combined-ca-bundle\") pod \"neutron-f7d8bcbcb-r2n98\" (UID: \"36feb788-6145-4d72-b9b2-93a3557704b4\") " pod="openstack/neutron-f7d8bcbcb-r2n98" Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.465884 4922 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/secret/36feb788-6145-4d72-b9b2-93a3557704b4-config\") pod \"neutron-f7d8bcbcb-r2n98\" (UID: \"36feb788-6145-4d72-b9b2-93a3557704b4\") " pod="openstack/neutron-f7d8bcbcb-r2n98" Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.466264 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/36feb788-6145-4d72-b9b2-93a3557704b4-ovndb-tls-certs\") pod \"neutron-f7d8bcbcb-r2n98\" (UID: \"36feb788-6145-4d72-b9b2-93a3557704b4\") " pod="openstack/neutron-f7d8bcbcb-r2n98" Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.466345 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/36feb788-6145-4d72-b9b2-93a3557704b4-httpd-config\") pod \"neutron-f7d8bcbcb-r2n98\" (UID: \"36feb788-6145-4d72-b9b2-93a3557704b4\") " pod="openstack/neutron-f7d8bcbcb-r2n98" Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.466391 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcs28\" (UniqueName: \"kubernetes.io/projected/36feb788-6145-4d72-b9b2-93a3557704b4-kube-api-access-bcs28\") pod \"neutron-f7d8bcbcb-r2n98\" (UID: \"36feb788-6145-4d72-b9b2-93a3557704b4\") " pod="openstack/neutron-f7d8bcbcb-r2n98" Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.471082 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36feb788-6145-4d72-b9b2-93a3557704b4-combined-ca-bundle\") pod \"neutron-f7d8bcbcb-r2n98\" (UID: \"36feb788-6145-4d72-b9b2-93a3557704b4\") " pod="openstack/neutron-f7d8bcbcb-r2n98" Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.474618 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/36feb788-6145-4d72-b9b2-93a3557704b4-ovndb-tls-certs\") pod \"neutron-f7d8bcbcb-r2n98\" (UID: \"36feb788-6145-4d72-b9b2-93a3557704b4\") " pod="openstack/neutron-f7d8bcbcb-r2n98" Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.483778 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/36feb788-6145-4d72-b9b2-93a3557704b4-httpd-config\") pod \"neutron-f7d8bcbcb-r2n98\" (UID: \"36feb788-6145-4d72-b9b2-93a3557704b4\") " pod="openstack/neutron-f7d8bcbcb-r2n98" Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.484422 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/36feb788-6145-4d72-b9b2-93a3557704b4-config\") pod \"neutron-f7d8bcbcb-r2n98\" (UID: \"36feb788-6145-4d72-b9b2-93a3557704b4\") " pod="openstack/neutron-f7d8bcbcb-r2n98" Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.489572 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcs28\" (UniqueName: \"kubernetes.io/projected/36feb788-6145-4d72-b9b2-93a3557704b4-kube-api-access-bcs28\") pod \"neutron-f7d8bcbcb-r2n98\" (UID: \"36feb788-6145-4d72-b9b2-93a3557704b4\") " pod="openstack/neutron-f7d8bcbcb-r2n98" Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.496215 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-d4854c55d-lxcp4" Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.502390 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-j5v97" Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.566097 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f7d8bcbcb-r2n98" Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.605283 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-xgvlf" event={"ID":"f256e75d-5ff4-4804-bbe6-058ef24fab04","Type":"ContainerStarted","Data":"f33b53f139dbde89a912e9be4386086bd325c48d41bd800a6e12e7136709eb46"} Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.621251 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-n89zb" event={"ID":"f1f5c234-6ae0-451b-bdca-3d6ba5f3a003","Type":"ContainerStarted","Data":"1dc2dd7a65399cf1389ecb4193671302a4e8054c71d779ca62ba14a2803d2a12"} Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.621331 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-n89zb" event={"ID":"f1f5c234-6ae0-451b-bdca-3d6ba5f3a003","Type":"ContainerStarted","Data":"d3ef86a4f0edd456ab863a75ffa6482ba262f7ccb6d889c7d9a46ecc88528d5f"} Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.654716 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-r2hv4" event={"ID":"48b2290b-756b-4cbb-b1c1-19f32bdd6358","Type":"ContainerStarted","Data":"e3bc6067184f3efb2a84330cb75a18afe157fb2750e59786630af62bfcae138b"} Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.706580 4922 generic.go:334] "Generic (PLEG): container finished" podID="4169244c-c975-4716-a7f1-53bf5c0dafe6" containerID="4e3bfc32852ccd954b6ac2c11c64a61deac8d35642aa3d5587b1f0c4d6a396a7" exitCode=0 Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.709588 4922 generic.go:334] "Generic (PLEG): container finished" podID="4169244c-c975-4716-a7f1-53bf5c0dafe6" containerID="27aea154443df043d91fded0cd22b21faee6a9cd9ede92d80ad1e5366d272bfc" exitCode=2 Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.709607 4922 generic.go:334] "Generic (PLEG): container finished" podID="4169244c-c975-4716-a7f1-53bf5c0dafe6" containerID="8cdbee1e9de52dd5ff6b7cae597cee93d0f78826a24525c8ba7d48eaf51dbf56" exitCode=0 Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.709773 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4169244c-c975-4716-a7f1-53bf5c0dafe6","Type":"ContainerDied","Data":"4e3bfc32852ccd954b6ac2c11c64a61deac8d35642aa3d5587b1f0c4d6a396a7"} Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.709808 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4169244c-c975-4716-a7f1-53bf5c0dafe6","Type":"ContainerDied","Data":"27aea154443df043d91fded0cd22b21faee6a9cd9ede92d80ad1e5366d272bfc"} Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.709818 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4169244c-c975-4716-a7f1-53bf5c0dafe6","Type":"ContainerDied","Data":"8cdbee1e9de52dd5ff6b7cae597cee93d0f78826a24525c8ba7d48eaf51dbf56"} Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.725668 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-8xvxh" event={"ID":"9d9bda50-78a2-45bf-a92a-f1085209b972","Type":"ContainerStarted","Data":"4302cfd328a00636597ce7a7e73f4d00264193f316be0e7e3972893870af3c7f"} Nov 22 03:10:49 crc kubenswrapper[4922]: I1122 03:10:49.725700 4922 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-8xvxh" event={"ID":"9d9bda50-78a2-45bf-a92a-f1085209b972","Type":"ContainerStarted","Data":"fb0f94c22a563dbaebe3772524e4127daa21b31cbf970a11ae2fa3d4511346cb"} Nov 22 03:10:50 crc kubenswrapper[4922]: I1122 03:10:50.175990 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-j5v97"] Nov 22 03:10:50 crc kubenswrapper[4922]: W1122 03:10:50.184717 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d1e5299_5db6_4251_ad4a_c5a1a137035a.slice/crio-d7d8323ddcf9e785723ca7869a60f5d16a45800adffdf5c51ed7a35db6f21484 WatchSource:0}: Error finding container d7d8323ddcf9e785723ca7869a60f5d16a45800adffdf5c51ed7a35db6f21484: Status 404 returned error can't find the container with id d7d8323ddcf9e785723ca7869a60f5d16a45800adffdf5c51ed7a35db6f21484 Nov 22 03:10:50 crc kubenswrapper[4922]: I1122 03:10:50.724428 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f7d8bcbcb-r2n98"] Nov 22 03:10:50 crc kubenswrapper[4922]: W1122 03:10:50.724833 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36feb788_6145_4d72_b9b2_93a3557704b4.slice/crio-8e66c307ac52c33adef1f5de4a52b8e87c210833524826b874fec3fa011179f8 WatchSource:0}: Error finding container 8e66c307ac52c33adef1f5de4a52b8e87c210833524826b874fec3fa011179f8: Status 404 returned error can't find the container with id 8e66c307ac52c33adef1f5de4a52b8e87c210833524826b874fec3fa011179f8 Nov 22 03:10:50 crc kubenswrapper[4922]: I1122 03:10:50.759399 4922 generic.go:334] "Generic (PLEG): container finished" podID="9d9bda50-78a2-45bf-a92a-f1085209b972" containerID="4302cfd328a00636597ce7a7e73f4d00264193f316be0e7e3972893870af3c7f" exitCode=0 Nov 22 03:10:50 crc kubenswrapper[4922]: I1122 03:10:50.760016 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-8xvxh" event={"ID":"9d9bda50-78a2-45bf-a92a-f1085209b972","Type":"ContainerDied","Data":"4302cfd328a00636597ce7a7e73f4d00264193f316be0e7e3972893870af3c7f"} Nov 22 03:10:50 crc kubenswrapper[4922]: I1122 03:10:50.773912 4922 generic.go:334] "Generic (PLEG): container finished" podID="f1f5c234-6ae0-451b-bdca-3d6ba5f3a003" containerID="1dc2dd7a65399cf1389ecb4193671302a4e8054c71d779ca62ba14a2803d2a12" exitCode=0 Nov 22 03:10:50 crc kubenswrapper[4922]: I1122 03:10:50.774019 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-n89zb" event={"ID":"f1f5c234-6ae0-451b-bdca-3d6ba5f3a003","Type":"ContainerDied","Data":"1dc2dd7a65399cf1389ecb4193671302a4e8054c71d779ca62ba14a2803d2a12"} Nov 22 03:10:50 crc kubenswrapper[4922]: I1122 03:10:50.784811 4922 generic.go:334] "Generic (PLEG): container finished" podID="48b2290b-756b-4cbb-b1c1-19f32bdd6358" containerID="e3bc6067184f3efb2a84330cb75a18afe157fb2750e59786630af62bfcae138b" exitCode=0 Nov 22 03:10:50 crc kubenswrapper[4922]: I1122 03:10:50.784933 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-r2hv4" event={"ID":"48b2290b-756b-4cbb-b1c1-19f32bdd6358","Type":"ContainerDied","Data":"e3bc6067184f3efb2a84330cb75a18afe157fb2750e59786630af62bfcae138b"} Nov 22 03:10:50 crc kubenswrapper[4922]: I1122 03:10:50.792761 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-j5v97" 
event={"ID":"0d1e5299-5db6-4251-ad4a-c5a1a137035a","Type":"ContainerDied","Data":"73bd7ccd44b5df7409f75a41cb58148156c596696fc2a130b67d8a622dc1b9bf"} Nov 22 03:10:50 crc kubenswrapper[4922]: I1122 03:10:50.802925 4922 generic.go:334] "Generic (PLEG): container finished" podID="0d1e5299-5db6-4251-ad4a-c5a1a137035a" containerID="73bd7ccd44b5df7409f75a41cb58148156c596696fc2a130b67d8a622dc1b9bf" exitCode=0 Nov 22 03:10:50 crc kubenswrapper[4922]: I1122 03:10:50.809898 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-j5v97" event={"ID":"0d1e5299-5db6-4251-ad4a-c5a1a137035a","Type":"ContainerStarted","Data":"d7d8323ddcf9e785723ca7869a60f5d16a45800adffdf5c51ed7a35db6f21484"} Nov 22 03:10:51 crc kubenswrapper[4922]: I1122 03:10:51.773169 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-65c4db8978-gcb6d" Nov 22 03:10:51 crc kubenswrapper[4922]: I1122 03:10:51.799205 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-xgvlf" podStartSLOduration=5.253513822 podStartE2EDuration="56.799189825s" podCreationTimestamp="2025-11-22 03:09:55 +0000 UTC" firstStartedPulling="2025-11-22 03:09:56.556054071 +0000 UTC m=+1032.594575963" lastFinishedPulling="2025-11-22 03:10:48.101730074 +0000 UTC m=+1084.140251966" observedRunningTime="2025-11-22 03:10:50.872085942 +0000 UTC m=+1086.910607824" watchObservedRunningTime="2025-11-22 03:10:51.799189825 +0000 UTC m=+1087.837711717" Nov 22 03:10:51 crc kubenswrapper[4922]: I1122 03:10:51.819505 4922 generic.go:334] "Generic (PLEG): container finished" podID="4169244c-c975-4716-a7f1-53bf5c0dafe6" containerID="e678fba9c22764baad2f0b3dab1c3f45fe6a8987c0b86bb04c111f90c493dda9" exitCode=0 Nov 22 03:10:51 crc kubenswrapper[4922]: I1122 03:10:51.819561 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4169244c-c975-4716-a7f1-53bf5c0dafe6","Type":"ContainerDied","Data":"e678fba9c22764baad2f0b3dab1c3f45fe6a8987c0b86bb04c111f90c493dda9"} Nov 22 03:10:51 crc kubenswrapper[4922]: I1122 03:10:51.823177 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f7d8bcbcb-r2n98" event={"ID":"36feb788-6145-4d72-b9b2-93a3557704b4","Type":"ContainerStarted","Data":"4539b8421ec795772c4e11556c6d2b3032d0278527c4976eb1c1092a96a32a4e"} Nov 22 03:10:51 crc kubenswrapper[4922]: I1122 03:10:51.823214 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f7d8bcbcb-r2n98" event={"ID":"36feb788-6145-4d72-b9b2-93a3557704b4","Type":"ContainerStarted","Data":"cf91d2200e85eb82afcdd14f2637345a4eea69b297682bfff35a58a992b5c444"} Nov 22 03:10:51 crc kubenswrapper[4922]: I1122 03:10:51.823223 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f7d8bcbcb-r2n98" event={"ID":"36feb788-6145-4d72-b9b2-93a3557704b4","Type":"ContainerStarted","Data":"8e66c307ac52c33adef1f5de4a52b8e87c210833524826b874fec3fa011179f8"} Nov 22 03:10:51 crc kubenswrapper[4922]: I1122 03:10:51.823537 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-f7d8bcbcb-r2n98" Nov 22 03:10:51 crc kubenswrapper[4922]: I1122 03:10:51.832033 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-j5v97" event={"ID":"0d1e5299-5db6-4251-ad4a-c5a1a137035a","Type":"ContainerStarted","Data":"2f37fe6e35bfeaa4d51267961141ddf34c899a6c7572c91aab8109a10a76d551"} Nov 22 03:10:51 crc kubenswrapper[4922]: I1122 03:10:51.832069 
4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bb684768f-j5v97" Nov 22 03:10:51 crc kubenswrapper[4922]: I1122 03:10:51.885722 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bb684768f-j5v97" podStartSLOduration=2.885707118 podStartE2EDuration="2.885707118s" podCreationTimestamp="2025-11-22 03:10:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:10:51.882563633 +0000 UTC m=+1087.921085525" watchObservedRunningTime="2025-11-22 03:10:51.885707118 +0000 UTC m=+1087.924229010" Nov 22 03:10:51 crc kubenswrapper[4922]: I1122 03:10:51.887825 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-f7d8bcbcb-r2n98" podStartSLOduration=2.887818959 podStartE2EDuration="2.887818959s" podCreationTimestamp="2025-11-22 03:10:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:10:51.848824725 +0000 UTC m=+1087.887346637" watchObservedRunningTime="2025-11-22 03:10:51.887818959 +0000 UTC m=+1087.926340851" Nov 22 03:10:51 crc kubenswrapper[4922]: I1122 03:10:51.944346 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-655bcfccf7-54vbt"] Nov 22 03:10:51 crc kubenswrapper[4922]: I1122 03:10:51.947083 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-655bcfccf7-54vbt" Nov 22 03:10:51 crc kubenswrapper[4922]: I1122 03:10:51.962045 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Nov 22 03:10:51 crc kubenswrapper[4922]: I1122 03:10:51.962335 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Nov 22 03:10:51 crc kubenswrapper[4922]: I1122 03:10:51.965153 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-655bcfccf7-54vbt"] Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.026720 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb592899-6bd7-4d6b-a54d-132c5166df85-ovndb-tls-certs\") pod \"neutron-655bcfccf7-54vbt\" (UID: \"bb592899-6bd7-4d6b-a54d-132c5166df85\") " pod="openstack/neutron-655bcfccf7-54vbt" Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.026787 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb592899-6bd7-4d6b-a54d-132c5166df85-public-tls-certs\") pod \"neutron-655bcfccf7-54vbt\" (UID: \"bb592899-6bd7-4d6b-a54d-132c5166df85\") " pod="openstack/neutron-655bcfccf7-54vbt" Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.026835 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bb592899-6bd7-4d6b-a54d-132c5166df85-config\") pod \"neutron-655bcfccf7-54vbt\" (UID: \"bb592899-6bd7-4d6b-a54d-132c5166df85\") " pod="openstack/neutron-655bcfccf7-54vbt" Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.026889 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb592899-6bd7-4d6b-a54d-132c5166df85-combined-ca-bundle\") pod 
\"neutron-655bcfccf7-54vbt\" (UID: \"bb592899-6bd7-4d6b-a54d-132c5166df85\") " pod="openstack/neutron-655bcfccf7-54vbt" Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.027714 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpcfc\" (UniqueName: \"kubernetes.io/projected/bb592899-6bd7-4d6b-a54d-132c5166df85-kube-api-access-qpcfc\") pod \"neutron-655bcfccf7-54vbt\" (UID: \"bb592899-6bd7-4d6b-a54d-132c5166df85\") " pod="openstack/neutron-655bcfccf7-54vbt" Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.027737 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bb592899-6bd7-4d6b-a54d-132c5166df85-httpd-config\") pod \"neutron-655bcfccf7-54vbt\" (UID: \"bb592899-6bd7-4d6b-a54d-132c5166df85\") " pod="openstack/neutron-655bcfccf7-54vbt" Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.027754 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb592899-6bd7-4d6b-a54d-132c5166df85-internal-tls-certs\") pod \"neutron-655bcfccf7-54vbt\" (UID: \"bb592899-6bd7-4d6b-a54d-132c5166df85\") " pod="openstack/neutron-655bcfccf7-54vbt" Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.089171 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-65c4db8978-gcb6d" Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.131002 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpcfc\" (UniqueName: \"kubernetes.io/projected/bb592899-6bd7-4d6b-a54d-132c5166df85-kube-api-access-qpcfc\") pod \"neutron-655bcfccf7-54vbt\" (UID: \"bb592899-6bd7-4d6b-a54d-132c5166df85\") " pod="openstack/neutron-655bcfccf7-54vbt" Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.131043 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bb592899-6bd7-4d6b-a54d-132c5166df85-httpd-config\") pod \"neutron-655bcfccf7-54vbt\" (UID: \"bb592899-6bd7-4d6b-a54d-132c5166df85\") " pod="openstack/neutron-655bcfccf7-54vbt" Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.131067 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb592899-6bd7-4d6b-a54d-132c5166df85-internal-tls-certs\") pod \"neutron-655bcfccf7-54vbt\" (UID: \"bb592899-6bd7-4d6b-a54d-132c5166df85\") " pod="openstack/neutron-655bcfccf7-54vbt" Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.131104 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb592899-6bd7-4d6b-a54d-132c5166df85-ovndb-tls-certs\") pod \"neutron-655bcfccf7-54vbt\" (UID: \"bb592899-6bd7-4d6b-a54d-132c5166df85\") " pod="openstack/neutron-655bcfccf7-54vbt" Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.131138 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb592899-6bd7-4d6b-a54d-132c5166df85-public-tls-certs\") pod \"neutron-655bcfccf7-54vbt\" (UID: \"bb592899-6bd7-4d6b-a54d-132c5166df85\") " pod="openstack/neutron-655bcfccf7-54vbt" Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.131183 4922 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bb592899-6bd7-4d6b-a54d-132c5166df85-config\") pod \"neutron-655bcfccf7-54vbt\" (UID: \"bb592899-6bd7-4d6b-a54d-132c5166df85\") " pod="openstack/neutron-655bcfccf7-54vbt" Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.131206 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb592899-6bd7-4d6b-a54d-132c5166df85-combined-ca-bundle\") pod \"neutron-655bcfccf7-54vbt\" (UID: \"bb592899-6bd7-4d6b-a54d-132c5166df85\") " pod="openstack/neutron-655bcfccf7-54vbt" Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.133701 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.136442 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb592899-6bd7-4d6b-a54d-132c5166df85-combined-ca-bundle\") pod \"neutron-655bcfccf7-54vbt\" (UID: \"bb592899-6bd7-4d6b-a54d-132c5166df85\") " pod="openstack/neutron-655bcfccf7-54vbt" Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.138419 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb592899-6bd7-4d6b-a54d-132c5166df85-public-tls-certs\") pod \"neutron-655bcfccf7-54vbt\" (UID: \"bb592899-6bd7-4d6b-a54d-132c5166df85\") " pod="openstack/neutron-655bcfccf7-54vbt" Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.145152 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bb592899-6bd7-4d6b-a54d-132c5166df85-httpd-config\") pod \"neutron-655bcfccf7-54vbt\" (UID: \"bb592899-6bd7-4d6b-a54d-132c5166df85\") " pod="openstack/neutron-655bcfccf7-54vbt" Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.148281 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb592899-6bd7-4d6b-a54d-132c5166df85-ovndb-tls-certs\") pod \"neutron-655bcfccf7-54vbt\" (UID: \"bb592899-6bd7-4d6b-a54d-132c5166df85\") " pod="openstack/neutron-655bcfccf7-54vbt" Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.149214 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/bb592899-6bd7-4d6b-a54d-132c5166df85-config\") pod \"neutron-655bcfccf7-54vbt\" (UID: \"bb592899-6bd7-4d6b-a54d-132c5166df85\") " pod="openstack/neutron-655bcfccf7-54vbt" Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.159100 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb592899-6bd7-4d6b-a54d-132c5166df85-internal-tls-certs\") pod \"neutron-655bcfccf7-54vbt\" (UID: \"bb592899-6bd7-4d6b-a54d-132c5166df85\") " pod="openstack/neutron-655bcfccf7-54vbt" Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.172725 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpcfc\" (UniqueName: \"kubernetes.io/projected/bb592899-6bd7-4d6b-a54d-132c5166df85-kube-api-access-qpcfc\") pod \"neutron-655bcfccf7-54vbt\" (UID: \"bb592899-6bd7-4d6b-a54d-132c5166df85\") " pod="openstack/neutron-655bcfccf7-54vbt" Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.192802 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/barbican-api-d4854c55d-lxcp4"] Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.193161 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-d4854c55d-lxcp4" podUID="12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457" containerName="barbican-api-log" containerID="cri-o://e6314bd26575a50c0553d8e9a0840ed56152bbda9246770f86463bdc8984421c" gracePeriod=30 Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.193208 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-d4854c55d-lxcp4" podUID="12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457" containerName="barbican-api" containerID="cri-o://4c4164c690afcf9e5dce1ffaf67b92dba080606952e9c0f6d32358a7785a7bae" gracePeriod=30 Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.232591 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4169244c-c975-4716-a7f1-53bf5c0dafe6-log-httpd\") pod \"4169244c-c975-4716-a7f1-53bf5c0dafe6\" (UID: \"4169244c-c975-4716-a7f1-53bf5c0dafe6\") " Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.232636 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4169244c-c975-4716-a7f1-53bf5c0dafe6-config-data\") pod \"4169244c-c975-4716-a7f1-53bf5c0dafe6\" (UID: \"4169244c-c975-4716-a7f1-53bf5c0dafe6\") " Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.232707 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4169244c-c975-4716-a7f1-53bf5c0dafe6-run-httpd\") pod \"4169244c-c975-4716-a7f1-53bf5c0dafe6\" (UID: \"4169244c-c975-4716-a7f1-53bf5c0dafe6\") " Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.232731 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4169244c-c975-4716-a7f1-53bf5c0dafe6-sg-core-conf-yaml\") pod \"4169244c-c975-4716-a7f1-53bf5c0dafe6\" (UID: \"4169244c-c975-4716-a7f1-53bf5c0dafe6\") " Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.232778 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4169244c-c975-4716-a7f1-53bf5c0dafe6-combined-ca-bundle\") pod \"4169244c-c975-4716-a7f1-53bf5c0dafe6\" (UID: \"4169244c-c975-4716-a7f1-53bf5c0dafe6\") " Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.232794 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4169244c-c975-4716-a7f1-53bf5c0dafe6-scripts\") pod \"4169244c-c975-4716-a7f1-53bf5c0dafe6\" (UID: \"4169244c-c975-4716-a7f1-53bf5c0dafe6\") " Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.232889 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9kdf\" (UniqueName: \"kubernetes.io/projected/4169244c-c975-4716-a7f1-53bf5c0dafe6-kube-api-access-f9kdf\") pod \"4169244c-c975-4716-a7f1-53bf5c0dafe6\" (UID: \"4169244c-c975-4716-a7f1-53bf5c0dafe6\") " Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.233055 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4169244c-c975-4716-a7f1-53bf5c0dafe6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4169244c-c975-4716-a7f1-53bf5c0dafe6" (UID: 
"4169244c-c975-4716-a7f1-53bf5c0dafe6"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.233286 4922 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4169244c-c975-4716-a7f1-53bf5c0dafe6-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.238305 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4169244c-c975-4716-a7f1-53bf5c0dafe6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4169244c-c975-4716-a7f1-53bf5c0dafe6" (UID: "4169244c-c975-4716-a7f1-53bf5c0dafe6"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.240459 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4169244c-c975-4716-a7f1-53bf5c0dafe6-scripts" (OuterVolumeSpecName: "scripts") pod "4169244c-c975-4716-a7f1-53bf5c0dafe6" (UID: "4169244c-c975-4716-a7f1-53bf5c0dafe6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.243130 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4169244c-c975-4716-a7f1-53bf5c0dafe6-kube-api-access-f9kdf" (OuterVolumeSpecName: "kube-api-access-f9kdf") pod "4169244c-c975-4716-a7f1-53bf5c0dafe6" (UID: "4169244c-c975-4716-a7f1-53bf5c0dafe6"). InnerVolumeSpecName "kube-api-access-f9kdf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.307233 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-655bcfccf7-54vbt" Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.336710 4922 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4169244c-c975-4716-a7f1-53bf5c0dafe6-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.336760 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4169244c-c975-4716-a7f1-53bf5c0dafe6-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.336776 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9kdf\" (UniqueName: \"kubernetes.io/projected/4169244c-c975-4716-a7f1-53bf5c0dafe6-kube-api-access-f9kdf\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.366984 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4169244c-c975-4716-a7f1-53bf5c0dafe6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4169244c-c975-4716-a7f1-53bf5c0dafe6" (UID: "4169244c-c975-4716-a7f1-53bf5c0dafe6"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.371390 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4169244c-c975-4716-a7f1-53bf5c0dafe6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4169244c-c975-4716-a7f1-53bf5c0dafe6" (UID: "4169244c-c975-4716-a7f1-53bf5c0dafe6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.393326 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-r2hv4" Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.439317 4922 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4169244c-c975-4716-a7f1-53bf5c0dafe6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.439345 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4169244c-c975-4716-a7f1-53bf5c0dafe6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.467159 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4169244c-c975-4716-a7f1-53bf5c0dafe6-config-data" (OuterVolumeSpecName: "config-data") pod "4169244c-c975-4716-a7f1-53bf5c0dafe6" (UID: "4169244c-c975-4716-a7f1-53bf5c0dafe6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.517934 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-8xvxh" Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.529929 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-n89zb" Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.540724 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5v648\" (UniqueName: \"kubernetes.io/projected/48b2290b-756b-4cbb-b1c1-19f32bdd6358-kube-api-access-5v648\") pod \"48b2290b-756b-4cbb-b1c1-19f32bdd6358\" (UID: \"48b2290b-756b-4cbb-b1c1-19f32bdd6358\") " Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.541316 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4169244c-c975-4716-a7f1-53bf5c0dafe6-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.558109 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48b2290b-756b-4cbb-b1c1-19f32bdd6358-kube-api-access-5v648" (OuterVolumeSpecName: "kube-api-access-5v648") pod "48b2290b-756b-4cbb-b1c1-19f32bdd6358" (UID: "48b2290b-756b-4cbb-b1c1-19f32bdd6358"). InnerVolumeSpecName "kube-api-access-5v648". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.643152 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99wbx\" (UniqueName: \"kubernetes.io/projected/f1f5c234-6ae0-451b-bdca-3d6ba5f3a003-kube-api-access-99wbx\") pod \"f1f5c234-6ae0-451b-bdca-3d6ba5f3a003\" (UID: \"f1f5c234-6ae0-451b-bdca-3d6ba5f3a003\") " Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.643338 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvxc8\" (UniqueName: \"kubernetes.io/projected/9d9bda50-78a2-45bf-a92a-f1085209b972-kube-api-access-cvxc8\") pod \"9d9bda50-78a2-45bf-a92a-f1085209b972\" (UID: \"9d9bda50-78a2-45bf-a92a-f1085209b972\") " Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.643827 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5v648\" (UniqueName: \"kubernetes.io/projected/48b2290b-756b-4cbb-b1c1-19f32bdd6358-kube-api-access-5v648\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.648252 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d9bda50-78a2-45bf-a92a-f1085209b972-kube-api-access-cvxc8" (OuterVolumeSpecName: "kube-api-access-cvxc8") pod "9d9bda50-78a2-45bf-a92a-f1085209b972" (UID: "9d9bda50-78a2-45bf-a92a-f1085209b972"). InnerVolumeSpecName "kube-api-access-cvxc8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.649168 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1f5c234-6ae0-451b-bdca-3d6ba5f3a003-kube-api-access-99wbx" (OuterVolumeSpecName: "kube-api-access-99wbx") pod "f1f5c234-6ae0-451b-bdca-3d6ba5f3a003" (UID: "f1f5c234-6ae0-451b-bdca-3d6ba5f3a003"). InnerVolumeSpecName "kube-api-access-99wbx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.744944 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvxc8\" (UniqueName: \"kubernetes.io/projected/9d9bda50-78a2-45bf-a92a-f1085209b972-kube-api-access-cvxc8\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.745416 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99wbx\" (UniqueName: \"kubernetes.io/projected/f1f5c234-6ae0-451b-bdca-3d6ba5f3a003-kube-api-access-99wbx\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.856671 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-8xvxh" Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.856668 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-8xvxh" event={"ID":"9d9bda50-78a2-45bf-a92a-f1085209b972","Type":"ContainerDied","Data":"fb0f94c22a563dbaebe3772524e4127daa21b31cbf970a11ae2fa3d4511346cb"} Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.856746 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb0f94c22a563dbaebe3772524e4127daa21b31cbf970a11ae2fa3d4511346cb" Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.858450 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-n89zb" event={"ID":"f1f5c234-6ae0-451b-bdca-3d6ba5f3a003","Type":"ContainerDied","Data":"d3ef86a4f0edd456ab863a75ffa6482ba262f7ccb6d889c7d9a46ecc88528d5f"} Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.858477 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3ef86a4f0edd456ab863a75ffa6482ba262f7ccb6d889c7d9a46ecc88528d5f" Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.858506 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-n89zb" Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.867110 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-r2hv4" event={"ID":"48b2290b-756b-4cbb-b1c1-19f32bdd6358","Type":"ContainerDied","Data":"848a5c652a2980bcbaa9650284b1904d0dbb46df4c83f8074b6be13e2686f344"} Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.867135 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="848a5c652a2980bcbaa9650284b1904d0dbb46df4c83f8074b6be13e2686f344" Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.867230 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-r2hv4" Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.874553 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4169244c-c975-4716-a7f1-53bf5c0dafe6","Type":"ContainerDied","Data":"6c994719b564d129987dd61b1e16323b935997aa174fe1f8e38682a176d3b849"} Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.874617 4922 scope.go:117] "RemoveContainer" containerID="4e3bfc32852ccd954b6ac2c11c64a61deac8d35642aa3d5587b1f0c4d6a396a7" Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.874749 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.882733 4922 generic.go:334] "Generic (PLEG): container finished" podID="12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457" containerID="e6314bd26575a50c0553d8e9a0840ed56152bbda9246770f86463bdc8984421c" exitCode=143 Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.910122 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d4854c55d-lxcp4" event={"ID":"12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457","Type":"ContainerDied","Data":"e6314bd26575a50c0553d8e9a0840ed56152bbda9246770f86463bdc8984421c"} Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.938290 4922 scope.go:117] "RemoveContainer" containerID="27aea154443df043d91fded0cd22b21faee6a9cd9ede92d80ad1e5366d272bfc" Nov 22 03:10:52 crc kubenswrapper[4922]: I1122 03:10:52.979283 4922 scope.go:117] "RemoveContainer" containerID="e678fba9c22764baad2f0b3dab1c3f45fe6a8987c0b86bb04c111f90c493dda9" Nov 22 03:10:53 crc kubenswrapper[4922]: I1122 03:10:53.001557 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:10:53 crc kubenswrapper[4922]: I1122 03:10:53.019733 4922 scope.go:117] "RemoveContainer" containerID="8cdbee1e9de52dd5ff6b7cae597cee93d0f78826a24525c8ba7d48eaf51dbf56" Nov 22 03:10:53 crc kubenswrapper[4922]: I1122 03:10:53.048195 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:10:53 crc kubenswrapper[4922]: I1122 03:10:53.060100 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:10:53 crc kubenswrapper[4922]: E1122 03:10:53.060498 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4169244c-c975-4716-a7f1-53bf5c0dafe6" containerName="ceilometer-central-agent" Nov 22 03:10:53 crc kubenswrapper[4922]: I1122 03:10:53.060512 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="4169244c-c975-4716-a7f1-53bf5c0dafe6" containerName="ceilometer-central-agent" Nov 22 03:10:53 crc kubenswrapper[4922]: E1122 03:10:53.060529 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4169244c-c975-4716-a7f1-53bf5c0dafe6" containerName="sg-core" Nov 22 03:10:53 crc kubenswrapper[4922]: I1122 03:10:53.060536 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="4169244c-c975-4716-a7f1-53bf5c0dafe6" containerName="sg-core" Nov 22 03:10:53 crc kubenswrapper[4922]: E1122 03:10:53.060551 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4169244c-c975-4716-a7f1-53bf5c0dafe6" containerName="ceilometer-notification-agent" Nov 22 03:10:53 crc kubenswrapper[4922]: I1122 03:10:53.060558 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="4169244c-c975-4716-a7f1-53bf5c0dafe6" containerName="ceilometer-notification-agent" Nov 22 03:10:53 crc kubenswrapper[4922]: E1122 03:10:53.060572 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48b2290b-756b-4cbb-b1c1-19f32bdd6358" containerName="mariadb-database-create" Nov 22 03:10:53 crc kubenswrapper[4922]: I1122 03:10:53.060579 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="48b2290b-756b-4cbb-b1c1-19f32bdd6358" containerName="mariadb-database-create" Nov 22 03:10:53 crc kubenswrapper[4922]: E1122 03:10:53.060589 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d9bda50-78a2-45bf-a92a-f1085209b972" containerName="mariadb-database-create" Nov 22 03:10:53 crc kubenswrapper[4922]: I1122 03:10:53.060596 4922 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="9d9bda50-78a2-45bf-a92a-f1085209b972" containerName="mariadb-database-create" Nov 22 03:10:53 crc kubenswrapper[4922]: E1122 03:10:53.060607 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1f5c234-6ae0-451b-bdca-3d6ba5f3a003" containerName="mariadb-database-create" Nov 22 03:10:53 crc kubenswrapper[4922]: I1122 03:10:53.060613 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1f5c234-6ae0-451b-bdca-3d6ba5f3a003" containerName="mariadb-database-create" Nov 22 03:10:53 crc kubenswrapper[4922]: E1122 03:10:53.060624 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4169244c-c975-4716-a7f1-53bf5c0dafe6" containerName="proxy-httpd" Nov 22 03:10:53 crc kubenswrapper[4922]: I1122 03:10:53.060629 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="4169244c-c975-4716-a7f1-53bf5c0dafe6" containerName="proxy-httpd" Nov 22 03:10:53 crc kubenswrapper[4922]: I1122 03:10:53.060801 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d9bda50-78a2-45bf-a92a-f1085209b972" containerName="mariadb-database-create" Nov 22 03:10:53 crc kubenswrapper[4922]: I1122 03:10:53.060818 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="4169244c-c975-4716-a7f1-53bf5c0dafe6" containerName="ceilometer-notification-agent" Nov 22 03:10:53 crc kubenswrapper[4922]: I1122 03:10:53.060827 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1f5c234-6ae0-451b-bdca-3d6ba5f3a003" containerName="mariadb-database-create" Nov 22 03:10:53 crc kubenswrapper[4922]: I1122 03:10:53.060928 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="4169244c-c975-4716-a7f1-53bf5c0dafe6" containerName="sg-core" Nov 22 03:10:53 crc kubenswrapper[4922]: I1122 03:10:53.060941 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="4169244c-c975-4716-a7f1-53bf5c0dafe6" containerName="proxy-httpd" Nov 22 03:10:53 crc kubenswrapper[4922]: I1122 03:10:53.060950 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="48b2290b-756b-4cbb-b1c1-19f32bdd6358" containerName="mariadb-database-create" Nov 22 03:10:53 crc kubenswrapper[4922]: I1122 03:10:53.060959 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="4169244c-c975-4716-a7f1-53bf5c0dafe6" containerName="ceilometer-central-agent" Nov 22 03:10:53 crc kubenswrapper[4922]: I1122 03:10:53.062584 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 22 03:10:53 crc kubenswrapper[4922]: I1122 03:10:53.066233 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 22 03:10:53 crc kubenswrapper[4922]: I1122 03:10:53.066425 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 22 03:10:53 crc kubenswrapper[4922]: I1122 03:10:53.077972 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:10:53 crc kubenswrapper[4922]: I1122 03:10:53.254868 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9843715c-f3c1-4c18-af9e-545d5fa5da4e-config-data\") pod \"ceilometer-0\" (UID: \"9843715c-f3c1-4c18-af9e-545d5fa5da4e\") " pod="openstack/ceilometer-0" Nov 22 03:10:53 crc kubenswrapper[4922]: I1122 03:10:53.254926 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65hbj\" (UniqueName: \"kubernetes.io/projected/9843715c-f3c1-4c18-af9e-545d5fa5da4e-kube-api-access-65hbj\") pod \"ceilometer-0\" (UID: \"9843715c-f3c1-4c18-af9e-545d5fa5da4e\") " pod="openstack/ceilometer-0" Nov 22 03:10:53 crc kubenswrapper[4922]: I1122 03:10:53.254947 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9843715c-f3c1-4c18-af9e-545d5fa5da4e-run-httpd\") pod \"ceilometer-0\" (UID: \"9843715c-f3c1-4c18-af9e-545d5fa5da4e\") " pod="openstack/ceilometer-0" Nov 22 03:10:53 crc kubenswrapper[4922]: I1122 03:10:53.254987 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9843715c-f3c1-4c18-af9e-545d5fa5da4e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9843715c-f3c1-4c18-af9e-545d5fa5da4e\") " pod="openstack/ceilometer-0" Nov 22 03:10:53 crc kubenswrapper[4922]: I1122 03:10:53.255047 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9843715c-f3c1-4c18-af9e-545d5fa5da4e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9843715c-f3c1-4c18-af9e-545d5fa5da4e\") " pod="openstack/ceilometer-0" Nov 22 03:10:53 crc kubenswrapper[4922]: I1122 03:10:53.255072 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9843715c-f3c1-4c18-af9e-545d5fa5da4e-log-httpd\") pod \"ceilometer-0\" (UID: \"9843715c-f3c1-4c18-af9e-545d5fa5da4e\") " pod="openstack/ceilometer-0" Nov 22 03:10:53 crc kubenswrapper[4922]: I1122 03:10:53.255097 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9843715c-f3c1-4c18-af9e-545d5fa5da4e-scripts\") pod \"ceilometer-0\" (UID: \"9843715c-f3c1-4c18-af9e-545d5fa5da4e\") " pod="openstack/ceilometer-0" Nov 22 03:10:53 crc kubenswrapper[4922]: I1122 03:10:53.311037 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4169244c-c975-4716-a7f1-53bf5c0dafe6" path="/var/lib/kubelet/pods/4169244c-c975-4716-a7f1-53bf5c0dafe6/volumes" Nov 22 03:10:53 crc kubenswrapper[4922]: I1122 03:10:53.356331 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9843715c-f3c1-4c18-af9e-545d5fa5da4e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9843715c-f3c1-4c18-af9e-545d5fa5da4e\") " pod="openstack/ceilometer-0" Nov 22 03:10:53 crc kubenswrapper[4922]: I1122 03:10:53.356390 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9843715c-f3c1-4c18-af9e-545d5fa5da4e-log-httpd\") pod \"ceilometer-0\" (UID: \"9843715c-f3c1-4c18-af9e-545d5fa5da4e\") " pod="openstack/ceilometer-0" Nov 22 03:10:53 crc kubenswrapper[4922]: I1122 03:10:53.356422 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9843715c-f3c1-4c18-af9e-545d5fa5da4e-scripts\") pod \"ceilometer-0\" (UID: \"9843715c-f3c1-4c18-af9e-545d5fa5da4e\") " pod="openstack/ceilometer-0" Nov 22 03:10:53 crc kubenswrapper[4922]: I1122 03:10:53.356478 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9843715c-f3c1-4c18-af9e-545d5fa5da4e-config-data\") pod \"ceilometer-0\" (UID: \"9843715c-f3c1-4c18-af9e-545d5fa5da4e\") " pod="openstack/ceilometer-0" Nov 22 03:10:53 crc kubenswrapper[4922]: I1122 03:10:53.356503 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65hbj\" (UniqueName: \"kubernetes.io/projected/9843715c-f3c1-4c18-af9e-545d5fa5da4e-kube-api-access-65hbj\") pod \"ceilometer-0\" (UID: \"9843715c-f3c1-4c18-af9e-545d5fa5da4e\") " pod="openstack/ceilometer-0" Nov 22 03:10:53 crc kubenswrapper[4922]: I1122 03:10:53.356525 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9843715c-f3c1-4c18-af9e-545d5fa5da4e-run-httpd\") pod \"ceilometer-0\" (UID: \"9843715c-f3c1-4c18-af9e-545d5fa5da4e\") " pod="openstack/ceilometer-0" Nov 22 03:10:53 crc kubenswrapper[4922]: I1122 03:10:53.356559 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9843715c-f3c1-4c18-af9e-545d5fa5da4e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9843715c-f3c1-4c18-af9e-545d5fa5da4e\") " pod="openstack/ceilometer-0" Nov 22 03:10:53 crc kubenswrapper[4922]: I1122 03:10:53.356862 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9843715c-f3c1-4c18-af9e-545d5fa5da4e-log-httpd\") pod \"ceilometer-0\" (UID: \"9843715c-f3c1-4c18-af9e-545d5fa5da4e\") " pod="openstack/ceilometer-0" Nov 22 03:10:53 crc kubenswrapper[4922]: I1122 03:10:53.356897 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9843715c-f3c1-4c18-af9e-545d5fa5da4e-run-httpd\") pod \"ceilometer-0\" (UID: \"9843715c-f3c1-4c18-af9e-545d5fa5da4e\") " pod="openstack/ceilometer-0" Nov 22 03:10:53 crc kubenswrapper[4922]: I1122 03:10:53.361182 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9843715c-f3c1-4c18-af9e-545d5fa5da4e-scripts\") pod \"ceilometer-0\" (UID: \"9843715c-f3c1-4c18-af9e-545d5fa5da4e\") " pod="openstack/ceilometer-0" Nov 22 03:10:53 crc kubenswrapper[4922]: I1122 03:10:53.361308 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/9843715c-f3c1-4c18-af9e-545d5fa5da4e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9843715c-f3c1-4c18-af9e-545d5fa5da4e\") " pod="openstack/ceilometer-0" Nov 22 03:10:53 crc kubenswrapper[4922]: I1122 03:10:53.361414 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9843715c-f3c1-4c18-af9e-545d5fa5da4e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9843715c-f3c1-4c18-af9e-545d5fa5da4e\") " pod="openstack/ceilometer-0" Nov 22 03:10:53 crc kubenswrapper[4922]: I1122 03:10:53.363103 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9843715c-f3c1-4c18-af9e-545d5fa5da4e-config-data\") pod \"ceilometer-0\" (UID: \"9843715c-f3c1-4c18-af9e-545d5fa5da4e\") " pod="openstack/ceilometer-0" Nov 22 03:10:53 crc kubenswrapper[4922]: I1122 03:10:53.371919 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65hbj\" (UniqueName: \"kubernetes.io/projected/9843715c-f3c1-4c18-af9e-545d5fa5da4e-kube-api-access-65hbj\") pod \"ceilometer-0\" (UID: \"9843715c-f3c1-4c18-af9e-545d5fa5da4e\") " pod="openstack/ceilometer-0" Nov 22 03:10:53 crc kubenswrapper[4922]: I1122 03:10:53.402159 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 03:10:53 crc kubenswrapper[4922]: I1122 03:10:53.582181 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-655bcfccf7-54vbt"] Nov 22 03:10:53 crc kubenswrapper[4922]: I1122 03:10:53.848351 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:10:53 crc kubenswrapper[4922]: I1122 03:10:53.895970 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9843715c-f3c1-4c18-af9e-545d5fa5da4e","Type":"ContainerStarted","Data":"455db90142fa5b0ee7bc33f5ce1076dbf7fc0d1515cce6bb8e3177d8a7d3b3c8"} Nov 22 03:10:53 crc kubenswrapper[4922]: I1122 03:10:53.897477 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-655bcfccf7-54vbt" event={"ID":"bb592899-6bd7-4d6b-a54d-132c5166df85","Type":"ContainerStarted","Data":"ab69b8a64a7c18a772ead106d2dbf30080f89f379ef97178e8b03d10afdf4c37"} Nov 22 03:10:53 crc kubenswrapper[4922]: I1122 03:10:53.897520 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-655bcfccf7-54vbt" event={"ID":"bb592899-6bd7-4d6b-a54d-132c5166df85","Type":"ContainerStarted","Data":"9483c35f831db3a54f1f0db0c4b527424464892e37b16d9c5c42488229e33db4"} Nov 22 03:10:54 crc kubenswrapper[4922]: I1122 03:10:54.923154 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-655bcfccf7-54vbt" event={"ID":"bb592899-6bd7-4d6b-a54d-132c5166df85","Type":"ContainerStarted","Data":"48205477144ff402308ed8669878a42ea38c47484d1f402d8d56cf0567d5737e"} Nov 22 03:10:54 crc kubenswrapper[4922]: I1122 03:10:54.923751 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-655bcfccf7-54vbt" Nov 22 03:10:54 crc kubenswrapper[4922]: I1122 03:10:54.926463 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9843715c-f3c1-4c18-af9e-545d5fa5da4e","Type":"ContainerStarted","Data":"937e59c825e19e64051ec5abfabd8307d3c56110aec5294159e69b93b3e794ab"} Nov 22 03:10:54 crc kubenswrapper[4922]: I1122 03:10:54.953800 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/neutron-655bcfccf7-54vbt" podStartSLOduration=3.953781893 podStartE2EDuration="3.953781893s" podCreationTimestamp="2025-11-22 03:10:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:10:54.940181097 +0000 UTC m=+1090.978703029" watchObservedRunningTime="2025-11-22 03:10:54.953781893 +0000 UTC m=+1090.992303785" Nov 22 03:10:55 crc kubenswrapper[4922]: I1122 03:10:55.360908 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-d4854c55d-lxcp4" podUID="12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.146:9311/healthcheck\": read tcp 10.217.0.2:47704->10.217.0.146:9311: read: connection reset by peer" Nov 22 03:10:55 crc kubenswrapper[4922]: I1122 03:10:55.361787 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-d4854c55d-lxcp4" podUID="12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.146:9311/healthcheck\": read tcp 10.217.0.2:47706->10.217.0.146:9311: read: connection reset by peer" Nov 22 03:10:55 crc kubenswrapper[4922]: I1122 03:10:55.762774 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-d4854c55d-lxcp4" Nov 22 03:10:55 crc kubenswrapper[4922]: I1122 03:10:55.800837 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldrth\" (UniqueName: \"kubernetes.io/projected/12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457-kube-api-access-ldrth\") pod \"12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457\" (UID: \"12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457\") " Nov 22 03:10:55 crc kubenswrapper[4922]: I1122 03:10:55.800904 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457-combined-ca-bundle\") pod \"12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457\" (UID: \"12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457\") " Nov 22 03:10:55 crc kubenswrapper[4922]: I1122 03:10:55.800933 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457-logs\") pod \"12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457\" (UID: \"12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457\") " Nov 22 03:10:55 crc kubenswrapper[4922]: I1122 03:10:55.800965 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457-config-data-custom\") pod \"12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457\" (UID: \"12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457\") " Nov 22 03:10:55 crc kubenswrapper[4922]: I1122 03:10:55.801041 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457-config-data\") pod \"12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457\" (UID: \"12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457\") " Nov 22 03:10:55 crc kubenswrapper[4922]: I1122 03:10:55.805204 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457-logs" (OuterVolumeSpecName: "logs") pod "12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457" (UID: "12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:10:55 crc kubenswrapper[4922]: I1122 03:10:55.816375 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457" (UID: "12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:10:55 crc kubenswrapper[4922]: I1122 03:10:55.826050 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457-kube-api-access-ldrth" (OuterVolumeSpecName: "kube-api-access-ldrth") pod "12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457" (UID: "12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457"). InnerVolumeSpecName "kube-api-access-ldrth". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:10:55 crc kubenswrapper[4922]: I1122 03:10:55.862195 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457" (UID: "12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:10:55 crc kubenswrapper[4922]: I1122 03:10:55.878972 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457-config-data" (OuterVolumeSpecName: "config-data") pod "12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457" (UID: "12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:10:55 crc kubenswrapper[4922]: I1122 03:10:55.902728 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:55 crc kubenswrapper[4922]: I1122 03:10:55.902763 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldrth\" (UniqueName: \"kubernetes.io/projected/12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457-kube-api-access-ldrth\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:55 crc kubenswrapper[4922]: I1122 03:10:55.902773 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:55 crc kubenswrapper[4922]: I1122 03:10:55.902781 4922 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457-logs\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:55 crc kubenswrapper[4922]: I1122 03:10:55.902791 4922 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:55 crc kubenswrapper[4922]: I1122 03:10:55.943694 4922 generic.go:334] "Generic (PLEG): container finished" podID="12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457" containerID="4c4164c690afcf9e5dce1ffaf67b92dba080606952e9c0f6d32358a7785a7bae" exitCode=0 Nov 22 03:10:55 crc kubenswrapper[4922]: I1122 03:10:55.943771 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d4854c55d-lxcp4" event={"ID":"12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457","Type":"ContainerDied","Data":"4c4164c690afcf9e5dce1ffaf67b92dba080606952e9c0f6d32358a7785a7bae"} Nov 22 03:10:55 crc kubenswrapper[4922]: I1122 03:10:55.943800 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d4854c55d-lxcp4" event={"ID":"12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457","Type":"ContainerDied","Data":"6365742e1efb29d761d096e90058397968d03a52db4ce9b89ec0e86b3f7234ce"} Nov 22 03:10:55 crc kubenswrapper[4922]: I1122 03:10:55.943825 4922 scope.go:117] "RemoveContainer" containerID="4c4164c690afcf9e5dce1ffaf67b92dba080606952e9c0f6d32358a7785a7bae" Nov 22 03:10:55 crc kubenswrapper[4922]: I1122 03:10:55.943969 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-d4854c55d-lxcp4" Nov 22 03:10:55 crc kubenswrapper[4922]: I1122 03:10:55.957214 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9843715c-f3c1-4c18-af9e-545d5fa5da4e","Type":"ContainerStarted","Data":"560a4a5c608d011b99a869b7235b014e760215178450ddc88e09b47d44cafd62"} Nov 22 03:10:55 crc kubenswrapper[4922]: I1122 03:10:55.972035 4922 scope.go:117] "RemoveContainer" containerID="e6314bd26575a50c0553d8e9a0840ed56152bbda9246770f86463bdc8984421c" Nov 22 03:10:55 crc kubenswrapper[4922]: I1122 03:10:55.989980 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-d4854c55d-lxcp4"] Nov 22 03:10:55 crc kubenswrapper[4922]: I1122 03:10:55.997170 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-d4854c55d-lxcp4"] Nov 22 03:10:56 crc kubenswrapper[4922]: I1122 03:10:56.003693 4922 scope.go:117] "RemoveContainer" containerID="4c4164c690afcf9e5dce1ffaf67b92dba080606952e9c0f6d32358a7785a7bae" Nov 22 03:10:56 crc kubenswrapper[4922]: E1122 03:10:56.006329 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c4164c690afcf9e5dce1ffaf67b92dba080606952e9c0f6d32358a7785a7bae\": container with ID starting with 4c4164c690afcf9e5dce1ffaf67b92dba080606952e9c0f6d32358a7785a7bae not found: ID does not exist" containerID="4c4164c690afcf9e5dce1ffaf67b92dba080606952e9c0f6d32358a7785a7bae" Nov 22 03:10:56 crc kubenswrapper[4922]: I1122 03:10:56.006373 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c4164c690afcf9e5dce1ffaf67b92dba080606952e9c0f6d32358a7785a7bae"} err="failed to get container status \"4c4164c690afcf9e5dce1ffaf67b92dba080606952e9c0f6d32358a7785a7bae\": rpc error: code = NotFound desc = could not find container \"4c4164c690afcf9e5dce1ffaf67b92dba080606952e9c0f6d32358a7785a7bae\": container with ID starting with 4c4164c690afcf9e5dce1ffaf67b92dba080606952e9c0f6d32358a7785a7bae not found: ID does not exist" Nov 22 03:10:56 crc kubenswrapper[4922]: I1122 03:10:56.006400 4922 scope.go:117] "RemoveContainer" containerID="e6314bd26575a50c0553d8e9a0840ed56152bbda9246770f86463bdc8984421c" Nov 22 03:10:56 crc kubenswrapper[4922]: E1122 03:10:56.010019 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6314bd26575a50c0553d8e9a0840ed56152bbda9246770f86463bdc8984421c\": container with ID starting with e6314bd26575a50c0553d8e9a0840ed56152bbda9246770f86463bdc8984421c not found: ID does not exist" containerID="e6314bd26575a50c0553d8e9a0840ed56152bbda9246770f86463bdc8984421c" Nov 22 03:10:56 crc kubenswrapper[4922]: I1122 03:10:56.010066 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6314bd26575a50c0553d8e9a0840ed56152bbda9246770f86463bdc8984421c"} err="failed to get container status \"e6314bd26575a50c0553d8e9a0840ed56152bbda9246770f86463bdc8984421c\": rpc error: code = NotFound desc = could not find container \"e6314bd26575a50c0553d8e9a0840ed56152bbda9246770f86463bdc8984421c\": container with ID starting with e6314bd26575a50c0553d8e9a0840ed56152bbda9246770f86463bdc8984421c not found: ID does not exist" Nov 22 03:10:56 crc kubenswrapper[4922]: I1122 03:10:56.980564 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"9843715c-f3c1-4c18-af9e-545d5fa5da4e","Type":"ContainerStarted","Data":"84ce9ada924799311a337f4bda8399899f52d19b2ca775f0910be28158b2507a"} Nov 22 03:10:56 crc kubenswrapper[4922]: I1122 03:10:56.996654 4922 generic.go:334] "Generic (PLEG): container finished" podID="f256e75d-5ff4-4804-bbe6-058ef24fab04" containerID="f33b53f139dbde89a912e9be4386086bd325c48d41bd800a6e12e7136709eb46" exitCode=0 Nov 22 03:10:56 crc kubenswrapper[4922]: I1122 03:10:56.996734 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-xgvlf" event={"ID":"f256e75d-5ff4-4804-bbe6-058ef24fab04","Type":"ContainerDied","Data":"f33b53f139dbde89a912e9be4386086bd325c48d41bd800a6e12e7136709eb46"} Nov 22 03:10:57 crc kubenswrapper[4922]: I1122 03:10:57.310629 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457" path="/var/lib/kubelet/pods/12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457/volumes" Nov 22 03:10:58 crc kubenswrapper[4922]: I1122 03:10:58.014027 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9843715c-f3c1-4c18-af9e-545d5fa5da4e","Type":"ContainerStarted","Data":"a65d54a3937659b5a58f9c195d7200c68b24faf8a7390d01c3fc06be0aa43812"} Nov 22 03:10:58 crc kubenswrapper[4922]: I1122 03:10:58.014484 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 22 03:10:58 crc kubenswrapper[4922]: I1122 03:10:58.052808 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.283737661 podStartE2EDuration="5.052788529s" podCreationTimestamp="2025-11-22 03:10:53 +0000 UTC" firstStartedPulling="2025-11-22 03:10:53.851723406 +0000 UTC m=+1089.890245298" lastFinishedPulling="2025-11-22 03:10:57.620774274 +0000 UTC m=+1093.659296166" observedRunningTime="2025-11-22 03:10:58.04744361 +0000 UTC m=+1094.085965502" watchObservedRunningTime="2025-11-22 03:10:58.052788529 +0000 UTC m=+1094.091310441" Nov 22 03:10:58 crc kubenswrapper[4922]: I1122 03:10:58.377871 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-xgvlf" Nov 22 03:10:58 crc kubenswrapper[4922]: I1122 03:10:58.544038 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f256e75d-5ff4-4804-bbe6-058ef24fab04-etc-machine-id\") pod \"f256e75d-5ff4-4804-bbe6-058ef24fab04\" (UID: \"f256e75d-5ff4-4804-bbe6-058ef24fab04\") " Nov 22 03:10:58 crc kubenswrapper[4922]: I1122 03:10:58.544185 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f256e75d-5ff4-4804-bbe6-058ef24fab04-combined-ca-bundle\") pod \"f256e75d-5ff4-4804-bbe6-058ef24fab04\" (UID: \"f256e75d-5ff4-4804-bbe6-058ef24fab04\") " Nov 22 03:10:58 crc kubenswrapper[4922]: I1122 03:10:58.544201 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f256e75d-5ff4-4804-bbe6-058ef24fab04-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "f256e75d-5ff4-4804-bbe6-058ef24fab04" (UID: "f256e75d-5ff4-4804-bbe6-058ef24fab04"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 03:10:58 crc kubenswrapper[4922]: I1122 03:10:58.544276 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f256e75d-5ff4-4804-bbe6-058ef24fab04-scripts\") pod \"f256e75d-5ff4-4804-bbe6-058ef24fab04\" (UID: \"f256e75d-5ff4-4804-bbe6-058ef24fab04\") " Nov 22 03:10:58 crc kubenswrapper[4922]: I1122 03:10:58.544311 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f256e75d-5ff4-4804-bbe6-058ef24fab04-db-sync-config-data\") pod \"f256e75d-5ff4-4804-bbe6-058ef24fab04\" (UID: \"f256e75d-5ff4-4804-bbe6-058ef24fab04\") " Nov 22 03:10:58 crc kubenswrapper[4922]: I1122 03:10:58.544434 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f256e75d-5ff4-4804-bbe6-058ef24fab04-config-data\") pod \"f256e75d-5ff4-4804-bbe6-058ef24fab04\" (UID: \"f256e75d-5ff4-4804-bbe6-058ef24fab04\") " Nov 22 03:10:58 crc kubenswrapper[4922]: I1122 03:10:58.544569 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q57hk\" (UniqueName: \"kubernetes.io/projected/f256e75d-5ff4-4804-bbe6-058ef24fab04-kube-api-access-q57hk\") pod \"f256e75d-5ff4-4804-bbe6-058ef24fab04\" (UID: \"f256e75d-5ff4-4804-bbe6-058ef24fab04\") " Nov 22 03:10:58 crc kubenswrapper[4922]: I1122 03:10:58.545495 4922 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f256e75d-5ff4-4804-bbe6-058ef24fab04-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:58 crc kubenswrapper[4922]: I1122 03:10:58.549840 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f256e75d-5ff4-4804-bbe6-058ef24fab04-scripts" (OuterVolumeSpecName: "scripts") pod "f256e75d-5ff4-4804-bbe6-058ef24fab04" (UID: "f256e75d-5ff4-4804-bbe6-058ef24fab04"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:10:58 crc kubenswrapper[4922]: I1122 03:10:58.552021 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f256e75d-5ff4-4804-bbe6-058ef24fab04-kube-api-access-q57hk" (OuterVolumeSpecName: "kube-api-access-q57hk") pod "f256e75d-5ff4-4804-bbe6-058ef24fab04" (UID: "f256e75d-5ff4-4804-bbe6-058ef24fab04"). InnerVolumeSpecName "kube-api-access-q57hk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:10:58 crc kubenswrapper[4922]: I1122 03:10:58.560983 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f256e75d-5ff4-4804-bbe6-058ef24fab04-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "f256e75d-5ff4-4804-bbe6-058ef24fab04" (UID: "f256e75d-5ff4-4804-bbe6-058ef24fab04"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:10:58 crc kubenswrapper[4922]: I1122 03:10:58.598831 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f256e75d-5ff4-4804-bbe6-058ef24fab04-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f256e75d-5ff4-4804-bbe6-058ef24fab04" (UID: "f256e75d-5ff4-4804-bbe6-058ef24fab04"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:10:58 crc kubenswrapper[4922]: I1122 03:10:58.617799 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f256e75d-5ff4-4804-bbe6-058ef24fab04-config-data" (OuterVolumeSpecName: "config-data") pod "f256e75d-5ff4-4804-bbe6-058ef24fab04" (UID: "f256e75d-5ff4-4804-bbe6-058ef24fab04"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:10:58 crc kubenswrapper[4922]: I1122 03:10:58.646687 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f256e75d-5ff4-4804-bbe6-058ef24fab04-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:58 crc kubenswrapper[4922]: I1122 03:10:58.646721 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f256e75d-5ff4-4804-bbe6-058ef24fab04-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:58 crc kubenswrapper[4922]: I1122 03:10:58.646733 4922 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f256e75d-5ff4-4804-bbe6-058ef24fab04-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:58 crc kubenswrapper[4922]: I1122 03:10:58.646743 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f256e75d-5ff4-4804-bbe6-058ef24fab04-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:58 crc kubenswrapper[4922]: I1122 03:10:58.646752 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q57hk\" (UniqueName: \"kubernetes.io/projected/f256e75d-5ff4-4804-bbe6-058ef24fab04-kube-api-access-q57hk\") on node \"crc\" DevicePath \"\"" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.026156 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-xgvlf" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.026167 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-xgvlf" event={"ID":"f256e75d-5ff4-4804-bbe6-058ef24fab04","Type":"ContainerDied","Data":"3caf19fbb4e921f7dd1e71cbf07c058f7115311e7cbb950748606bb489e773be"} Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.026208 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3caf19fbb4e921f7dd1e71cbf07c058f7115311e7cbb950748606bb489e773be" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.319343 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Nov 22 03:10:59 crc kubenswrapper[4922]: E1122 03:10:59.319623 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457" containerName="barbican-api" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.319635 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457" containerName="barbican-api" Nov 22 03:10:59 crc kubenswrapper[4922]: E1122 03:10:59.319651 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f256e75d-5ff4-4804-bbe6-058ef24fab04" containerName="cinder-db-sync" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.319659 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="f256e75d-5ff4-4804-bbe6-058ef24fab04" containerName="cinder-db-sync" Nov 22 03:10:59 crc kubenswrapper[4922]: E1122 03:10:59.319695 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457" containerName="barbican-api-log" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.319702 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457" containerName="barbican-api-log" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.319885 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="f256e75d-5ff4-4804-bbe6-058ef24fab04" containerName="cinder-db-sync" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.319898 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457" containerName="barbican-api" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.319918 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="12f8129a-2a3c-4ee6-bd5f-e5f93cdd5457" containerName="barbican-api-log" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.325134 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.332384 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-mslgv" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.332660 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.358058 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.358232 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.367957 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.380768 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/54874e8f-0cb8-4295-a7f8-b6265cd72612-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"54874e8f-0cb8-4295-a7f8-b6265cd72612\") " pod="openstack/cinder-scheduler-0" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.380820 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/54874e8f-0cb8-4295-a7f8-b6265cd72612-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"54874e8f-0cb8-4295-a7f8-b6265cd72612\") " pod="openstack/cinder-scheduler-0" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.380856 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54874e8f-0cb8-4295-a7f8-b6265cd72612-config-data\") pod \"cinder-scheduler-0\" (UID: \"54874e8f-0cb8-4295-a7f8-b6265cd72612\") " pod="openstack/cinder-scheduler-0" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.380944 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwnhw\" (UniqueName: \"kubernetes.io/projected/54874e8f-0cb8-4295-a7f8-b6265cd72612-kube-api-access-kwnhw\") pod \"cinder-scheduler-0\" (UID: \"54874e8f-0cb8-4295-a7f8-b6265cd72612\") " pod="openstack/cinder-scheduler-0" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.381009 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54874e8f-0cb8-4295-a7f8-b6265cd72612-scripts\") pod \"cinder-scheduler-0\" (UID: \"54874e8f-0cb8-4295-a7f8-b6265cd72612\") " pod="openstack/cinder-scheduler-0" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.381028 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54874e8f-0cb8-4295-a7f8-b6265cd72612-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"54874e8f-0cb8-4295-a7f8-b6265cd72612\") " pod="openstack/cinder-scheduler-0" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.483337 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwnhw\" (UniqueName: \"kubernetes.io/projected/54874e8f-0cb8-4295-a7f8-b6265cd72612-kube-api-access-kwnhw\") pod \"cinder-scheduler-0\" (UID: \"54874e8f-0cb8-4295-a7f8-b6265cd72612\") 
" pod="openstack/cinder-scheduler-0" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.483421 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54874e8f-0cb8-4295-a7f8-b6265cd72612-scripts\") pod \"cinder-scheduler-0\" (UID: \"54874e8f-0cb8-4295-a7f8-b6265cd72612\") " pod="openstack/cinder-scheduler-0" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.483445 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54874e8f-0cb8-4295-a7f8-b6265cd72612-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"54874e8f-0cb8-4295-a7f8-b6265cd72612\") " pod="openstack/cinder-scheduler-0" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.483475 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/54874e8f-0cb8-4295-a7f8-b6265cd72612-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"54874e8f-0cb8-4295-a7f8-b6265cd72612\") " pod="openstack/cinder-scheduler-0" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.483499 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/54874e8f-0cb8-4295-a7f8-b6265cd72612-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"54874e8f-0cb8-4295-a7f8-b6265cd72612\") " pod="openstack/cinder-scheduler-0" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.483521 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54874e8f-0cb8-4295-a7f8-b6265cd72612-config-data\") pod \"cinder-scheduler-0\" (UID: \"54874e8f-0cb8-4295-a7f8-b6265cd72612\") " pod="openstack/cinder-scheduler-0" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.491957 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54874e8f-0cb8-4295-a7f8-b6265cd72612-config-data\") pod \"cinder-scheduler-0\" (UID: \"54874e8f-0cb8-4295-a7f8-b6265cd72612\") " pod="openstack/cinder-scheduler-0" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.492302 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/54874e8f-0cb8-4295-a7f8-b6265cd72612-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"54874e8f-0cb8-4295-a7f8-b6265cd72612\") " pod="openstack/cinder-scheduler-0" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.507554 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54874e8f-0cb8-4295-a7f8-b6265cd72612-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"54874e8f-0cb8-4295-a7f8-b6265cd72612\") " pod="openstack/cinder-scheduler-0" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.513018 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54874e8f-0cb8-4295-a7f8-b6265cd72612-scripts\") pod \"cinder-scheduler-0\" (UID: \"54874e8f-0cb8-4295-a7f8-b6265cd72612\") " pod="openstack/cinder-scheduler-0" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.514978 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bb684768f-j5v97" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.520494 4922 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/54874e8f-0cb8-4295-a7f8-b6265cd72612-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"54874e8f-0cb8-4295-a7f8-b6265cd72612\") " pod="openstack/cinder-scheduler-0" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.523719 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwnhw\" (UniqueName: \"kubernetes.io/projected/54874e8f-0cb8-4295-a7f8-b6265cd72612-kube-api-access-kwnhw\") pod \"cinder-scheduler-0\" (UID: \"54874e8f-0cb8-4295-a7f8-b6265cd72612\") " pod="openstack/cinder-scheduler-0" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.531665 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-j5v97"] Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.634896 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-vnmhp"] Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.636297 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-vnmhp" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.654692 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.655052 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-vnmhp"] Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.689879 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5bea70c1-286c-41fa-af9a-42bb8568af3e-dns-svc\") pod \"dnsmasq-dns-6d97fcdd8f-vnmhp\" (UID: \"5bea70c1-286c-41fa-af9a-42bb8568af3e\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-vnmhp" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.689965 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5bea70c1-286c-41fa-af9a-42bb8568af3e-ovsdbserver-sb\") pod \"dnsmasq-dns-6d97fcdd8f-vnmhp\" (UID: \"5bea70c1-286c-41fa-af9a-42bb8568af3e\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-vnmhp" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.690036 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km5dn\" (UniqueName: \"kubernetes.io/projected/5bea70c1-286c-41fa-af9a-42bb8568af3e-kube-api-access-km5dn\") pod \"dnsmasq-dns-6d97fcdd8f-vnmhp\" (UID: \"5bea70c1-286c-41fa-af9a-42bb8568af3e\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-vnmhp" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.690099 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5bea70c1-286c-41fa-af9a-42bb8568af3e-ovsdbserver-nb\") pod \"dnsmasq-dns-6d97fcdd8f-vnmhp\" (UID: \"5bea70c1-286c-41fa-af9a-42bb8568af3e\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-vnmhp" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.690183 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bea70c1-286c-41fa-af9a-42bb8568af3e-config\") pod \"dnsmasq-dns-6d97fcdd8f-vnmhp\" (UID: \"5bea70c1-286c-41fa-af9a-42bb8568af3e\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-vnmhp" Nov 22 
03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.761419 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.762799 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.765716 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.773068 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.791864 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a237d2b-602d-4330-a413-59df6708b4d3-scripts\") pod \"cinder-api-0\" (UID: \"7a237d2b-602d-4330-a413-59df6708b4d3\") " pod="openstack/cinder-api-0" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.791950 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5srrj\" (UniqueName: \"kubernetes.io/projected/7a237d2b-602d-4330-a413-59df6708b4d3-kube-api-access-5srrj\") pod \"cinder-api-0\" (UID: \"7a237d2b-602d-4330-a413-59df6708b4d3\") " pod="openstack/cinder-api-0" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.791974 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5bea70c1-286c-41fa-af9a-42bb8568af3e-ovsdbserver-nb\") pod \"dnsmasq-dns-6d97fcdd8f-vnmhp\" (UID: \"5bea70c1-286c-41fa-af9a-42bb8568af3e\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-vnmhp" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.792004 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bea70c1-286c-41fa-af9a-42bb8568af3e-config\") pod \"dnsmasq-dns-6d97fcdd8f-vnmhp\" (UID: \"5bea70c1-286c-41fa-af9a-42bb8568af3e\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-vnmhp" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.792026 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a237d2b-602d-4330-a413-59df6708b4d3-config-data-custom\") pod \"cinder-api-0\" (UID: \"7a237d2b-602d-4330-a413-59df6708b4d3\") " pod="openstack/cinder-api-0" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.792044 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a237d2b-602d-4330-a413-59df6708b4d3-logs\") pod \"cinder-api-0\" (UID: \"7a237d2b-602d-4330-a413-59df6708b4d3\") " pod="openstack/cinder-api-0" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.792074 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5bea70c1-286c-41fa-af9a-42bb8568af3e-dns-svc\") pod \"dnsmasq-dns-6d97fcdd8f-vnmhp\" (UID: \"5bea70c1-286c-41fa-af9a-42bb8568af3e\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-vnmhp" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.792096 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a237d2b-602d-4330-a413-59df6708b4d3-combined-ca-bundle\") pod \"cinder-api-0\" (UID: 
\"7a237d2b-602d-4330-a413-59df6708b4d3\") " pod="openstack/cinder-api-0" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.792121 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5bea70c1-286c-41fa-af9a-42bb8568af3e-ovsdbserver-sb\") pod \"dnsmasq-dns-6d97fcdd8f-vnmhp\" (UID: \"5bea70c1-286c-41fa-af9a-42bb8568af3e\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-vnmhp" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.792137 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a237d2b-602d-4330-a413-59df6708b4d3-config-data\") pod \"cinder-api-0\" (UID: \"7a237d2b-602d-4330-a413-59df6708b4d3\") " pod="openstack/cinder-api-0" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.792178 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7a237d2b-602d-4330-a413-59df6708b4d3-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7a237d2b-602d-4330-a413-59df6708b4d3\") " pod="openstack/cinder-api-0" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.792196 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km5dn\" (UniqueName: \"kubernetes.io/projected/5bea70c1-286c-41fa-af9a-42bb8568af3e-kube-api-access-km5dn\") pod \"dnsmasq-dns-6d97fcdd8f-vnmhp\" (UID: \"5bea70c1-286c-41fa-af9a-42bb8568af3e\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-vnmhp" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.793467 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5bea70c1-286c-41fa-af9a-42bb8568af3e-dns-svc\") pod \"dnsmasq-dns-6d97fcdd8f-vnmhp\" (UID: \"5bea70c1-286c-41fa-af9a-42bb8568af3e\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-vnmhp" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.793644 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5bea70c1-286c-41fa-af9a-42bb8568af3e-ovsdbserver-sb\") pod \"dnsmasq-dns-6d97fcdd8f-vnmhp\" (UID: \"5bea70c1-286c-41fa-af9a-42bb8568af3e\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-vnmhp" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.793778 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5bea70c1-286c-41fa-af9a-42bb8568af3e-ovsdbserver-nb\") pod \"dnsmasq-dns-6d97fcdd8f-vnmhp\" (UID: \"5bea70c1-286c-41fa-af9a-42bb8568af3e\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-vnmhp" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.794354 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bea70c1-286c-41fa-af9a-42bb8568af3e-config\") pod \"dnsmasq-dns-6d97fcdd8f-vnmhp\" (UID: \"5bea70c1-286c-41fa-af9a-42bb8568af3e\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-vnmhp" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.814284 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km5dn\" (UniqueName: \"kubernetes.io/projected/5bea70c1-286c-41fa-af9a-42bb8568af3e-kube-api-access-km5dn\") pod \"dnsmasq-dns-6d97fcdd8f-vnmhp\" (UID: \"5bea70c1-286c-41fa-af9a-42bb8568af3e\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-vnmhp" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 
03:10:59.894770 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a237d2b-602d-4330-a413-59df6708b4d3-config-data\") pod \"cinder-api-0\" (UID: \"7a237d2b-602d-4330-a413-59df6708b4d3\") " pod="openstack/cinder-api-0" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.895076 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7a237d2b-602d-4330-a413-59df6708b4d3-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7a237d2b-602d-4330-a413-59df6708b4d3\") " pod="openstack/cinder-api-0" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.895113 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a237d2b-602d-4330-a413-59df6708b4d3-scripts\") pod \"cinder-api-0\" (UID: \"7a237d2b-602d-4330-a413-59df6708b4d3\") " pod="openstack/cinder-api-0" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.895142 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5srrj\" (UniqueName: \"kubernetes.io/projected/7a237d2b-602d-4330-a413-59df6708b4d3-kube-api-access-5srrj\") pod \"cinder-api-0\" (UID: \"7a237d2b-602d-4330-a413-59df6708b4d3\") " pod="openstack/cinder-api-0" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.895185 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a237d2b-602d-4330-a413-59df6708b4d3-config-data-custom\") pod \"cinder-api-0\" (UID: \"7a237d2b-602d-4330-a413-59df6708b4d3\") " pod="openstack/cinder-api-0" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.895205 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a237d2b-602d-4330-a413-59df6708b4d3-logs\") pod \"cinder-api-0\" (UID: \"7a237d2b-602d-4330-a413-59df6708b4d3\") " pod="openstack/cinder-api-0" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.895242 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a237d2b-602d-4330-a413-59df6708b4d3-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7a237d2b-602d-4330-a413-59df6708b4d3\") " pod="openstack/cinder-api-0" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.895532 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7a237d2b-602d-4330-a413-59df6708b4d3-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7a237d2b-602d-4330-a413-59df6708b4d3\") " pod="openstack/cinder-api-0" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.896079 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a237d2b-602d-4330-a413-59df6708b4d3-logs\") pod \"cinder-api-0\" (UID: \"7a237d2b-602d-4330-a413-59df6708b4d3\") " pod="openstack/cinder-api-0" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.900065 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a237d2b-602d-4330-a413-59df6708b4d3-config-data-custom\") pod \"cinder-api-0\" (UID: \"7a237d2b-602d-4330-a413-59df6708b4d3\") " pod="openstack/cinder-api-0"
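The VerifyControllerAttachedVolume / MountVolume / UnmountVolume traffic that dominates this window is one reconciliation loop comparing the desired world (volumes the scheduled pods declare) against the actual world (volumes currently mounted): volumes missing from the actual state get a MountVolume.SetUp, and leftovers from deleted pods get an UnmountVolume.TearDown followed by a "Volume detached" record. A deliberately tiny sketch of that desired-vs-actual loop (the names are illustrative, not kubelet's volumemanager types):

```go
package main

import "fmt"

// reconcile walks both directions of the diff: mount what is desired
// but absent, unmount what is present but no longer desired.
func reconcile(desired, actual map[string]bool) {
	for v := range desired {
		if !actual[v] {
			fmt.Println("operationExecutor.MountVolume started for volume", v)
			actual[v] = true // MountVolume.SetUp succeeded
		}
	}
	for v := range actual {
		if !desired[v] {
			fmt.Println("operationExecutor.UnmountVolume started for volume", v)
			delete(actual, v) // UnmountVolume.TearDown succeeded, volume detached
		}
	}
}

func main() {
	desired := map[string]bool{"config-data": true, "scripts": true, "kube-api-access-5srrj": true}
	actual := map[string]bool{"kube-api-access-q57hk": true} // left over from the deleted db-sync pod
	reconcile(desired, actual)
	fmt.Println("mounted:", len(actual)) // 3
}
```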
Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.903218 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a237d2b-602d-4330-a413-59df6708b4d3-config-data\") pod \"cinder-api-0\" (UID: \"7a237d2b-602d-4330-a413-59df6708b4d3\") " pod="openstack/cinder-api-0" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.911148 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a237d2b-602d-4330-a413-59df6708b4d3-scripts\") pod \"cinder-api-0\" (UID: \"7a237d2b-602d-4330-a413-59df6708b4d3\") " pod="openstack/cinder-api-0" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.914964 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a237d2b-602d-4330-a413-59df6708b4d3-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7a237d2b-602d-4330-a413-59df6708b4d3\") " pod="openstack/cinder-api-0" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.916559 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5srrj\" (UniqueName: \"kubernetes.io/projected/7a237d2b-602d-4330-a413-59df6708b4d3-kube-api-access-5srrj\") pod \"cinder-api-0\" (UID: \"7a237d2b-602d-4330-a413-59df6708b4d3\") " pod="openstack/cinder-api-0" Nov 22 03:10:59 crc kubenswrapper[4922]: I1122 03:10:59.969689 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-vnmhp" Nov 22 03:11:00 crc kubenswrapper[4922]: I1122 03:11:00.050688 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bb684768f-j5v97" podUID="0d1e5299-5db6-4251-ad4a-c5a1a137035a" containerName="dnsmasq-dns" containerID="cri-o://2f37fe6e35bfeaa4d51267961141ddf34c899a6c7572c91aab8109a10a76d551" gracePeriod=10 Nov 22 03:11:00 crc kubenswrapper[4922]: I1122 03:11:00.095425 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 22 03:11:00 crc kubenswrapper[4922]: I1122 03:11:00.306694 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 22 03:11:00 crc kubenswrapper[4922]: W1122 03:11:00.308718 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54874e8f_0cb8_4295_a7f8_b6265cd72612.slice/crio-a0638a84398d8a56bb6116b220366edcb984f4bab9158606fdbb7398131db132 WatchSource:0}: Error finding container a0638a84398d8a56bb6116b220366edcb984f4bab9158606fdbb7398131db132: Status 404 returned error can't find the container with id a0638a84398d8a56bb6116b220366edcb984f4bab9158606fdbb7398131db132 Nov 22 03:11:00 crc kubenswrapper[4922]: I1122 03:11:00.452790 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-vnmhp"] Nov 22 03:11:00 crc kubenswrapper[4922]: I1122 03:11:00.578149 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 22 03:11:00 crc kubenswrapper[4922]: I1122 03:11:00.643569 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-j5v97" Nov 22 03:11:00 crc kubenswrapper[4922]: I1122 03:11:00.709459 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d1e5299-5db6-4251-ad4a-c5a1a137035a-ovsdbserver-sb\") pod \"0d1e5299-5db6-4251-ad4a-c5a1a137035a\" (UID: \"0d1e5299-5db6-4251-ad4a-c5a1a137035a\") " Nov 22 03:11:00 crc kubenswrapper[4922]: I1122 03:11:00.709546 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d1e5299-5db6-4251-ad4a-c5a1a137035a-ovsdbserver-nb\") pod \"0d1e5299-5db6-4251-ad4a-c5a1a137035a\" (UID: \"0d1e5299-5db6-4251-ad4a-c5a1a137035a\") " Nov 22 03:11:00 crc kubenswrapper[4922]: I1122 03:11:00.710344 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d1e5299-5db6-4251-ad4a-c5a1a137035a-dns-svc\") pod \"0d1e5299-5db6-4251-ad4a-c5a1a137035a\" (UID: \"0d1e5299-5db6-4251-ad4a-c5a1a137035a\") " Nov 22 03:11:00 crc kubenswrapper[4922]: I1122 03:11:00.710403 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwv8q\" (UniqueName: \"kubernetes.io/projected/0d1e5299-5db6-4251-ad4a-c5a1a137035a-kube-api-access-pwv8q\") pod \"0d1e5299-5db6-4251-ad4a-c5a1a137035a\" (UID: \"0d1e5299-5db6-4251-ad4a-c5a1a137035a\") " Nov 22 03:11:00 crc kubenswrapper[4922]: I1122 03:11:00.710432 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d1e5299-5db6-4251-ad4a-c5a1a137035a-config\") pod \"0d1e5299-5db6-4251-ad4a-c5a1a137035a\" (UID: \"0d1e5299-5db6-4251-ad4a-c5a1a137035a\") " Nov 22 03:11:00 crc kubenswrapper[4922]: I1122 03:11:00.726816 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d1e5299-5db6-4251-ad4a-c5a1a137035a-kube-api-access-pwv8q" (OuterVolumeSpecName: "kube-api-access-pwv8q") pod "0d1e5299-5db6-4251-ad4a-c5a1a137035a" (UID: "0d1e5299-5db6-4251-ad4a-c5a1a137035a"). InnerVolumeSpecName "kube-api-access-pwv8q". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:11:00 crc kubenswrapper[4922]: I1122 03:11:00.803005 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d1e5299-5db6-4251-ad4a-c5a1a137035a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0d1e5299-5db6-4251-ad4a-c5a1a137035a" (UID: "0d1e5299-5db6-4251-ad4a-c5a1a137035a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:11:00 crc kubenswrapper[4922]: I1122 03:11:00.803072 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d1e5299-5db6-4251-ad4a-c5a1a137035a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0d1e5299-5db6-4251-ad4a-c5a1a137035a" (UID: "0d1e5299-5db6-4251-ad4a-c5a1a137035a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:11:00 crc kubenswrapper[4922]: I1122 03:11:00.811492 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d1e5299-5db6-4251-ad4a-c5a1a137035a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0d1e5299-5db6-4251-ad4a-c5a1a137035a" (UID: "0d1e5299-5db6-4251-ad4a-c5a1a137035a"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:11:00 crc kubenswrapper[4922]: I1122 03:11:00.812653 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d1e5299-5db6-4251-ad4a-c5a1a137035a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:00 crc kubenswrapper[4922]: I1122 03:11:00.812671 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d1e5299-5db6-4251-ad4a-c5a1a137035a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:00 crc kubenswrapper[4922]: I1122 03:11:00.812682 4922 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d1e5299-5db6-4251-ad4a-c5a1a137035a-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:00 crc kubenswrapper[4922]: I1122 03:11:00.812695 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwv8q\" (UniqueName: \"kubernetes.io/projected/0d1e5299-5db6-4251-ad4a-c5a1a137035a-kube-api-access-pwv8q\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:00 crc kubenswrapper[4922]: I1122 03:11:00.823247 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d1e5299-5db6-4251-ad4a-c5a1a137035a-config" (OuterVolumeSpecName: "config") pod "0d1e5299-5db6-4251-ad4a-c5a1a137035a" (UID: "0d1e5299-5db6-4251-ad4a-c5a1a137035a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:11:00 crc kubenswrapper[4922]: I1122 03:11:00.913626 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d1e5299-5db6-4251-ad4a-c5a1a137035a-config\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:01 crc kubenswrapper[4922]: I1122 03:11:01.061776 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"54874e8f-0cb8-4295-a7f8-b6265cd72612","Type":"ContainerStarted","Data":"a0638a84398d8a56bb6116b220366edcb984f4bab9158606fdbb7398131db132"} Nov 22 03:11:01 crc kubenswrapper[4922]: I1122 03:11:01.063376 4922 generic.go:334] "Generic (PLEG): container finished" podID="5bea70c1-286c-41fa-af9a-42bb8568af3e" containerID="680949f8dfc31583d928406ee647a731c9fdb8d61efe1a880064c100f2ad4db9" exitCode=0 Nov 22 03:11:01 crc kubenswrapper[4922]: I1122 03:11:01.063453 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-vnmhp" event={"ID":"5bea70c1-286c-41fa-af9a-42bb8568af3e","Type":"ContainerDied","Data":"680949f8dfc31583d928406ee647a731c9fdb8d61efe1a880064c100f2ad4db9"} Nov 22 03:11:01 crc kubenswrapper[4922]: I1122 03:11:01.063504 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-vnmhp" event={"ID":"5bea70c1-286c-41fa-af9a-42bb8568af3e","Type":"ContainerStarted","Data":"f26e5d274f05a6b149510f200397fabe8ebf0c366b78795ac7db98f68a7de1b2"} Nov 22 03:11:01 crc kubenswrapper[4922]: I1122 03:11:01.065408 4922 generic.go:334] "Generic (PLEG): container finished" podID="0d1e5299-5db6-4251-ad4a-c5a1a137035a" containerID="2f37fe6e35bfeaa4d51267961141ddf34c899a6c7572c91aab8109a10a76d551" exitCode=0 Nov 22 03:11:01 crc kubenswrapper[4922]: I1122 03:11:01.065468 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-j5v97" event={"ID":"0d1e5299-5db6-4251-ad4a-c5a1a137035a","Type":"ContainerDied","Data":"2f37fe6e35bfeaa4d51267961141ddf34c899a6c7572c91aab8109a10a76d551"} 
Nov 22 03:11:01 crc kubenswrapper[4922]: I1122 03:11:01.065507 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-j5v97" Nov 22 03:11:01 crc kubenswrapper[4922]: I1122 03:11:01.065527 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-j5v97" event={"ID":"0d1e5299-5db6-4251-ad4a-c5a1a137035a","Type":"ContainerDied","Data":"d7d8323ddcf9e785723ca7869a60f5d16a45800adffdf5c51ed7a35db6f21484"} Nov 22 03:11:01 crc kubenswrapper[4922]: I1122 03:11:01.065552 4922 scope.go:117] "RemoveContainer" containerID="2f37fe6e35bfeaa4d51267961141ddf34c899a6c7572c91aab8109a10a76d551" Nov 22 03:11:01 crc kubenswrapper[4922]: I1122 03:11:01.067496 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7a237d2b-602d-4330-a413-59df6708b4d3","Type":"ContainerStarted","Data":"8a42cdd5dd58bcec48691cf3a853270d7d0213f153e3ad1f9eb0be3d6bf10e74"} Nov 22 03:11:01 crc kubenswrapper[4922]: I1122 03:11:01.109420 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-j5v97"] Nov 22 03:11:01 crc kubenswrapper[4922]: I1122 03:11:01.116258 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-j5v97"] Nov 22 03:11:01 crc kubenswrapper[4922]: I1122 03:11:01.141609 4922 scope.go:117] "RemoveContainer" containerID="73bd7ccd44b5df7409f75a41cb58148156c596696fc2a130b67d8a622dc1b9bf" Nov 22 03:11:01 crc kubenswrapper[4922]: I1122 03:11:01.239354 4922 scope.go:117] "RemoveContainer" containerID="2f37fe6e35bfeaa4d51267961141ddf34c899a6c7572c91aab8109a10a76d551" Nov 22 03:11:01 crc kubenswrapper[4922]: E1122 03:11:01.239697 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f37fe6e35bfeaa4d51267961141ddf34c899a6c7572c91aab8109a10a76d551\": container with ID starting with 2f37fe6e35bfeaa4d51267961141ddf34c899a6c7572c91aab8109a10a76d551 not found: ID does not exist" containerID="2f37fe6e35bfeaa4d51267961141ddf34c899a6c7572c91aab8109a10a76d551" Nov 22 03:11:01 crc kubenswrapper[4922]: I1122 03:11:01.239723 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f37fe6e35bfeaa4d51267961141ddf34c899a6c7572c91aab8109a10a76d551"} err="failed to get container status \"2f37fe6e35bfeaa4d51267961141ddf34c899a6c7572c91aab8109a10a76d551\": rpc error: code = NotFound desc = could not find container \"2f37fe6e35bfeaa4d51267961141ddf34c899a6c7572c91aab8109a10a76d551\": container with ID starting with 2f37fe6e35bfeaa4d51267961141ddf34c899a6c7572c91aab8109a10a76d551 not found: ID does not exist" Nov 22 03:11:01 crc kubenswrapper[4922]: I1122 03:11:01.239742 4922 scope.go:117] "RemoveContainer" containerID="73bd7ccd44b5df7409f75a41cb58148156c596696fc2a130b67d8a622dc1b9bf" Nov 22 03:11:01 crc kubenswrapper[4922]: E1122 03:11:01.239999 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73bd7ccd44b5df7409f75a41cb58148156c596696fc2a130b67d8a622dc1b9bf\": container with ID starting with 73bd7ccd44b5df7409f75a41cb58148156c596696fc2a130b67d8a622dc1b9bf not found: ID does not exist" containerID="73bd7ccd44b5df7409f75a41cb58148156c596696fc2a130b67d8a622dc1b9bf" Nov 22 03:11:01 crc kubenswrapper[4922]: I1122 03:11:01.240044 4922 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"73bd7ccd44b5df7409f75a41cb58148156c596696fc2a130b67d8a622dc1b9bf"} err="failed to get container status \"73bd7ccd44b5df7409f75a41cb58148156c596696fc2a130b67d8a622dc1b9bf\": rpc error: code = NotFound desc = could not find container \"73bd7ccd44b5df7409f75a41cb58148156c596696fc2a130b67d8a622dc1b9bf\": container with ID starting with 73bd7ccd44b5df7409f75a41cb58148156c596696fc2a130b67d8a622dc1b9bf not found: ID does not exist" Nov 22 03:11:01 crc kubenswrapper[4922]: I1122 03:11:01.312574 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d1e5299-5db6-4251-ad4a-c5a1a137035a" path="/var/lib/kubelet/pods/0d1e5299-5db6-4251-ad4a-c5a1a137035a/volumes" Nov 22 03:11:01 crc kubenswrapper[4922]: I1122 03:11:01.707613 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Nov 22 03:11:02 crc kubenswrapper[4922]: I1122 03:11:02.102076 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"54874e8f-0cb8-4295-a7f8-b6265cd72612","Type":"ContainerStarted","Data":"0e57051ba2f5d17c005570fa3221899cd1f3f3dfa77e5f19c298be166c7bfcee"} Nov 22 03:11:02 crc kubenswrapper[4922]: I1122 03:11:02.135629 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-vnmhp" event={"ID":"5bea70c1-286c-41fa-af9a-42bb8568af3e","Type":"ContainerStarted","Data":"92888f6953488ada7ba19dff750a370e33f1d2e025ff5385b66595d73bfaea9c"} Nov 22 03:11:02 crc kubenswrapper[4922]: I1122 03:11:02.136468 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d97fcdd8f-vnmhp" Nov 22 03:11:02 crc kubenswrapper[4922]: I1122 03:11:02.152651 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7a237d2b-602d-4330-a413-59df6708b4d3","Type":"ContainerStarted","Data":"55fc01849451a2795728718dc33869ac00e8ee0f054dbe9e71adccaa36f93a58"} Nov 22 03:11:02 crc kubenswrapper[4922]: I1122 03:11:02.172303 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d97fcdd8f-vnmhp" podStartSLOduration=3.172285677 podStartE2EDuration="3.172285677s" podCreationTimestamp="2025-11-22 03:10:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:11:02.160988046 +0000 UTC m=+1098.199509938" watchObservedRunningTime="2025-11-22 03:11:02.172285677 +0000 UTC m=+1098.210807569" Nov 22 03:11:02 crc kubenswrapper[4922]: I1122 03:11:02.886388 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-56c6fc5546-zz2lj" Nov 22 03:11:02 crc kubenswrapper[4922]: I1122 03:11:02.887016 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-56c6fc5546-zz2lj" Nov 22 03:11:03 crc kubenswrapper[4922]: I1122 03:11:03.171284 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"54874e8f-0cb8-4295-a7f8-b6265cd72612","Type":"ContainerStarted","Data":"c92550f82d065d57a34173c313c19aa17604ec921dc9aa056f9dfc92e50f502a"} Nov 22 03:11:03 crc kubenswrapper[4922]: I1122 03:11:03.174934 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7a237d2b-602d-4330-a413-59df6708b4d3","Type":"ContainerStarted","Data":"83da513231a638efdc2c8a5e91ff63918cc92fb9b9a68035bbb944089c535360"} Nov 22 03:11:03 crc kubenswrapper[4922]: I1122 03:11:03.175805 
4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="7a237d2b-602d-4330-a413-59df6708b4d3" containerName="cinder-api-log" containerID="cri-o://55fc01849451a2795728718dc33869ac00e8ee0f054dbe9e71adccaa36f93a58" gracePeriod=30 Nov 22 03:11:03 crc kubenswrapper[4922]: I1122 03:11:03.175920 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="7a237d2b-602d-4330-a413-59df6708b4d3" containerName="cinder-api" containerID="cri-o://83da513231a638efdc2c8a5e91ff63918cc92fb9b9a68035bbb944089c535360" gracePeriod=30 Nov 22 03:11:03 crc kubenswrapper[4922]: I1122 03:11:03.197391 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.293251245 podStartE2EDuration="4.197366268s" podCreationTimestamp="2025-11-22 03:10:59 +0000 UTC" firstStartedPulling="2025-11-22 03:11:00.315952758 +0000 UTC m=+1096.354474650" lastFinishedPulling="2025-11-22 03:11:01.220067771 +0000 UTC m=+1097.258589673" observedRunningTime="2025-11-22 03:11:03.193707741 +0000 UTC m=+1099.232229633" watchObservedRunningTime="2025-11-22 03:11:03.197366268 +0000 UTC m=+1099.235888160" Nov 22 03:11:03 crc kubenswrapper[4922]: I1122 03:11:03.219225 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.219204422 podStartE2EDuration="4.219204422s" podCreationTimestamp="2025-11-22 03:10:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:11:03.212538252 +0000 UTC m=+1099.251060154" watchObservedRunningTime="2025-11-22 03:11:03.219204422 +0000 UTC m=+1099.257726314" Nov 22 03:11:03 crc kubenswrapper[4922]: I1122 03:11:03.780767 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 22 03:11:03 crc kubenswrapper[4922]: I1122 03:11:03.872988 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a237d2b-602d-4330-a413-59df6708b4d3-logs\") pod \"7a237d2b-602d-4330-a413-59df6708b4d3\" (UID: \"7a237d2b-602d-4330-a413-59df6708b4d3\") " Nov 22 03:11:03 crc kubenswrapper[4922]: I1122 03:11:03.873070 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a237d2b-602d-4330-a413-59df6708b4d3-config-data-custom\") pod \"7a237d2b-602d-4330-a413-59df6708b4d3\" (UID: \"7a237d2b-602d-4330-a413-59df6708b4d3\") " Nov 22 03:11:03 crc kubenswrapper[4922]: I1122 03:11:03.873120 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7a237d2b-602d-4330-a413-59df6708b4d3-etc-machine-id\") pod \"7a237d2b-602d-4330-a413-59df6708b4d3\" (UID: \"7a237d2b-602d-4330-a413-59df6708b4d3\") " Nov 22 03:11:03 crc kubenswrapper[4922]: I1122 03:11:03.873196 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a237d2b-602d-4330-a413-59df6708b4d3-combined-ca-bundle\") pod \"7a237d2b-602d-4330-a413-59df6708b4d3\" (UID: \"7a237d2b-602d-4330-a413-59df6708b4d3\") " Nov 22 03:11:03 crc kubenswrapper[4922]: I1122 03:11:03.873243 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a237d2b-602d-4330-a413-59df6708b4d3-scripts\") pod \"7a237d2b-602d-4330-a413-59df6708b4d3\" (UID: \"7a237d2b-602d-4330-a413-59df6708b4d3\") " Nov 22 03:11:03 crc kubenswrapper[4922]: I1122 03:11:03.873294 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5srrj\" (UniqueName: \"kubernetes.io/projected/7a237d2b-602d-4330-a413-59df6708b4d3-kube-api-access-5srrj\") pod \"7a237d2b-602d-4330-a413-59df6708b4d3\" (UID: \"7a237d2b-602d-4330-a413-59df6708b4d3\") " Nov 22 03:11:03 crc kubenswrapper[4922]: I1122 03:11:03.873390 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a237d2b-602d-4330-a413-59df6708b4d3-config-data\") pod \"7a237d2b-602d-4330-a413-59df6708b4d3\" (UID: \"7a237d2b-602d-4330-a413-59df6708b4d3\") " Nov 22 03:11:03 crc kubenswrapper[4922]: I1122 03:11:03.873402 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a237d2b-602d-4330-a413-59df6708b4d3-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "7a237d2b-602d-4330-a413-59df6708b4d3" (UID: "7a237d2b-602d-4330-a413-59df6708b4d3"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 03:11:03 crc kubenswrapper[4922]: I1122 03:11:03.873483 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a237d2b-602d-4330-a413-59df6708b4d3-logs" (OuterVolumeSpecName: "logs") pod "7a237d2b-602d-4330-a413-59df6708b4d3" (UID: "7a237d2b-602d-4330-a413-59df6708b4d3"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:11:03 crc kubenswrapper[4922]: I1122 03:11:03.873818 4922 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7a237d2b-602d-4330-a413-59df6708b4d3-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:03 crc kubenswrapper[4922]: I1122 03:11:03.873836 4922 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a237d2b-602d-4330-a413-59df6708b4d3-logs\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:03 crc kubenswrapper[4922]: I1122 03:11:03.878949 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a237d2b-602d-4330-a413-59df6708b4d3-scripts" (OuterVolumeSpecName: "scripts") pod "7a237d2b-602d-4330-a413-59df6708b4d3" (UID: "7a237d2b-602d-4330-a413-59df6708b4d3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:03 crc kubenswrapper[4922]: I1122 03:11:03.883023 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a237d2b-602d-4330-a413-59df6708b4d3-kube-api-access-5srrj" (OuterVolumeSpecName: "kube-api-access-5srrj") pod "7a237d2b-602d-4330-a413-59df6708b4d3" (UID: "7a237d2b-602d-4330-a413-59df6708b4d3"). InnerVolumeSpecName "kube-api-access-5srrj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:11:03 crc kubenswrapper[4922]: I1122 03:11:03.883993 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a237d2b-602d-4330-a413-59df6708b4d3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7a237d2b-602d-4330-a413-59df6708b4d3" (UID: "7a237d2b-602d-4330-a413-59df6708b4d3"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:03 crc kubenswrapper[4922]: I1122 03:11:03.901885 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a237d2b-602d-4330-a413-59df6708b4d3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a237d2b-602d-4330-a413-59df6708b4d3" (UID: "7a237d2b-602d-4330-a413-59df6708b4d3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:03 crc kubenswrapper[4922]: I1122 03:11:03.924631 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a237d2b-602d-4330-a413-59df6708b4d3-config-data" (OuterVolumeSpecName: "config-data") pod "7a237d2b-602d-4330-a413-59df6708b4d3" (UID: "7a237d2b-602d-4330-a413-59df6708b4d3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:03 crc kubenswrapper[4922]: I1122 03:11:03.974995 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a237d2b-602d-4330-a413-59df6708b4d3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:03 crc kubenswrapper[4922]: I1122 03:11:03.975053 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a237d2b-602d-4330-a413-59df6708b4d3-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:03 crc kubenswrapper[4922]: I1122 03:11:03.975063 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5srrj\" (UniqueName: \"kubernetes.io/projected/7a237d2b-602d-4330-a413-59df6708b4d3-kube-api-access-5srrj\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:03 crc kubenswrapper[4922]: I1122 03:11:03.975074 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a237d2b-602d-4330-a413-59df6708b4d3-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:03 crc kubenswrapper[4922]: I1122 03:11:03.975085 4922 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a237d2b-602d-4330-a413-59df6708b4d3-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.199612 4922 generic.go:334] "Generic (PLEG): container finished" podID="7a237d2b-602d-4330-a413-59df6708b4d3" containerID="83da513231a638efdc2c8a5e91ff63918cc92fb9b9a68035bbb944089c535360" exitCode=0 Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.199653 4922 generic.go:334] "Generic (PLEG): container finished" podID="7a237d2b-602d-4330-a413-59df6708b4d3" containerID="55fc01849451a2795728718dc33869ac00e8ee0f054dbe9e71adccaa36f93a58" exitCode=143 Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.199744 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.199746 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7a237d2b-602d-4330-a413-59df6708b4d3","Type":"ContainerDied","Data":"83da513231a638efdc2c8a5e91ff63918cc92fb9b9a68035bbb944089c535360"} Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.199834 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7a237d2b-602d-4330-a413-59df6708b4d3","Type":"ContainerDied","Data":"55fc01849451a2795728718dc33869ac00e8ee0f054dbe9e71adccaa36f93a58"} Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.199879 4922 scope.go:117] "RemoveContainer" containerID="83da513231a638efdc2c8a5e91ff63918cc92fb9b9a68035bbb944089c535360" Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.199909 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7a237d2b-602d-4330-a413-59df6708b4d3","Type":"ContainerDied","Data":"8a42cdd5dd58bcec48691cf3a853270d7d0213f153e3ad1f9eb0be3d6bf10e74"} Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.236434 4922 scope.go:117] "RemoveContainer" containerID="55fc01849451a2795728718dc33869ac00e8ee0f054dbe9e71adccaa36f93a58" Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.246433 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.254371 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.258168 4922 scope.go:117] "RemoveContainer" containerID="83da513231a638efdc2c8a5e91ff63918cc92fb9b9a68035bbb944089c535360" Nov 22 03:11:04 crc kubenswrapper[4922]: E1122 03:11:04.258663 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83da513231a638efdc2c8a5e91ff63918cc92fb9b9a68035bbb944089c535360\": container with ID starting with 83da513231a638efdc2c8a5e91ff63918cc92fb9b9a68035bbb944089c535360 not found: ID does not exist" containerID="83da513231a638efdc2c8a5e91ff63918cc92fb9b9a68035bbb944089c535360" Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.258704 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83da513231a638efdc2c8a5e91ff63918cc92fb9b9a68035bbb944089c535360"} err="failed to get container status \"83da513231a638efdc2c8a5e91ff63918cc92fb9b9a68035bbb944089c535360\": rpc error: code = NotFound desc = could not find container \"83da513231a638efdc2c8a5e91ff63918cc92fb9b9a68035bbb944089c535360\": container with ID starting with 83da513231a638efdc2c8a5e91ff63918cc92fb9b9a68035bbb944089c535360 not found: ID does not exist" Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.258726 4922 scope.go:117] "RemoveContainer" containerID="55fc01849451a2795728718dc33869ac00e8ee0f054dbe9e71adccaa36f93a58" Nov 22 03:11:04 crc kubenswrapper[4922]: E1122 03:11:04.259064 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55fc01849451a2795728718dc33869ac00e8ee0f054dbe9e71adccaa36f93a58\": container with ID starting with 55fc01849451a2795728718dc33869ac00e8ee0f054dbe9e71adccaa36f93a58 not found: ID does not exist" containerID="55fc01849451a2795728718dc33869ac00e8ee0f054dbe9e71adccaa36f93a58" Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.259086 4922 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55fc01849451a2795728718dc33869ac00e8ee0f054dbe9e71adccaa36f93a58"} err="failed to get container status \"55fc01849451a2795728718dc33869ac00e8ee0f054dbe9e71adccaa36f93a58\": rpc error: code = NotFound desc = could not find container \"55fc01849451a2795728718dc33869ac00e8ee0f054dbe9e71adccaa36f93a58\": container with ID starting with 55fc01849451a2795728718dc33869ac00e8ee0f054dbe9e71adccaa36f93a58 not found: ID does not exist" Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.259107 4922 scope.go:117] "RemoveContainer" containerID="83da513231a638efdc2c8a5e91ff63918cc92fb9b9a68035bbb944089c535360" Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.260878 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83da513231a638efdc2c8a5e91ff63918cc92fb9b9a68035bbb944089c535360"} err="failed to get container status \"83da513231a638efdc2c8a5e91ff63918cc92fb9b9a68035bbb944089c535360\": rpc error: code = NotFound desc = could not find container \"83da513231a638efdc2c8a5e91ff63918cc92fb9b9a68035bbb944089c535360\": container with ID starting with 83da513231a638efdc2c8a5e91ff63918cc92fb9b9a68035bbb944089c535360 not found: ID does not exist" Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.260900 4922 scope.go:117] "RemoveContainer" containerID="55fc01849451a2795728718dc33869ac00e8ee0f054dbe9e71adccaa36f93a58" Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.261150 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55fc01849451a2795728718dc33869ac00e8ee0f054dbe9e71adccaa36f93a58"} err="failed to get container status \"55fc01849451a2795728718dc33869ac00e8ee0f054dbe9e71adccaa36f93a58\": rpc error: code = NotFound desc = could not find container \"55fc01849451a2795728718dc33869ac00e8ee0f054dbe9e71adccaa36f93a58\": container with ID starting with 55fc01849451a2795728718dc33869ac00e8ee0f054dbe9e71adccaa36f93a58 not found: ID does not exist" Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.266125 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Nov 22 03:11:04 crc kubenswrapper[4922]: E1122 03:11:04.266589 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d1e5299-5db6-4251-ad4a-c5a1a137035a" containerName="dnsmasq-dns" Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.266612 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d1e5299-5db6-4251-ad4a-c5a1a137035a" containerName="dnsmasq-dns" Nov 22 03:11:04 crc kubenswrapper[4922]: E1122 03:11:04.266634 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a237d2b-602d-4330-a413-59df6708b4d3" containerName="cinder-api" Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.266644 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a237d2b-602d-4330-a413-59df6708b4d3" containerName="cinder-api" Nov 22 03:11:04 crc kubenswrapper[4922]: E1122 03:11:04.266658 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a237d2b-602d-4330-a413-59df6708b4d3" containerName="cinder-api-log" Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.266665 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a237d2b-602d-4330-a413-59df6708b4d3" containerName="cinder-api-log" Nov 22 03:11:04 crc kubenswrapper[4922]: E1122 03:11:04.266674 4922 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0d1e5299-5db6-4251-ad4a-c5a1a137035a" containerName="init" Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.266679 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d1e5299-5db6-4251-ad4a-c5a1a137035a" containerName="init" Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.266956 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d1e5299-5db6-4251-ad4a-c5a1a137035a" containerName="dnsmasq-dns" Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.266986 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a237d2b-602d-4330-a413-59df6708b4d3" containerName="cinder-api" Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.266998 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a237d2b-602d-4330-a413-59df6708b4d3" containerName="cinder-api-log" Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.268174 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.274404 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.274546 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.274588 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.282399 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.381267 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a1ed907-da19-4420-b5d8-3523a3020796-public-tls-certs\") pod \"cinder-api-0\" (UID: \"8a1ed907-da19-4420-b5d8-3523a3020796\") " pod="openstack/cinder-api-0" Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.381517 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a1ed907-da19-4420-b5d8-3523a3020796-logs\") pod \"cinder-api-0\" (UID: \"8a1ed907-da19-4420-b5d8-3523a3020796\") " pod="openstack/cinder-api-0" Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.381560 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a1ed907-da19-4420-b5d8-3523a3020796-config-data\") pod \"cinder-api-0\" (UID: \"8a1ed907-da19-4420-b5d8-3523a3020796\") " pod="openstack/cinder-api-0" Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.381587 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a1ed907-da19-4420-b5d8-3523a3020796-config-data-custom\") pod \"cinder-api-0\" (UID: \"8a1ed907-da19-4420-b5d8-3523a3020796\") " pod="openstack/cinder-api-0" Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.381629 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a1ed907-da19-4420-b5d8-3523a3020796-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"8a1ed907-da19-4420-b5d8-3523a3020796\") " pod="openstack/cinder-api-0" Nov 22 03:11:04 crc 
kubenswrapper[4922]: I1122 03:11:04.381666 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a1ed907-da19-4420-b5d8-3523a3020796-scripts\") pod \"cinder-api-0\" (UID: \"8a1ed907-da19-4420-b5d8-3523a3020796\") " pod="openstack/cinder-api-0" Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.381730 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a1ed907-da19-4420-b5d8-3523a3020796-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8a1ed907-da19-4420-b5d8-3523a3020796\") " pod="openstack/cinder-api-0" Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.381774 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a1ed907-da19-4420-b5d8-3523a3020796-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8a1ed907-da19-4420-b5d8-3523a3020796\") " pod="openstack/cinder-api-0" Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.381821 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzmmg\" (UniqueName: \"kubernetes.io/projected/8a1ed907-da19-4420-b5d8-3523a3020796-kube-api-access-jzmmg\") pod \"cinder-api-0\" (UID: \"8a1ed907-da19-4420-b5d8-3523a3020796\") " pod="openstack/cinder-api-0" Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.483745 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a1ed907-da19-4420-b5d8-3523a3020796-public-tls-certs\") pod \"cinder-api-0\" (UID: \"8a1ed907-da19-4420-b5d8-3523a3020796\") " pod="openstack/cinder-api-0" Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.484921 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a1ed907-da19-4420-b5d8-3523a3020796-logs\") pod \"cinder-api-0\" (UID: \"8a1ed907-da19-4420-b5d8-3523a3020796\") " pod="openstack/cinder-api-0" Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.485046 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a1ed907-da19-4420-b5d8-3523a3020796-config-data\") pod \"cinder-api-0\" (UID: \"8a1ed907-da19-4420-b5d8-3523a3020796\") " pod="openstack/cinder-api-0" Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.485141 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a1ed907-da19-4420-b5d8-3523a3020796-config-data-custom\") pod \"cinder-api-0\" (UID: \"8a1ed907-da19-4420-b5d8-3523a3020796\") " pod="openstack/cinder-api-0" Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.485236 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a1ed907-da19-4420-b5d8-3523a3020796-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"8a1ed907-da19-4420-b5d8-3523a3020796\") " pod="openstack/cinder-api-0" Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.485338 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a1ed907-da19-4420-b5d8-3523a3020796-scripts\") pod \"cinder-api-0\" (UID: 
\"8a1ed907-da19-4420-b5d8-3523a3020796\") " pod="openstack/cinder-api-0" Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.485463 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a1ed907-da19-4420-b5d8-3523a3020796-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8a1ed907-da19-4420-b5d8-3523a3020796\") " pod="openstack/cinder-api-0" Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.485537 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a1ed907-da19-4420-b5d8-3523a3020796-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8a1ed907-da19-4420-b5d8-3523a3020796\") " pod="openstack/cinder-api-0" Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.485630 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a1ed907-da19-4420-b5d8-3523a3020796-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8a1ed907-da19-4420-b5d8-3523a3020796\") " pod="openstack/cinder-api-0" Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.485406 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a1ed907-da19-4420-b5d8-3523a3020796-logs\") pod \"cinder-api-0\" (UID: \"8a1ed907-da19-4420-b5d8-3523a3020796\") " pod="openstack/cinder-api-0" Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.491958 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a1ed907-da19-4420-b5d8-3523a3020796-config-data\") pod \"cinder-api-0\" (UID: \"8a1ed907-da19-4420-b5d8-3523a3020796\") " pod="openstack/cinder-api-0" Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.485775 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzmmg\" (UniqueName: \"kubernetes.io/projected/8a1ed907-da19-4420-b5d8-3523a3020796-kube-api-access-jzmmg\") pod \"cinder-api-0\" (UID: \"8a1ed907-da19-4420-b5d8-3523a3020796\") " pod="openstack/cinder-api-0" Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.493034 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a1ed907-da19-4420-b5d8-3523a3020796-config-data-custom\") pod \"cinder-api-0\" (UID: \"8a1ed907-da19-4420-b5d8-3523a3020796\") " pod="openstack/cinder-api-0" Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.495970 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a1ed907-da19-4420-b5d8-3523a3020796-scripts\") pod \"cinder-api-0\" (UID: \"8a1ed907-da19-4420-b5d8-3523a3020796\") " pod="openstack/cinder-api-0" Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.496328 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a1ed907-da19-4420-b5d8-3523a3020796-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8a1ed907-da19-4420-b5d8-3523a3020796\") " pod="openstack/cinder-api-0" Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.497628 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a1ed907-da19-4420-b5d8-3523a3020796-public-tls-certs\") pod \"cinder-api-0\" (UID: \"8a1ed907-da19-4420-b5d8-3523a3020796\") " 
pod="openstack/cinder-api-0" Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.511293 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a1ed907-da19-4420-b5d8-3523a3020796-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"8a1ed907-da19-4420-b5d8-3523a3020796\") " pod="openstack/cinder-api-0" Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.519758 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzmmg\" (UniqueName: \"kubernetes.io/projected/8a1ed907-da19-4420-b5d8-3523a3020796-kube-api-access-jzmmg\") pod \"cinder-api-0\" (UID: \"8a1ed907-da19-4420-b5d8-3523a3020796\") " pod="openstack/cinder-api-0" Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.628031 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 22 03:11:04 crc kubenswrapper[4922]: I1122 03:11:04.657414 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Nov 22 03:11:05 crc kubenswrapper[4922]: W1122 03:11:05.068201 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a1ed907_da19_4420_b5d8_3523a3020796.slice/crio-3a8f8aed9d33105323dcde3c408e3a918382df8424d96b1152da52ae07adce86 WatchSource:0}: Error finding container 3a8f8aed9d33105323dcde3c408e3a918382df8424d96b1152da52ae07adce86: Status 404 returned error can't find the container with id 3a8f8aed9d33105323dcde3c408e3a918382df8424d96b1152da52ae07adce86 Nov 22 03:11:05 crc kubenswrapper[4922]: I1122 03:11:05.069160 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 22 03:11:05 crc kubenswrapper[4922]: I1122 03:11:05.102557 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-7347-account-create-h8cg8"] Nov 22 03:11:05 crc kubenswrapper[4922]: I1122 03:11:05.105116 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-7347-account-create-h8cg8" Nov 22 03:11:05 crc kubenswrapper[4922]: I1122 03:11:05.106784 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Nov 22 03:11:05 crc kubenswrapper[4922]: I1122 03:11:05.129301 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-7347-account-create-h8cg8"] Nov 22 03:11:05 crc kubenswrapper[4922]: I1122 03:11:05.213373 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gcq2\" (UniqueName: \"kubernetes.io/projected/b5a72db7-2799-420c-a519-b87d52b4c2f8-kube-api-access-5gcq2\") pod \"nova-api-7347-account-create-h8cg8\" (UID: \"b5a72db7-2799-420c-a519-b87d52b4c2f8\") " pod="openstack/nova-api-7347-account-create-h8cg8" Nov 22 03:11:05 crc kubenswrapper[4922]: I1122 03:11:05.225958 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8a1ed907-da19-4420-b5d8-3523a3020796","Type":"ContainerStarted","Data":"3a8f8aed9d33105323dcde3c408e3a918382df8424d96b1152da52ae07adce86"} Nov 22 03:11:05 crc kubenswrapper[4922]: I1122 03:11:05.315478 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gcq2\" (UniqueName: \"kubernetes.io/projected/b5a72db7-2799-420c-a519-b87d52b4c2f8-kube-api-access-5gcq2\") pod \"nova-api-7347-account-create-h8cg8\" (UID: \"b5a72db7-2799-420c-a519-b87d52b4c2f8\") " pod="openstack/nova-api-7347-account-create-h8cg8" Nov 22 03:11:05 crc kubenswrapper[4922]: I1122 03:11:05.317291 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a237d2b-602d-4330-a413-59df6708b4d3" path="/var/lib/kubelet/pods/7a237d2b-602d-4330-a413-59df6708b4d3/volumes" Nov 22 03:11:05 crc kubenswrapper[4922]: I1122 03:11:05.318510 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-1f35-account-create-rjd56"] Nov 22 03:11:05 crc kubenswrapper[4922]: I1122 03:11:05.320147 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-1f35-account-create-rjd56" Nov 22 03:11:05 crc kubenswrapper[4922]: I1122 03:11:05.322446 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Nov 22 03:11:05 crc kubenswrapper[4922]: I1122 03:11:05.334056 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-1f35-account-create-rjd56"] Nov 22 03:11:05 crc kubenswrapper[4922]: I1122 03:11:05.346766 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gcq2\" (UniqueName: \"kubernetes.io/projected/b5a72db7-2799-420c-a519-b87d52b4c2f8-kube-api-access-5gcq2\") pod \"nova-api-7347-account-create-h8cg8\" (UID: \"b5a72db7-2799-420c-a519-b87d52b4c2f8\") " pod="openstack/nova-api-7347-account-create-h8cg8" Nov 22 03:11:05 crc kubenswrapper[4922]: I1122 03:11:05.416958 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z64xr\" (UniqueName: \"kubernetes.io/projected/345a62f6-ef27-4695-89bf-4a0f47e9b7c5-kube-api-access-z64xr\") pod \"nova-cell0-1f35-account-create-rjd56\" (UID: \"345a62f6-ef27-4695-89bf-4a0f47e9b7c5\") " pod="openstack/nova-cell0-1f35-account-create-rjd56" Nov 22 03:11:05 crc kubenswrapper[4922]: I1122 03:11:05.440238 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-7347-account-create-h8cg8" Nov 22 03:11:05 crc kubenswrapper[4922]: I1122 03:11:05.520018 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z64xr\" (UniqueName: \"kubernetes.io/projected/345a62f6-ef27-4695-89bf-4a0f47e9b7c5-kube-api-access-z64xr\") pod \"nova-cell0-1f35-account-create-rjd56\" (UID: \"345a62f6-ef27-4695-89bf-4a0f47e9b7c5\") " pod="openstack/nova-cell0-1f35-account-create-rjd56" Nov 22 03:11:05 crc kubenswrapper[4922]: I1122 03:11:05.524304 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-2bf9-account-create-4565f"] Nov 22 03:11:05 crc kubenswrapper[4922]: I1122 03:11:05.525502 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-2bf9-account-create-4565f" Nov 22 03:11:05 crc kubenswrapper[4922]: I1122 03:11:05.529196 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Nov 22 03:11:05 crc kubenswrapper[4922]: I1122 03:11:05.537216 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z64xr\" (UniqueName: \"kubernetes.io/projected/345a62f6-ef27-4695-89bf-4a0f47e9b7c5-kube-api-access-z64xr\") pod \"nova-cell0-1f35-account-create-rjd56\" (UID: \"345a62f6-ef27-4695-89bf-4a0f47e9b7c5\") " pod="openstack/nova-cell0-1f35-account-create-rjd56" Nov 22 03:11:05 crc kubenswrapper[4922]: I1122 03:11:05.539287 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-2bf9-account-create-4565f"] Nov 22 03:11:05 crc kubenswrapper[4922]: I1122 03:11:05.621502 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgtkd\" (UniqueName: \"kubernetes.io/projected/fd67a834-6ea6-4a59-b57a-00add67daa32-kube-api-access-tgtkd\") pod \"nova-cell1-2bf9-account-create-4565f\" (UID: \"fd67a834-6ea6-4a59-b57a-00add67daa32\") " pod="openstack/nova-cell1-2bf9-account-create-4565f" Nov 22 03:11:05 crc kubenswrapper[4922]: I1122 03:11:05.723188 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgtkd\" (UniqueName: \"kubernetes.io/projected/fd67a834-6ea6-4a59-b57a-00add67daa32-kube-api-access-tgtkd\") pod \"nova-cell1-2bf9-account-create-4565f\" (UID: \"fd67a834-6ea6-4a59-b57a-00add67daa32\") " pod="openstack/nova-cell1-2bf9-account-create-4565f" Nov 22 03:11:05 crc kubenswrapper[4922]: I1122 03:11:05.723942 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-1f35-account-create-rjd56" Nov 22 03:11:05 crc kubenswrapper[4922]: I1122 03:11:05.774560 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgtkd\" (UniqueName: \"kubernetes.io/projected/fd67a834-6ea6-4a59-b57a-00add67daa32-kube-api-access-tgtkd\") pod \"nova-cell1-2bf9-account-create-4565f\" (UID: \"fd67a834-6ea6-4a59-b57a-00add67daa32\") " pod="openstack/nova-cell1-2bf9-account-create-4565f" Nov 22 03:11:05 crc kubenswrapper[4922]: I1122 03:11:05.922175 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-2bf9-account-create-4565f" Nov 22 03:11:06 crc kubenswrapper[4922]: I1122 03:11:06.071395 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-7347-account-create-h8cg8"] Nov 22 03:11:06 crc kubenswrapper[4922]: W1122 03:11:06.089996 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5a72db7_2799_420c_a519_b87d52b4c2f8.slice/crio-a3103b83f7f4a5156a316795f472a4afecb3e13627d81987834405e570de99f3 WatchSource:0}: Error finding container a3103b83f7f4a5156a316795f472a4afecb3e13627d81987834405e570de99f3: Status 404 returned error can't find the container with id a3103b83f7f4a5156a316795f472a4afecb3e13627d81987834405e570de99f3 Nov 22 03:11:06 crc kubenswrapper[4922]: I1122 03:11:06.255689 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8a1ed907-da19-4420-b5d8-3523a3020796","Type":"ContainerStarted","Data":"a6484bca2f6b0671f140e1bab448fc2a76f7304f735aefb26f50c7016b577032"} Nov 22 03:11:06 crc kubenswrapper[4922]: I1122 03:11:06.255816 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-1f35-account-create-rjd56"] Nov 22 03:11:06 crc kubenswrapper[4922]: I1122 03:11:06.260367 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-7347-account-create-h8cg8" event={"ID":"b5a72db7-2799-420c-a519-b87d52b4c2f8","Type":"ContainerStarted","Data":"a3103b83f7f4a5156a316795f472a4afecb3e13627d81987834405e570de99f3"} Nov 22 03:11:06 crc kubenswrapper[4922]: I1122 03:11:06.282343 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:11:06 crc kubenswrapper[4922]: I1122 03:11:06.282634 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9843715c-f3c1-4c18-af9e-545d5fa5da4e" containerName="proxy-httpd" containerID="cri-o://a65d54a3937659b5a58f9c195d7200c68b24faf8a7390d01c3fc06be0aa43812" gracePeriod=30 Nov 22 03:11:06 crc kubenswrapper[4922]: I1122 03:11:06.282666 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9843715c-f3c1-4c18-af9e-545d5fa5da4e" containerName="sg-core" containerID="cri-o://84ce9ada924799311a337f4bda8399899f52d19b2ca775f0910be28158b2507a" gracePeriod=30 Nov 22 03:11:06 crc kubenswrapper[4922]: I1122 03:11:06.282722 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9843715c-f3c1-4c18-af9e-545d5fa5da4e" containerName="ceilometer-notification-agent" containerID="cri-o://560a4a5c608d011b99a869b7235b014e760215178450ddc88e09b47d44cafd62" gracePeriod=30 Nov 22 03:11:06 crc kubenswrapper[4922]: I1122 03:11:06.282610 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9843715c-f3c1-4c18-af9e-545d5fa5da4e" containerName="ceilometer-central-agent" containerID="cri-o://937e59c825e19e64051ec5abfabd8307d3c56110aec5294159e69b93b3e794ab" gracePeriod=30 Nov 22 03:11:06 crc kubenswrapper[4922]: I1122 03:11:06.349582 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-2bf9-account-create-4565f"] Nov 22 03:11:07 crc kubenswrapper[4922]: I1122 03:11:07.271855 4922 generic.go:334] "Generic (PLEG): container finished" podID="9843715c-f3c1-4c18-af9e-545d5fa5da4e" containerID="a65d54a3937659b5a58f9c195d7200c68b24faf8a7390d01c3fc06be0aa43812" 
exitCode=0 Nov 22 03:11:07 crc kubenswrapper[4922]: I1122 03:11:07.272154 4922 generic.go:334] "Generic (PLEG): container finished" podID="9843715c-f3c1-4c18-af9e-545d5fa5da4e" containerID="84ce9ada924799311a337f4bda8399899f52d19b2ca775f0910be28158b2507a" exitCode=2 Nov 22 03:11:07 crc kubenswrapper[4922]: I1122 03:11:07.272165 4922 generic.go:334] "Generic (PLEG): container finished" podID="9843715c-f3c1-4c18-af9e-545d5fa5da4e" containerID="937e59c825e19e64051ec5abfabd8307d3c56110aec5294159e69b93b3e794ab" exitCode=0 Nov 22 03:11:07 crc kubenswrapper[4922]: I1122 03:11:07.272201 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9843715c-f3c1-4c18-af9e-545d5fa5da4e","Type":"ContainerDied","Data":"a65d54a3937659b5a58f9c195d7200c68b24faf8a7390d01c3fc06be0aa43812"} Nov 22 03:11:07 crc kubenswrapper[4922]: I1122 03:11:07.272225 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9843715c-f3c1-4c18-af9e-545d5fa5da4e","Type":"ContainerDied","Data":"84ce9ada924799311a337f4bda8399899f52d19b2ca775f0910be28158b2507a"} Nov 22 03:11:07 crc kubenswrapper[4922]: I1122 03:11:07.272234 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9843715c-f3c1-4c18-af9e-545d5fa5da4e","Type":"ContainerDied","Data":"937e59c825e19e64051ec5abfabd8307d3c56110aec5294159e69b93b3e794ab"} Nov 22 03:11:07 crc kubenswrapper[4922]: I1122 03:11:07.274911 4922 generic.go:334] "Generic (PLEG): container finished" podID="fd67a834-6ea6-4a59-b57a-00add67daa32" containerID="9e02d916b38320aa828bed472018c9efb01ca7e640be52fb2798cd924c7eaefe" exitCode=0 Nov 22 03:11:07 crc kubenswrapper[4922]: I1122 03:11:07.275012 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-2bf9-account-create-4565f" event={"ID":"fd67a834-6ea6-4a59-b57a-00add67daa32","Type":"ContainerDied","Data":"9e02d916b38320aa828bed472018c9efb01ca7e640be52fb2798cd924c7eaefe"} Nov 22 03:11:07 crc kubenswrapper[4922]: I1122 03:11:07.275044 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-2bf9-account-create-4565f" event={"ID":"fd67a834-6ea6-4a59-b57a-00add67daa32","Type":"ContainerStarted","Data":"0b90ec77d298cb0d4f73fd2ecb2b0a10cff695037b1fda01e11b483cab695daa"} Nov 22 03:11:07 crc kubenswrapper[4922]: I1122 03:11:07.276749 4922 generic.go:334] "Generic (PLEG): container finished" podID="345a62f6-ef27-4695-89bf-4a0f47e9b7c5" containerID="559563baf27b35c4b2d7025184e49acec34716dab61b7c583423892a2341c9e8" exitCode=0 Nov 22 03:11:07 crc kubenswrapper[4922]: I1122 03:11:07.276833 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-1f35-account-create-rjd56" event={"ID":"345a62f6-ef27-4695-89bf-4a0f47e9b7c5","Type":"ContainerDied","Data":"559563baf27b35c4b2d7025184e49acec34716dab61b7c583423892a2341c9e8"} Nov 22 03:11:07 crc kubenswrapper[4922]: I1122 03:11:07.276879 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-1f35-account-create-rjd56" event={"ID":"345a62f6-ef27-4695-89bf-4a0f47e9b7c5","Type":"ContainerStarted","Data":"83fba203de24147a0e1ad4f1bce91a7d3d906f9f4da506f5ec7ca5c315ea3979"} Nov 22 03:11:07 crc kubenswrapper[4922]: I1122 03:11:07.279409 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8a1ed907-da19-4420-b5d8-3523a3020796","Type":"ContainerStarted","Data":"4461e26ac1a9a584b676b94e88f42b0b267cdcb78abc5d6fb9061d5f0592948b"} Nov 22 03:11:07 crc kubenswrapper[4922]: I1122 
03:11:07.279617 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Nov 22 03:11:07 crc kubenswrapper[4922]: I1122 03:11:07.281348 4922 generic.go:334] "Generic (PLEG): container finished" podID="b5a72db7-2799-420c-a519-b87d52b4c2f8" containerID="98e5df228c56af426f3fb084ade9b55926281ad7bf69f0c8360a68430e3b9a92" exitCode=0 Nov 22 03:11:07 crc kubenswrapper[4922]: I1122 03:11:07.281379 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-7347-account-create-h8cg8" event={"ID":"b5a72db7-2799-420c-a519-b87d52b4c2f8","Type":"ContainerDied","Data":"98e5df228c56af426f3fb084ade9b55926281ad7bf69f0c8360a68430e3b9a92"} Nov 22 03:11:07 crc kubenswrapper[4922]: I1122 03:11:07.330046 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.330024022 podStartE2EDuration="3.330024022s" podCreationTimestamp="2025-11-22 03:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:11:07.322344749 +0000 UTC m=+1103.360866661" watchObservedRunningTime="2025-11-22 03:11:07.330024022 +0000 UTC m=+1103.368545924" Nov 22 03:11:08 crc kubenswrapper[4922]: I1122 03:11:08.295046 4922 generic.go:334] "Generic (PLEG): container finished" podID="9843715c-f3c1-4c18-af9e-545d5fa5da4e" containerID="560a4a5c608d011b99a869b7235b014e760215178450ddc88e09b47d44cafd62" exitCode=0 Nov 22 03:11:08 crc kubenswrapper[4922]: I1122 03:11:08.295280 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9843715c-f3c1-4c18-af9e-545d5fa5da4e","Type":"ContainerDied","Data":"560a4a5c608d011b99a869b7235b014e760215178450ddc88e09b47d44cafd62"} Nov 22 03:11:08 crc kubenswrapper[4922]: I1122 03:11:08.781495 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-1f35-account-create-rjd56" Nov 22 03:11:08 crc kubenswrapper[4922]: I1122 03:11:08.898891 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z64xr\" (UniqueName: \"kubernetes.io/projected/345a62f6-ef27-4695-89bf-4a0f47e9b7c5-kube-api-access-z64xr\") pod \"345a62f6-ef27-4695-89bf-4a0f47e9b7c5\" (UID: \"345a62f6-ef27-4695-89bf-4a0f47e9b7c5\") " Nov 22 03:11:08 crc kubenswrapper[4922]: I1122 03:11:08.903899 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-7347-account-create-h8cg8" Nov 22 03:11:08 crc kubenswrapper[4922]: I1122 03:11:08.904189 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/345a62f6-ef27-4695-89bf-4a0f47e9b7c5-kube-api-access-z64xr" (OuterVolumeSpecName: "kube-api-access-z64xr") pod "345a62f6-ef27-4695-89bf-4a0f47e9b7c5" (UID: "345a62f6-ef27-4695-89bf-4a0f47e9b7c5"). InnerVolumeSpecName "kube-api-access-z64xr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:11:08 crc kubenswrapper[4922]: I1122 03:11:08.986196 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-2bf9-account-create-4565f" Nov 22 03:11:08 crc kubenswrapper[4922]: I1122 03:11:08.992828 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.000438 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gcq2\" (UniqueName: \"kubernetes.io/projected/b5a72db7-2799-420c-a519-b87d52b4c2f8-kube-api-access-5gcq2\") pod \"b5a72db7-2799-420c-a519-b87d52b4c2f8\" (UID: \"b5a72db7-2799-420c-a519-b87d52b4c2f8\") " Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.001022 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z64xr\" (UniqueName: \"kubernetes.io/projected/345a62f6-ef27-4695-89bf-4a0f47e9b7c5-kube-api-access-z64xr\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.003552 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5a72db7-2799-420c-a519-b87d52b4c2f8-kube-api-access-5gcq2" (OuterVolumeSpecName: "kube-api-access-5gcq2") pod "b5a72db7-2799-420c-a519-b87d52b4c2f8" (UID: "b5a72db7-2799-420c-a519-b87d52b4c2f8"). InnerVolumeSpecName "kube-api-access-5gcq2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.102402 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9843715c-f3c1-4c18-af9e-545d5fa5da4e-run-httpd\") pod \"9843715c-f3c1-4c18-af9e-545d5fa5da4e\" (UID: \"9843715c-f3c1-4c18-af9e-545d5fa5da4e\") " Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.102460 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9843715c-f3c1-4c18-af9e-545d5fa5da4e-combined-ca-bundle\") pod \"9843715c-f3c1-4c18-af9e-545d5fa5da4e\" (UID: \"9843715c-f3c1-4c18-af9e-545d5fa5da4e\") " Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.102496 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9843715c-f3c1-4c18-af9e-545d5fa5da4e-sg-core-conf-yaml\") pod \"9843715c-f3c1-4c18-af9e-545d5fa5da4e\" (UID: \"9843715c-f3c1-4c18-af9e-545d5fa5da4e\") " Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.102559 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9843715c-f3c1-4c18-af9e-545d5fa5da4e-scripts\") pod \"9843715c-f3c1-4c18-af9e-545d5fa5da4e\" (UID: \"9843715c-f3c1-4c18-af9e-545d5fa5da4e\") " Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.102692 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgtkd\" (UniqueName: \"kubernetes.io/projected/fd67a834-6ea6-4a59-b57a-00add67daa32-kube-api-access-tgtkd\") pod \"fd67a834-6ea6-4a59-b57a-00add67daa32\" (UID: \"fd67a834-6ea6-4a59-b57a-00add67daa32\") " Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.102747 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65hbj\" (UniqueName: \"kubernetes.io/projected/9843715c-f3c1-4c18-af9e-545d5fa5da4e-kube-api-access-65hbj\") pod \"9843715c-f3c1-4c18-af9e-545d5fa5da4e\" (UID: \"9843715c-f3c1-4c18-af9e-545d5fa5da4e\") " Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.102824 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9843715c-f3c1-4c18-af9e-545d5fa5da4e-log-httpd\") pod 
\"9843715c-f3c1-4c18-af9e-545d5fa5da4e\" (UID: \"9843715c-f3c1-4c18-af9e-545d5fa5da4e\") " Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.102911 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9843715c-f3c1-4c18-af9e-545d5fa5da4e-config-data\") pod \"9843715c-f3c1-4c18-af9e-545d5fa5da4e\" (UID: \"9843715c-f3c1-4c18-af9e-545d5fa5da4e\") " Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.102962 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9843715c-f3c1-4c18-af9e-545d5fa5da4e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9843715c-f3c1-4c18-af9e-545d5fa5da4e" (UID: "9843715c-f3c1-4c18-af9e-545d5fa5da4e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.103322 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9843715c-f3c1-4c18-af9e-545d5fa5da4e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9843715c-f3c1-4c18-af9e-545d5fa5da4e" (UID: "9843715c-f3c1-4c18-af9e-545d5fa5da4e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.104108 4922 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9843715c-f3c1-4c18-af9e-545d5fa5da4e-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.104145 4922 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9843715c-f3c1-4c18-af9e-545d5fa5da4e-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.104164 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gcq2\" (UniqueName: \"kubernetes.io/projected/b5a72db7-2799-420c-a519-b87d52b4c2f8-kube-api-access-5gcq2\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.105723 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9843715c-f3c1-4c18-af9e-545d5fa5da4e-scripts" (OuterVolumeSpecName: "scripts") pod "9843715c-f3c1-4c18-af9e-545d5fa5da4e" (UID: "9843715c-f3c1-4c18-af9e-545d5fa5da4e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.106173 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd67a834-6ea6-4a59-b57a-00add67daa32-kube-api-access-tgtkd" (OuterVolumeSpecName: "kube-api-access-tgtkd") pod "fd67a834-6ea6-4a59-b57a-00add67daa32" (UID: "fd67a834-6ea6-4a59-b57a-00add67daa32"). InnerVolumeSpecName "kube-api-access-tgtkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.106741 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9843715c-f3c1-4c18-af9e-545d5fa5da4e-kube-api-access-65hbj" (OuterVolumeSpecName: "kube-api-access-65hbj") pod "9843715c-f3c1-4c18-af9e-545d5fa5da4e" (UID: "9843715c-f3c1-4c18-af9e-545d5fa5da4e"). InnerVolumeSpecName "kube-api-access-65hbj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.127975 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9843715c-f3c1-4c18-af9e-545d5fa5da4e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9843715c-f3c1-4c18-af9e-545d5fa5da4e" (UID: "9843715c-f3c1-4c18-af9e-545d5fa5da4e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.165370 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9843715c-f3c1-4c18-af9e-545d5fa5da4e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9843715c-f3c1-4c18-af9e-545d5fa5da4e" (UID: "9843715c-f3c1-4c18-af9e-545d5fa5da4e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.188915 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9843715c-f3c1-4c18-af9e-545d5fa5da4e-config-data" (OuterVolumeSpecName: "config-data") pod "9843715c-f3c1-4c18-af9e-545d5fa5da4e" (UID: "9843715c-f3c1-4c18-af9e-545d5fa5da4e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.205531 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9843715c-f3c1-4c18-af9e-545d5fa5da4e-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.205563 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9843715c-f3c1-4c18-af9e-545d5fa5da4e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.205575 4922 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9843715c-f3c1-4c18-af9e-545d5fa5da4e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.205583 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9843715c-f3c1-4c18-af9e-545d5fa5da4e-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.205592 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgtkd\" (UniqueName: \"kubernetes.io/projected/fd67a834-6ea6-4a59-b57a-00add67daa32-kube-api-access-tgtkd\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.205602 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65hbj\" (UniqueName: \"kubernetes.io/projected/9843715c-f3c1-4c18-af9e-545d5fa5da4e-kube-api-access-65hbj\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.310615 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-2bf9-account-create-4565f" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.312848 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-1f35-account-create-rjd56" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.314215 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-7347-account-create-h8cg8" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.316939 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.321578 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-2bf9-account-create-4565f" event={"ID":"fd67a834-6ea6-4a59-b57a-00add67daa32","Type":"ContainerDied","Data":"0b90ec77d298cb0d4f73fd2ecb2b0a10cff695037b1fda01e11b483cab695daa"} Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.324981 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b90ec77d298cb0d4f73fd2ecb2b0a10cff695037b1fda01e11b483cab695daa" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.325043 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-1f35-account-create-rjd56" event={"ID":"345a62f6-ef27-4695-89bf-4a0f47e9b7c5","Type":"ContainerDied","Data":"83fba203de24147a0e1ad4f1bce91a7d3d906f9f4da506f5ec7ca5c315ea3979"} Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.327999 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83fba203de24147a0e1ad4f1bce91a7d3d906f9f4da506f5ec7ca5c315ea3979" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.328055 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-7347-account-create-h8cg8" event={"ID":"b5a72db7-2799-420c-a519-b87d52b4c2f8","Type":"ContainerDied","Data":"a3103b83f7f4a5156a316795f472a4afecb3e13627d81987834405e570de99f3"} Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.328074 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3103b83f7f4a5156a316795f472a4afecb3e13627d81987834405e570de99f3" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.328090 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9843715c-f3c1-4c18-af9e-545d5fa5da4e","Type":"ContainerDied","Data":"455db90142fa5b0ee7bc33f5ce1076dbf7fc0d1515cce6bb8e3177d8a7d3b3c8"} Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.328158 4922 scope.go:117] "RemoveContainer" containerID="a65d54a3937659b5a58f9c195d7200c68b24faf8a7390d01c3fc06be0aa43812" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.353222 4922 scope.go:117] "RemoveContainer" containerID="84ce9ada924799311a337f4bda8399899f52d19b2ca775f0910be28158b2507a" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.382605 4922 scope.go:117] "RemoveContainer" containerID="560a4a5c608d011b99a869b7235b014e760215178450ddc88e09b47d44cafd62" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.390485 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.402918 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.427794 4922 scope.go:117] "RemoveContainer" containerID="937e59c825e19e64051ec5abfabd8307d3c56110aec5294159e69b93b3e794ab" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.454747 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:11:09 crc kubenswrapper[4922]: E1122 03:11:09.455216 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="345a62f6-ef27-4695-89bf-4a0f47e9b7c5" containerName="mariadb-account-create" Nov 22 03:11:09 crc 
kubenswrapper[4922]: I1122 03:11:09.455232 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="345a62f6-ef27-4695-89bf-4a0f47e9b7c5" containerName="mariadb-account-create" Nov 22 03:11:09 crc kubenswrapper[4922]: E1122 03:11:09.455254 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5a72db7-2799-420c-a519-b87d52b4c2f8" containerName="mariadb-account-create" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.455262 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5a72db7-2799-420c-a519-b87d52b4c2f8" containerName="mariadb-account-create" Nov 22 03:11:09 crc kubenswrapper[4922]: E1122 03:11:09.455274 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9843715c-f3c1-4c18-af9e-545d5fa5da4e" containerName="ceilometer-central-agent" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.455283 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="9843715c-f3c1-4c18-af9e-545d5fa5da4e" containerName="ceilometer-central-agent" Nov 22 03:11:09 crc kubenswrapper[4922]: E1122 03:11:09.455294 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9843715c-f3c1-4c18-af9e-545d5fa5da4e" containerName="ceilometer-notification-agent" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.455303 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="9843715c-f3c1-4c18-af9e-545d5fa5da4e" containerName="ceilometer-notification-agent" Nov 22 03:11:09 crc kubenswrapper[4922]: E1122 03:11:09.455368 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd67a834-6ea6-4a59-b57a-00add67daa32" containerName="mariadb-account-create" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.455376 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd67a834-6ea6-4a59-b57a-00add67daa32" containerName="mariadb-account-create" Nov 22 03:11:09 crc kubenswrapper[4922]: E1122 03:11:09.455394 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9843715c-f3c1-4c18-af9e-545d5fa5da4e" containerName="proxy-httpd" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.455402 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="9843715c-f3c1-4c18-af9e-545d5fa5da4e" containerName="proxy-httpd" Nov 22 03:11:09 crc kubenswrapper[4922]: E1122 03:11:09.455412 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9843715c-f3c1-4c18-af9e-545d5fa5da4e" containerName="sg-core" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.455420 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="9843715c-f3c1-4c18-af9e-545d5fa5da4e" containerName="sg-core" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.455628 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5a72db7-2799-420c-a519-b87d52b4c2f8" containerName="mariadb-account-create" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.455643 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd67a834-6ea6-4a59-b57a-00add67daa32" containerName="mariadb-account-create" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.455655 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="9843715c-f3c1-4c18-af9e-545d5fa5da4e" containerName="ceilometer-central-agent" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.455671 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="345a62f6-ef27-4695-89bf-4a0f47e9b7c5" containerName="mariadb-account-create" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.455684 4922 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="9843715c-f3c1-4c18-af9e-545d5fa5da4e" containerName="sg-core" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.455700 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="9843715c-f3c1-4c18-af9e-545d5fa5da4e" containerName="ceilometer-notification-agent" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.455716 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="9843715c-f3c1-4c18-af9e-545d5fa5da4e" containerName="proxy-httpd" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.458400 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.460782 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.461348 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.473738 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.516516 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fe05513-2eda-43de-b464-3731563f64a5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7fe05513-2eda-43de-b464-3731563f64a5\") " pod="openstack/ceilometer-0" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.516563 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7fe05513-2eda-43de-b464-3731563f64a5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7fe05513-2eda-43de-b464-3731563f64a5\") " pod="openstack/ceilometer-0" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.516606 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fe05513-2eda-43de-b464-3731563f64a5-config-data\") pod \"ceilometer-0\" (UID: \"7fe05513-2eda-43de-b464-3731563f64a5\") " pod="openstack/ceilometer-0" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.516689 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fe05513-2eda-43de-b464-3731563f64a5-scripts\") pod \"ceilometer-0\" (UID: \"7fe05513-2eda-43de-b464-3731563f64a5\") " pod="openstack/ceilometer-0" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.516712 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7fe05513-2eda-43de-b464-3731563f64a5-run-httpd\") pod \"ceilometer-0\" (UID: \"7fe05513-2eda-43de-b464-3731563f64a5\") " pod="openstack/ceilometer-0" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.516725 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7fe05513-2eda-43de-b464-3731563f64a5-log-httpd\") pod \"ceilometer-0\" (UID: \"7fe05513-2eda-43de-b464-3731563f64a5\") " pod="openstack/ceilometer-0" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.516745 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-rdfrq\" (UniqueName: \"kubernetes.io/projected/7fe05513-2eda-43de-b464-3731563f64a5-kube-api-access-rdfrq\") pod \"ceilometer-0\" (UID: \"7fe05513-2eda-43de-b464-3731563f64a5\") " pod="openstack/ceilometer-0" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.618819 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fe05513-2eda-43de-b464-3731563f64a5-config-data\") pod \"ceilometer-0\" (UID: \"7fe05513-2eda-43de-b464-3731563f64a5\") " pod="openstack/ceilometer-0" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.618883 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fe05513-2eda-43de-b464-3731563f64a5-scripts\") pod \"ceilometer-0\" (UID: \"7fe05513-2eda-43de-b464-3731563f64a5\") " pod="openstack/ceilometer-0" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.618910 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7fe05513-2eda-43de-b464-3731563f64a5-run-httpd\") pod \"ceilometer-0\" (UID: \"7fe05513-2eda-43de-b464-3731563f64a5\") " pod="openstack/ceilometer-0" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.618929 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7fe05513-2eda-43de-b464-3731563f64a5-log-httpd\") pod \"ceilometer-0\" (UID: \"7fe05513-2eda-43de-b464-3731563f64a5\") " pod="openstack/ceilometer-0" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.618952 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdfrq\" (UniqueName: \"kubernetes.io/projected/7fe05513-2eda-43de-b464-3731563f64a5-kube-api-access-rdfrq\") pod \"ceilometer-0\" (UID: \"7fe05513-2eda-43de-b464-3731563f64a5\") " pod="openstack/ceilometer-0" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.619028 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fe05513-2eda-43de-b464-3731563f64a5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7fe05513-2eda-43de-b464-3731563f64a5\") " pod="openstack/ceilometer-0" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.619055 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7fe05513-2eda-43de-b464-3731563f64a5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7fe05513-2eda-43de-b464-3731563f64a5\") " pod="openstack/ceilometer-0" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.619500 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7fe05513-2eda-43de-b464-3731563f64a5-run-httpd\") pod \"ceilometer-0\" (UID: \"7fe05513-2eda-43de-b464-3731563f64a5\") " pod="openstack/ceilometer-0" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.619543 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7fe05513-2eda-43de-b464-3731563f64a5-log-httpd\") pod \"ceilometer-0\" (UID: \"7fe05513-2eda-43de-b464-3731563f64a5\") " pod="openstack/ceilometer-0" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.622277 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/7fe05513-2eda-43de-b464-3731563f64a5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7fe05513-2eda-43de-b464-3731563f64a5\") " pod="openstack/ceilometer-0" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.622662 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fe05513-2eda-43de-b464-3731563f64a5-scripts\") pod \"ceilometer-0\" (UID: \"7fe05513-2eda-43de-b464-3731563f64a5\") " pod="openstack/ceilometer-0" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.623417 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fe05513-2eda-43de-b464-3731563f64a5-config-data\") pod \"ceilometer-0\" (UID: \"7fe05513-2eda-43de-b464-3731563f64a5\") " pod="openstack/ceilometer-0" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.623441 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fe05513-2eda-43de-b464-3731563f64a5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7fe05513-2eda-43de-b464-3731563f64a5\") " pod="openstack/ceilometer-0" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.634570 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdfrq\" (UniqueName: \"kubernetes.io/projected/7fe05513-2eda-43de-b464-3731563f64a5-kube-api-access-rdfrq\") pod \"ceilometer-0\" (UID: \"7fe05513-2eda-43de-b464-3731563f64a5\") " pod="openstack/ceilometer-0" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.777797 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.850426 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.896244 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 22 03:11:09 crc kubenswrapper[4922]: I1122 03:11:09.972986 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d97fcdd8f-vnmhp" Nov 22 03:11:10 crc kubenswrapper[4922]: I1122 03:11:10.032891 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-699df9757c-6xcfl"] Nov 22 03:11:10 crc kubenswrapper[4922]: I1122 03:11:10.033161 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-699df9757c-6xcfl" podUID="29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961" containerName="dnsmasq-dns" containerID="cri-o://1f2242e19de8d8a903b5b8ee00f2961be591707a1c4188b803f073e34e517863" gracePeriod=10 Nov 22 03:11:10 crc kubenswrapper[4922]: I1122 03:11:10.277624 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:11:10 crc kubenswrapper[4922]: I1122 03:11:10.342119 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7fe05513-2eda-43de-b464-3731563f64a5","Type":"ContainerStarted","Data":"f219096a500b1c1aed3e1878b1a9c9e5cb2f4fe8271b1b158ad41d015175357a"} Nov 22 03:11:10 crc kubenswrapper[4922]: I1122 03:11:10.349334 4922 generic.go:334] "Generic (PLEG): container finished" podID="29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961" containerID="1f2242e19de8d8a903b5b8ee00f2961be591707a1c4188b803f073e34e517863" exitCode=0 Nov 22 03:11:10 crc kubenswrapper[4922]: I1122 03:11:10.349437 4922 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-699df9757c-6xcfl" event={"ID":"29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961","Type":"ContainerDied","Data":"1f2242e19de8d8a903b5b8ee00f2961be591707a1c4188b803f073e34e517863"} Nov 22 03:11:10 crc kubenswrapper[4922]: I1122 03:11:10.349608 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="54874e8f-0cb8-4295-a7f8-b6265cd72612" containerName="probe" containerID="cri-o://c92550f82d065d57a34173c313c19aa17604ec921dc9aa056f9dfc92e50f502a" gracePeriod=30 Nov 22 03:11:10 crc kubenswrapper[4922]: I1122 03:11:10.349778 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="54874e8f-0cb8-4295-a7f8-b6265cd72612" containerName="cinder-scheduler" containerID="cri-o://0e57051ba2f5d17c005570fa3221899cd1f3f3dfa77e5f19c298be166c7bfcee" gracePeriod=30 Nov 22 03:11:10 crc kubenswrapper[4922]: I1122 03:11:10.437171 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-699df9757c-6xcfl" Nov 22 03:11:10 crc kubenswrapper[4922]: I1122 03:11:10.516277 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-h8fkq"] Nov 22 03:11:10 crc kubenswrapper[4922]: E1122 03:11:10.516669 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961" containerName="init" Nov 22 03:11:10 crc kubenswrapper[4922]: I1122 03:11:10.516686 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961" containerName="init" Nov 22 03:11:10 crc kubenswrapper[4922]: E1122 03:11:10.516712 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961" containerName="dnsmasq-dns" Nov 22 03:11:10 crc kubenswrapper[4922]: I1122 03:11:10.516718 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961" containerName="dnsmasq-dns" Nov 22 03:11:10 crc kubenswrapper[4922]: I1122 03:11:10.516979 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961" containerName="dnsmasq-dns" Nov 22 03:11:10 crc kubenswrapper[4922]: I1122 03:11:10.517567 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-h8fkq" Nov 22 03:11:10 crc kubenswrapper[4922]: I1122 03:11:10.519594 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-564hd" Nov 22 03:11:10 crc kubenswrapper[4922]: I1122 03:11:10.519765 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Nov 22 03:11:10 crc kubenswrapper[4922]: I1122 03:11:10.521562 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Nov 22 03:11:10 crc kubenswrapper[4922]: I1122 03:11:10.531073 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-h8fkq"] Nov 22 03:11:10 crc kubenswrapper[4922]: I1122 03:11:10.542321 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961-dns-svc\") pod \"29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961\" (UID: \"29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961\") " Nov 22 03:11:10 crc kubenswrapper[4922]: I1122 03:11:10.542367 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961-ovsdbserver-sb\") pod \"29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961\" (UID: \"29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961\") " Nov 22 03:11:10 crc kubenswrapper[4922]: I1122 03:11:10.542449 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961-config\") pod \"29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961\" (UID: \"29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961\") " Nov 22 03:11:10 crc kubenswrapper[4922]: I1122 03:11:10.542471 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jltpw\" (UniqueName: \"kubernetes.io/projected/29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961-kube-api-access-jltpw\") pod \"29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961\" (UID: \"29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961\") " Nov 22 03:11:10 crc kubenswrapper[4922]: I1122 03:11:10.542507 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961-ovsdbserver-nb\") pod \"29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961\" (UID: \"29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961\") " Nov 22 03:11:10 crc kubenswrapper[4922]: I1122 03:11:10.549168 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961-kube-api-access-jltpw" (OuterVolumeSpecName: "kube-api-access-jltpw") pod "29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961" (UID: "29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961"). InnerVolumeSpecName "kube-api-access-jltpw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:11:10 crc kubenswrapper[4922]: I1122 03:11:10.584658 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961" (UID: "29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:11:10 crc kubenswrapper[4922]: I1122 03:11:10.585175 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961" (UID: "29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:11:10 crc kubenswrapper[4922]: I1122 03:11:10.598179 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961" (UID: "29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:11:10 crc kubenswrapper[4922]: I1122 03:11:10.601291 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961-config" (OuterVolumeSpecName: "config") pod "29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961" (UID: "29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:11:10 crc kubenswrapper[4922]: I1122 03:11:10.644404 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gc4k8\" (UniqueName: \"kubernetes.io/projected/5b0138cd-b295-44c3-9f0b-1d5de2c8d144-kube-api-access-gc4k8\") pod \"nova-cell0-conductor-db-sync-h8fkq\" (UID: \"5b0138cd-b295-44c3-9f0b-1d5de2c8d144\") " pod="openstack/nova-cell0-conductor-db-sync-h8fkq" Nov 22 03:11:10 crc kubenswrapper[4922]: I1122 03:11:10.644489 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b0138cd-b295-44c3-9f0b-1d5de2c8d144-scripts\") pod \"nova-cell0-conductor-db-sync-h8fkq\" (UID: \"5b0138cd-b295-44c3-9f0b-1d5de2c8d144\") " pod="openstack/nova-cell0-conductor-db-sync-h8fkq" Nov 22 03:11:10 crc kubenswrapper[4922]: I1122 03:11:10.644517 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b0138cd-b295-44c3-9f0b-1d5de2c8d144-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-h8fkq\" (UID: \"5b0138cd-b295-44c3-9f0b-1d5de2c8d144\") " pod="openstack/nova-cell0-conductor-db-sync-h8fkq" Nov 22 03:11:10 crc kubenswrapper[4922]: I1122 03:11:10.644547 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b0138cd-b295-44c3-9f0b-1d5de2c8d144-config-data\") pod \"nova-cell0-conductor-db-sync-h8fkq\" (UID: \"5b0138cd-b295-44c3-9f0b-1d5de2c8d144\") " pod="openstack/nova-cell0-conductor-db-sync-h8fkq" Nov 22 03:11:10 crc kubenswrapper[4922]: I1122 03:11:10.644607 4922 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:10 crc kubenswrapper[4922]: I1122 03:11:10.644617 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" 
Nov 22 03:11:10 crc kubenswrapper[4922]: I1122 03:11:10.644627 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961-config\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:10 crc kubenswrapper[4922]: I1122 03:11:10.644637 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jltpw\" (UniqueName: \"kubernetes.io/projected/29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961-kube-api-access-jltpw\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:10 crc kubenswrapper[4922]: I1122 03:11:10.644646 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:10 crc kubenswrapper[4922]: I1122 03:11:10.746218 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b0138cd-b295-44c3-9f0b-1d5de2c8d144-scripts\") pod \"nova-cell0-conductor-db-sync-h8fkq\" (UID: \"5b0138cd-b295-44c3-9f0b-1d5de2c8d144\") " pod="openstack/nova-cell0-conductor-db-sync-h8fkq" Nov 22 03:11:10 crc kubenswrapper[4922]: I1122 03:11:10.746495 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b0138cd-b295-44c3-9f0b-1d5de2c8d144-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-h8fkq\" (UID: \"5b0138cd-b295-44c3-9f0b-1d5de2c8d144\") " pod="openstack/nova-cell0-conductor-db-sync-h8fkq" Nov 22 03:11:10 crc kubenswrapper[4922]: I1122 03:11:10.746544 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b0138cd-b295-44c3-9f0b-1d5de2c8d144-config-data\") pod \"nova-cell0-conductor-db-sync-h8fkq\" (UID: \"5b0138cd-b295-44c3-9f0b-1d5de2c8d144\") " pod="openstack/nova-cell0-conductor-db-sync-h8fkq" Nov 22 03:11:10 crc kubenswrapper[4922]: I1122 03:11:10.746652 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gc4k8\" (UniqueName: \"kubernetes.io/projected/5b0138cd-b295-44c3-9f0b-1d5de2c8d144-kube-api-access-gc4k8\") pod \"nova-cell0-conductor-db-sync-h8fkq\" (UID: \"5b0138cd-b295-44c3-9f0b-1d5de2c8d144\") " pod="openstack/nova-cell0-conductor-db-sync-h8fkq" Nov 22 03:11:10 crc kubenswrapper[4922]: I1122 03:11:10.751310 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b0138cd-b295-44c3-9f0b-1d5de2c8d144-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-h8fkq\" (UID: \"5b0138cd-b295-44c3-9f0b-1d5de2c8d144\") " pod="openstack/nova-cell0-conductor-db-sync-h8fkq" Nov 22 03:11:10 crc kubenswrapper[4922]: I1122 03:11:10.752180 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b0138cd-b295-44c3-9f0b-1d5de2c8d144-scripts\") pod \"nova-cell0-conductor-db-sync-h8fkq\" (UID: \"5b0138cd-b295-44c3-9f0b-1d5de2c8d144\") " pod="openstack/nova-cell0-conductor-db-sync-h8fkq" Nov 22 03:11:10 crc kubenswrapper[4922]: I1122 03:11:10.752675 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b0138cd-b295-44c3-9f0b-1d5de2c8d144-config-data\") pod \"nova-cell0-conductor-db-sync-h8fkq\" (UID: \"5b0138cd-b295-44c3-9f0b-1d5de2c8d144\") " 
pod="openstack/nova-cell0-conductor-db-sync-h8fkq" Nov 22 03:11:10 crc kubenswrapper[4922]: I1122 03:11:10.768154 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gc4k8\" (UniqueName: \"kubernetes.io/projected/5b0138cd-b295-44c3-9f0b-1d5de2c8d144-kube-api-access-gc4k8\") pod \"nova-cell0-conductor-db-sync-h8fkq\" (UID: \"5b0138cd-b295-44c3-9f0b-1d5de2c8d144\") " pod="openstack/nova-cell0-conductor-db-sync-h8fkq" Nov 22 03:11:10 crc kubenswrapper[4922]: I1122 03:11:10.840266 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-h8fkq" Nov 22 03:11:11 crc kubenswrapper[4922]: I1122 03:11:11.109703 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 03:11:11 crc kubenswrapper[4922]: I1122 03:11:11.109994 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 03:11:11 crc kubenswrapper[4922]: I1122 03:11:11.314005 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9843715c-f3c1-4c18-af9e-545d5fa5da4e" path="/var/lib/kubelet/pods/9843715c-f3c1-4c18-af9e-545d5fa5da4e/volumes" Nov 22 03:11:11 crc kubenswrapper[4922]: W1122 03:11:11.320855 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b0138cd_b295_44c3_9f0b_1d5de2c8d144.slice/crio-f9cceb699a929561bdf17b9e26f55141af404694267998f7d3c190730abd656d WatchSource:0}: Error finding container f9cceb699a929561bdf17b9e26f55141af404694267998f7d3c190730abd656d: Status 404 returned error can't find the container with id f9cceb699a929561bdf17b9e26f55141af404694267998f7d3c190730abd656d Nov 22 03:11:11 crc kubenswrapper[4922]: I1122 03:11:11.334334 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-h8fkq"] Nov 22 03:11:11 crc kubenswrapper[4922]: I1122 03:11:11.366635 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-h8fkq" event={"ID":"5b0138cd-b295-44c3-9f0b-1d5de2c8d144","Type":"ContainerStarted","Data":"f9cceb699a929561bdf17b9e26f55141af404694267998f7d3c190730abd656d"} Nov 22 03:11:11 crc kubenswrapper[4922]: I1122 03:11:11.370871 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-699df9757c-6xcfl" event={"ID":"29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961","Type":"ContainerDied","Data":"94b5dbd97f083f48d1c4b75f7271373f2e80371c98a3d69df264fee8afcd9e73"} Nov 22 03:11:11 crc kubenswrapper[4922]: I1122 03:11:11.370931 4922 scope.go:117] "RemoveContainer" containerID="1f2242e19de8d8a903b5b8ee00f2961be591707a1c4188b803f073e34e517863" Nov 22 03:11:11 crc kubenswrapper[4922]: I1122 03:11:11.371056 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-699df9757c-6xcfl" Nov 22 03:11:11 crc kubenswrapper[4922]: I1122 03:11:11.374352 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7fe05513-2eda-43de-b464-3731563f64a5","Type":"ContainerStarted","Data":"2ae17157a2140b85df784a16c83050e950d8404eda7b113559c9efe223dfbe70"} Nov 22 03:11:11 crc kubenswrapper[4922]: I1122 03:11:11.397909 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-699df9757c-6xcfl"] Nov 22 03:11:11 crc kubenswrapper[4922]: I1122 03:11:11.403388 4922 scope.go:117] "RemoveContainer" containerID="48dca4276c26144a3b1cecec137bb554c8dd9a4bcd59f9881e9623a6b4690e51" Nov 22 03:11:11 crc kubenswrapper[4922]: I1122 03:11:11.408861 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-699df9757c-6xcfl"] Nov 22 03:11:12 crc kubenswrapper[4922]: I1122 03:11:12.275917 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:11:12 crc kubenswrapper[4922]: I1122 03:11:12.405723 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7fe05513-2eda-43de-b464-3731563f64a5","Type":"ContainerStarted","Data":"1cf35abef1b3cac7c7050a07cfe8d918b84546eff6314fd9c305423c8dd9c184"} Nov 22 03:11:12 crc kubenswrapper[4922]: I1122 03:11:12.406064 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7fe05513-2eda-43de-b464-3731563f64a5","Type":"ContainerStarted","Data":"835bc98714a6e97552426e946c0ac423bbe9bf03720ab39f3367fa89f15aab76"} Nov 22 03:11:12 crc kubenswrapper[4922]: I1122 03:11:12.414121 4922 generic.go:334] "Generic (PLEG): container finished" podID="54874e8f-0cb8-4295-a7f8-b6265cd72612" containerID="c92550f82d065d57a34173c313c19aa17604ec921dc9aa056f9dfc92e50f502a" exitCode=0 Nov 22 03:11:12 crc kubenswrapper[4922]: I1122 03:11:12.414190 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"54874e8f-0cb8-4295-a7f8-b6265cd72612","Type":"ContainerDied","Data":"c92550f82d065d57a34173c313c19aa17604ec921dc9aa056f9dfc92e50f502a"} Nov 22 03:11:13 crc kubenswrapper[4922]: I1122 03:11:13.280124 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 22 03:11:13 crc kubenswrapper[4922]: I1122 03:11:13.296959 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54874e8f-0cb8-4295-a7f8-b6265cd72612-scripts\") pod \"54874e8f-0cb8-4295-a7f8-b6265cd72612\" (UID: \"54874e8f-0cb8-4295-a7f8-b6265cd72612\") " Nov 22 03:11:13 crc kubenswrapper[4922]: I1122 03:11:13.297020 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwnhw\" (UniqueName: \"kubernetes.io/projected/54874e8f-0cb8-4295-a7f8-b6265cd72612-kube-api-access-kwnhw\") pod \"54874e8f-0cb8-4295-a7f8-b6265cd72612\" (UID: \"54874e8f-0cb8-4295-a7f8-b6265cd72612\") " Nov 22 03:11:13 crc kubenswrapper[4922]: I1122 03:11:13.297053 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54874e8f-0cb8-4295-a7f8-b6265cd72612-config-data\") pod \"54874e8f-0cb8-4295-a7f8-b6265cd72612\" (UID: \"54874e8f-0cb8-4295-a7f8-b6265cd72612\") " Nov 22 03:11:13 crc kubenswrapper[4922]: I1122 03:11:13.297246 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/54874e8f-0cb8-4295-a7f8-b6265cd72612-config-data-custom\") pod \"54874e8f-0cb8-4295-a7f8-b6265cd72612\" (UID: \"54874e8f-0cb8-4295-a7f8-b6265cd72612\") " Nov 22 03:11:13 crc kubenswrapper[4922]: I1122 03:11:13.297311 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/54874e8f-0cb8-4295-a7f8-b6265cd72612-etc-machine-id\") pod \"54874e8f-0cb8-4295-a7f8-b6265cd72612\" (UID: \"54874e8f-0cb8-4295-a7f8-b6265cd72612\") " Nov 22 03:11:13 crc kubenswrapper[4922]: I1122 03:11:13.297344 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54874e8f-0cb8-4295-a7f8-b6265cd72612-combined-ca-bundle\") pod \"54874e8f-0cb8-4295-a7f8-b6265cd72612\" (UID: \"54874e8f-0cb8-4295-a7f8-b6265cd72612\") " Nov 22 03:11:13 crc kubenswrapper[4922]: I1122 03:11:13.308894 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/54874e8f-0cb8-4295-a7f8-b6265cd72612-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "54874e8f-0cb8-4295-a7f8-b6265cd72612" (UID: "54874e8f-0cb8-4295-a7f8-b6265cd72612"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 03:11:13 crc kubenswrapper[4922]: I1122 03:11:13.319460 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54874e8f-0cb8-4295-a7f8-b6265cd72612-scripts" (OuterVolumeSpecName: "scripts") pod "54874e8f-0cb8-4295-a7f8-b6265cd72612" (UID: "54874e8f-0cb8-4295-a7f8-b6265cd72612"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:13 crc kubenswrapper[4922]: I1122 03:11:13.320767 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54874e8f-0cb8-4295-a7f8-b6265cd72612-kube-api-access-kwnhw" (OuterVolumeSpecName: "kube-api-access-kwnhw") pod "54874e8f-0cb8-4295-a7f8-b6265cd72612" (UID: "54874e8f-0cb8-4295-a7f8-b6265cd72612"). InnerVolumeSpecName "kube-api-access-kwnhw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:11:13 crc kubenswrapper[4922]: I1122 03:11:13.321536 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54874e8f-0cb8-4295-a7f8-b6265cd72612-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "54874e8f-0cb8-4295-a7f8-b6265cd72612" (UID: "54874e8f-0cb8-4295-a7f8-b6265cd72612"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:13 crc kubenswrapper[4922]: I1122 03:11:13.327618 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961" path="/var/lib/kubelet/pods/29ea2f5e-5a1d-46d7-a8b5-9caaef8a5961/volumes" Nov 22 03:11:13 crc kubenswrapper[4922]: I1122 03:11:13.390554 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54874e8f-0cb8-4295-a7f8-b6265cd72612-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "54874e8f-0cb8-4295-a7f8-b6265cd72612" (UID: "54874e8f-0cb8-4295-a7f8-b6265cd72612"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:13 crc kubenswrapper[4922]: I1122 03:11:13.400055 4922 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/54874e8f-0cb8-4295-a7f8-b6265cd72612-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:13 crc kubenswrapper[4922]: I1122 03:11:13.400086 4922 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/54874e8f-0cb8-4295-a7f8-b6265cd72612-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:13 crc kubenswrapper[4922]: I1122 03:11:13.400099 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54874e8f-0cb8-4295-a7f8-b6265cd72612-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:13 crc kubenswrapper[4922]: I1122 03:11:13.400113 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwnhw\" (UniqueName: \"kubernetes.io/projected/54874e8f-0cb8-4295-a7f8-b6265cd72612-kube-api-access-kwnhw\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:13 crc kubenswrapper[4922]: I1122 03:11:13.400125 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54874e8f-0cb8-4295-a7f8-b6265cd72612-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:13 crc kubenswrapper[4922]: I1122 03:11:13.441143 4922 generic.go:334] "Generic (PLEG): container finished" podID="54874e8f-0cb8-4295-a7f8-b6265cd72612" containerID="0e57051ba2f5d17c005570fa3221899cd1f3f3dfa77e5f19c298be166c7bfcee" exitCode=0 Nov 22 03:11:13 crc kubenswrapper[4922]: I1122 03:11:13.441209 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"54874e8f-0cb8-4295-a7f8-b6265cd72612","Type":"ContainerDied","Data":"0e57051ba2f5d17c005570fa3221899cd1f3f3dfa77e5f19c298be166c7bfcee"} Nov 22 03:11:13 crc kubenswrapper[4922]: I1122 03:11:13.441288 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"54874e8f-0cb8-4295-a7f8-b6265cd72612","Type":"ContainerDied","Data":"a0638a84398d8a56bb6116b220366edcb984f4bab9158606fdbb7398131db132"} Nov 22 03:11:13 crc kubenswrapper[4922]: I1122 03:11:13.441314 4922 scope.go:117] "RemoveContainer" 
containerID="c92550f82d065d57a34173c313c19aa17604ec921dc9aa056f9dfc92e50f502a" Nov 22 03:11:13 crc kubenswrapper[4922]: I1122 03:11:13.442931 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 22 03:11:13 crc kubenswrapper[4922]: I1122 03:11:13.449899 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54874e8f-0cb8-4295-a7f8-b6265cd72612-config-data" (OuterVolumeSpecName: "config-data") pod "54874e8f-0cb8-4295-a7f8-b6265cd72612" (UID: "54874e8f-0cb8-4295-a7f8-b6265cd72612"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:13 crc kubenswrapper[4922]: I1122 03:11:13.467968 4922 scope.go:117] "RemoveContainer" containerID="0e57051ba2f5d17c005570fa3221899cd1f3f3dfa77e5f19c298be166c7bfcee" Nov 22 03:11:13 crc kubenswrapper[4922]: I1122 03:11:13.499441 4922 scope.go:117] "RemoveContainer" containerID="c92550f82d065d57a34173c313c19aa17604ec921dc9aa056f9dfc92e50f502a" Nov 22 03:11:13 crc kubenswrapper[4922]: E1122 03:11:13.499836 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c92550f82d065d57a34173c313c19aa17604ec921dc9aa056f9dfc92e50f502a\": container with ID starting with c92550f82d065d57a34173c313c19aa17604ec921dc9aa056f9dfc92e50f502a not found: ID does not exist" containerID="c92550f82d065d57a34173c313c19aa17604ec921dc9aa056f9dfc92e50f502a" Nov 22 03:11:13 crc kubenswrapper[4922]: I1122 03:11:13.499909 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c92550f82d065d57a34173c313c19aa17604ec921dc9aa056f9dfc92e50f502a"} err="failed to get container status \"c92550f82d065d57a34173c313c19aa17604ec921dc9aa056f9dfc92e50f502a\": rpc error: code = NotFound desc = could not find container \"c92550f82d065d57a34173c313c19aa17604ec921dc9aa056f9dfc92e50f502a\": container with ID starting with c92550f82d065d57a34173c313c19aa17604ec921dc9aa056f9dfc92e50f502a not found: ID does not exist" Nov 22 03:11:13 crc kubenswrapper[4922]: I1122 03:11:13.499946 4922 scope.go:117] "RemoveContainer" containerID="0e57051ba2f5d17c005570fa3221899cd1f3f3dfa77e5f19c298be166c7bfcee" Nov 22 03:11:13 crc kubenswrapper[4922]: E1122 03:11:13.500244 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e57051ba2f5d17c005570fa3221899cd1f3f3dfa77e5f19c298be166c7bfcee\": container with ID starting with 0e57051ba2f5d17c005570fa3221899cd1f3f3dfa77e5f19c298be166c7bfcee not found: ID does not exist" containerID="0e57051ba2f5d17c005570fa3221899cd1f3f3dfa77e5f19c298be166c7bfcee" Nov 22 03:11:13 crc kubenswrapper[4922]: I1122 03:11:13.500277 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e57051ba2f5d17c005570fa3221899cd1f3f3dfa77e5f19c298be166c7bfcee"} err="failed to get container status \"0e57051ba2f5d17c005570fa3221899cd1f3f3dfa77e5f19c298be166c7bfcee\": rpc error: code = NotFound desc = could not find container \"0e57051ba2f5d17c005570fa3221899cd1f3f3dfa77e5f19c298be166c7bfcee\": container with ID starting with 0e57051ba2f5d17c005570fa3221899cd1f3f3dfa77e5f19c298be166c7bfcee not found: ID does not exist" Nov 22 03:11:13 crc kubenswrapper[4922]: I1122 03:11:13.501223 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/54874e8f-0cb8-4295-a7f8-b6265cd72612-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:13 crc kubenswrapper[4922]: I1122 03:11:13.773017 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 22 03:11:13 crc kubenswrapper[4922]: I1122 03:11:13.781666 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 22 03:11:13 crc kubenswrapper[4922]: I1122 03:11:13.792889 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Nov 22 03:11:13 crc kubenswrapper[4922]: E1122 03:11:13.793237 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54874e8f-0cb8-4295-a7f8-b6265cd72612" containerName="probe" Nov 22 03:11:13 crc kubenswrapper[4922]: I1122 03:11:13.793253 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="54874e8f-0cb8-4295-a7f8-b6265cd72612" containerName="probe" Nov 22 03:11:13 crc kubenswrapper[4922]: E1122 03:11:13.793271 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54874e8f-0cb8-4295-a7f8-b6265cd72612" containerName="cinder-scheduler" Nov 22 03:11:13 crc kubenswrapper[4922]: I1122 03:11:13.793279 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="54874e8f-0cb8-4295-a7f8-b6265cd72612" containerName="cinder-scheduler" Nov 22 03:11:13 crc kubenswrapper[4922]: I1122 03:11:13.793466 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="54874e8f-0cb8-4295-a7f8-b6265cd72612" containerName="cinder-scheduler" Nov 22 03:11:13 crc kubenswrapper[4922]: I1122 03:11:13.793482 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="54874e8f-0cb8-4295-a7f8-b6265cd72612" containerName="probe" Nov 22 03:11:13 crc kubenswrapper[4922]: I1122 03:11:13.795135 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 22 03:11:13 crc kubenswrapper[4922]: I1122 03:11:13.796913 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Nov 22 03:11:13 crc kubenswrapper[4922]: I1122 03:11:13.834415 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 22 03:11:13 crc kubenswrapper[4922]: I1122 03:11:13.907287 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85809174-7801-455c-8ce6-82f34307147b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"85809174-7801-455c-8ce6-82f34307147b\") " pod="openstack/cinder-scheduler-0" Nov 22 03:11:13 crc kubenswrapper[4922]: I1122 03:11:13.907352 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhvt6\" (UniqueName: \"kubernetes.io/projected/85809174-7801-455c-8ce6-82f34307147b-kube-api-access-fhvt6\") pod \"cinder-scheduler-0\" (UID: \"85809174-7801-455c-8ce6-82f34307147b\") " pod="openstack/cinder-scheduler-0" Nov 22 03:11:13 crc kubenswrapper[4922]: I1122 03:11:13.907448 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/85809174-7801-455c-8ce6-82f34307147b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"85809174-7801-455c-8ce6-82f34307147b\") " pod="openstack/cinder-scheduler-0" Nov 22 03:11:13 crc kubenswrapper[4922]: I1122 03:11:13.907502 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85809174-7801-455c-8ce6-82f34307147b-scripts\") pod \"cinder-scheduler-0\" (UID: \"85809174-7801-455c-8ce6-82f34307147b\") " pod="openstack/cinder-scheduler-0" Nov 22 03:11:13 crc kubenswrapper[4922]: I1122 03:11:13.907543 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85809174-7801-455c-8ce6-82f34307147b-config-data\") pod \"cinder-scheduler-0\" (UID: \"85809174-7801-455c-8ce6-82f34307147b\") " pod="openstack/cinder-scheduler-0" Nov 22 03:11:13 crc kubenswrapper[4922]: I1122 03:11:13.907584 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/85809174-7801-455c-8ce6-82f34307147b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"85809174-7801-455c-8ce6-82f34307147b\") " pod="openstack/cinder-scheduler-0" Nov 22 03:11:14 crc kubenswrapper[4922]: I1122 03:11:14.009079 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85809174-7801-455c-8ce6-82f34307147b-scripts\") pod \"cinder-scheduler-0\" (UID: \"85809174-7801-455c-8ce6-82f34307147b\") " pod="openstack/cinder-scheduler-0" Nov 22 03:11:14 crc kubenswrapper[4922]: I1122 03:11:14.009397 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85809174-7801-455c-8ce6-82f34307147b-config-data\") pod \"cinder-scheduler-0\" (UID: \"85809174-7801-455c-8ce6-82f34307147b\") " pod="openstack/cinder-scheduler-0" Nov 22 03:11:14 crc kubenswrapper[4922]: I1122 03:11:14.009436 4922 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/85809174-7801-455c-8ce6-82f34307147b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"85809174-7801-455c-8ce6-82f34307147b\") " pod="openstack/cinder-scheduler-0" Nov 22 03:11:14 crc kubenswrapper[4922]: I1122 03:11:14.009481 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85809174-7801-455c-8ce6-82f34307147b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"85809174-7801-455c-8ce6-82f34307147b\") " pod="openstack/cinder-scheduler-0" Nov 22 03:11:14 crc kubenswrapper[4922]: I1122 03:11:14.009517 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhvt6\" (UniqueName: \"kubernetes.io/projected/85809174-7801-455c-8ce6-82f34307147b-kube-api-access-fhvt6\") pod \"cinder-scheduler-0\" (UID: \"85809174-7801-455c-8ce6-82f34307147b\") " pod="openstack/cinder-scheduler-0" Nov 22 03:11:14 crc kubenswrapper[4922]: I1122 03:11:14.009599 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/85809174-7801-455c-8ce6-82f34307147b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"85809174-7801-455c-8ce6-82f34307147b\") " pod="openstack/cinder-scheduler-0" Nov 22 03:11:14 crc kubenswrapper[4922]: I1122 03:11:14.009667 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/85809174-7801-455c-8ce6-82f34307147b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"85809174-7801-455c-8ce6-82f34307147b\") " pod="openstack/cinder-scheduler-0" Nov 22 03:11:14 crc kubenswrapper[4922]: I1122 03:11:14.014863 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85809174-7801-455c-8ce6-82f34307147b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"85809174-7801-455c-8ce6-82f34307147b\") " pod="openstack/cinder-scheduler-0" Nov 22 03:11:14 crc kubenswrapper[4922]: I1122 03:11:14.017333 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/85809174-7801-455c-8ce6-82f34307147b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"85809174-7801-455c-8ce6-82f34307147b\") " pod="openstack/cinder-scheduler-0" Nov 22 03:11:14 crc kubenswrapper[4922]: I1122 03:11:14.021738 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85809174-7801-455c-8ce6-82f34307147b-config-data\") pod \"cinder-scheduler-0\" (UID: \"85809174-7801-455c-8ce6-82f34307147b\") " pod="openstack/cinder-scheduler-0" Nov 22 03:11:14 crc kubenswrapper[4922]: I1122 03:11:14.024519 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85809174-7801-455c-8ce6-82f34307147b-scripts\") pod \"cinder-scheduler-0\" (UID: \"85809174-7801-455c-8ce6-82f34307147b\") " pod="openstack/cinder-scheduler-0" Nov 22 03:11:14 crc kubenswrapper[4922]: I1122 03:11:14.037210 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhvt6\" (UniqueName: \"kubernetes.io/projected/85809174-7801-455c-8ce6-82f34307147b-kube-api-access-fhvt6\") pod \"cinder-scheduler-0\" (UID: \"85809174-7801-455c-8ce6-82f34307147b\") " pod="openstack/cinder-scheduler-0" Nov 22 03:11:14 
crc kubenswrapper[4922]: I1122 03:11:14.119233 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 22 03:11:14 crc kubenswrapper[4922]: I1122 03:11:14.454950 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7fe05513-2eda-43de-b464-3731563f64a5","Type":"ContainerStarted","Data":"18603f57c476c794b60782a62c599416a12223cbfdc85f192b672b9863ff9960"} Nov 22 03:11:14 crc kubenswrapper[4922]: I1122 03:11:14.455009 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7fe05513-2eda-43de-b464-3731563f64a5" containerName="ceilometer-central-agent" containerID="cri-o://2ae17157a2140b85df784a16c83050e950d8404eda7b113559c9efe223dfbe70" gracePeriod=30 Nov 22 03:11:14 crc kubenswrapper[4922]: I1122 03:11:14.455225 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 22 03:11:14 crc kubenswrapper[4922]: I1122 03:11:14.455274 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7fe05513-2eda-43de-b464-3731563f64a5" containerName="proxy-httpd" containerID="cri-o://18603f57c476c794b60782a62c599416a12223cbfdc85f192b672b9863ff9960" gracePeriod=30 Nov 22 03:11:14 crc kubenswrapper[4922]: I1122 03:11:14.455339 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7fe05513-2eda-43de-b464-3731563f64a5" containerName="ceilometer-notification-agent" containerID="cri-o://835bc98714a6e97552426e946c0ac423bbe9bf03720ab39f3367fa89f15aab76" gracePeriod=30 Nov 22 03:11:14 crc kubenswrapper[4922]: I1122 03:11:14.455373 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7fe05513-2eda-43de-b464-3731563f64a5" containerName="sg-core" containerID="cri-o://1cf35abef1b3cac7c7050a07cfe8d918b84546eff6314fd9c305423c8dd9c184" gracePeriod=30 Nov 22 03:11:14 crc kubenswrapper[4922]: I1122 03:11:14.476926 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.250777324 podStartE2EDuration="5.476908528s" podCreationTimestamp="2025-11-22 03:11:09 +0000 UTC" firstStartedPulling="2025-11-22 03:11:10.317017852 +0000 UTC m=+1106.355539734" lastFinishedPulling="2025-11-22 03:11:13.543149046 +0000 UTC m=+1109.581670938" observedRunningTime="2025-11-22 03:11:14.472058612 +0000 UTC m=+1110.510580504" watchObservedRunningTime="2025-11-22 03:11:14.476908528 +0000 UTC m=+1110.515430420" Nov 22 03:11:14 crc kubenswrapper[4922]: I1122 03:11:14.592596 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 22 03:11:14 crc kubenswrapper[4922]: W1122 03:11:14.602411 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85809174_7801_455c_8ce6_82f34307147b.slice/crio-64cae23229bbb5cf11dc272adaf704e564eca5d4ca1f71e875bf2af1e5d93f63 WatchSource:0}: Error finding container 64cae23229bbb5cf11dc272adaf704e564eca5d4ca1f71e875bf2af1e5d93f63: Status 404 returned error can't find the container with id 64cae23229bbb5cf11dc272adaf704e564eca5d4ca1f71e875bf2af1e5d93f63 Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.275437 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.325373 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54874e8f-0cb8-4295-a7f8-b6265cd72612" path="/var/lib/kubelet/pods/54874e8f-0cb8-4295-a7f8-b6265cd72612/volumes" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.338788 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fe05513-2eda-43de-b464-3731563f64a5-combined-ca-bundle\") pod \"7fe05513-2eda-43de-b464-3731563f64a5\" (UID: \"7fe05513-2eda-43de-b464-3731563f64a5\") " Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.338883 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7fe05513-2eda-43de-b464-3731563f64a5-log-httpd\") pod \"7fe05513-2eda-43de-b464-3731563f64a5\" (UID: \"7fe05513-2eda-43de-b464-3731563f64a5\") " Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.338996 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fe05513-2eda-43de-b464-3731563f64a5-scripts\") pod \"7fe05513-2eda-43de-b464-3731563f64a5\" (UID: \"7fe05513-2eda-43de-b464-3731563f64a5\") " Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.339047 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7fe05513-2eda-43de-b464-3731563f64a5-run-httpd\") pod \"7fe05513-2eda-43de-b464-3731563f64a5\" (UID: \"7fe05513-2eda-43de-b464-3731563f64a5\") " Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.339101 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fe05513-2eda-43de-b464-3731563f64a5-config-data\") pod \"7fe05513-2eda-43de-b464-3731563f64a5\" (UID: \"7fe05513-2eda-43de-b464-3731563f64a5\") " Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.339138 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdfrq\" (UniqueName: \"kubernetes.io/projected/7fe05513-2eda-43de-b464-3731563f64a5-kube-api-access-rdfrq\") pod \"7fe05513-2eda-43de-b464-3731563f64a5\" (UID: \"7fe05513-2eda-43de-b464-3731563f64a5\") " Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.339204 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7fe05513-2eda-43de-b464-3731563f64a5-sg-core-conf-yaml\") pod \"7fe05513-2eda-43de-b464-3731563f64a5\" (UID: \"7fe05513-2eda-43de-b464-3731563f64a5\") " Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.339678 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fe05513-2eda-43de-b464-3731563f64a5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7fe05513-2eda-43de-b464-3731563f64a5" (UID: "7fe05513-2eda-43de-b464-3731563f64a5"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.340533 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fe05513-2eda-43de-b464-3731563f64a5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7fe05513-2eda-43de-b464-3731563f64a5" (UID: "7fe05513-2eda-43de-b464-3731563f64a5"). 
InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.349537 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fe05513-2eda-43de-b464-3731563f64a5-kube-api-access-rdfrq" (OuterVolumeSpecName: "kube-api-access-rdfrq") pod "7fe05513-2eda-43de-b464-3731563f64a5" (UID: "7fe05513-2eda-43de-b464-3731563f64a5"). InnerVolumeSpecName "kube-api-access-rdfrq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.362732 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fe05513-2eda-43de-b464-3731563f64a5-scripts" (OuterVolumeSpecName: "scripts") pod "7fe05513-2eda-43de-b464-3731563f64a5" (UID: "7fe05513-2eda-43de-b464-3731563f64a5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.419297 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fe05513-2eda-43de-b464-3731563f64a5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7fe05513-2eda-43de-b464-3731563f64a5" (UID: "7fe05513-2eda-43de-b464-3731563f64a5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.440962 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdfrq\" (UniqueName: \"kubernetes.io/projected/7fe05513-2eda-43de-b464-3731563f64a5-kube-api-access-rdfrq\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.440987 4922 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7fe05513-2eda-43de-b464-3731563f64a5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.440996 4922 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7fe05513-2eda-43de-b464-3731563f64a5-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.441005 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fe05513-2eda-43de-b464-3731563f64a5-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.441016 4922 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7fe05513-2eda-43de-b464-3731563f64a5-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.474766 4922 generic.go:334] "Generic (PLEG): container finished" podID="7fe05513-2eda-43de-b464-3731563f64a5" containerID="18603f57c476c794b60782a62c599416a12223cbfdc85f192b672b9863ff9960" exitCode=0 Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.474808 4922 generic.go:334] "Generic (PLEG): container finished" podID="7fe05513-2eda-43de-b464-3731563f64a5" containerID="1cf35abef1b3cac7c7050a07cfe8d918b84546eff6314fd9c305423c8dd9c184" exitCode=2 Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.474820 4922 generic.go:334] "Generic (PLEG): container finished" podID="7fe05513-2eda-43de-b464-3731563f64a5" containerID="835bc98714a6e97552426e946c0ac423bbe9bf03720ab39f3367fa89f15aab76" exitCode=0 Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 
03:11:15.474833 4922 generic.go:334] "Generic (PLEG): container finished" podID="7fe05513-2eda-43de-b464-3731563f64a5" containerID="2ae17157a2140b85df784a16c83050e950d8404eda7b113559c9efe223dfbe70" exitCode=0 Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.474893 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7fe05513-2eda-43de-b464-3731563f64a5","Type":"ContainerDied","Data":"18603f57c476c794b60782a62c599416a12223cbfdc85f192b672b9863ff9960"} Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.474927 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7fe05513-2eda-43de-b464-3731563f64a5","Type":"ContainerDied","Data":"1cf35abef1b3cac7c7050a07cfe8d918b84546eff6314fd9c305423c8dd9c184"} Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.474940 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7fe05513-2eda-43de-b464-3731563f64a5","Type":"ContainerDied","Data":"835bc98714a6e97552426e946c0ac423bbe9bf03720ab39f3367fa89f15aab76"} Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.474950 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7fe05513-2eda-43de-b464-3731563f64a5","Type":"ContainerDied","Data":"2ae17157a2140b85df784a16c83050e950d8404eda7b113559c9efe223dfbe70"} Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.474959 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7fe05513-2eda-43de-b464-3731563f64a5","Type":"ContainerDied","Data":"f219096a500b1c1aed3e1878b1a9c9e5cb2f4fe8271b1b158ad41d015175357a"} Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.474976 4922 scope.go:117] "RemoveContainer" containerID="18603f57c476c794b60782a62c599416a12223cbfdc85f192b672b9863ff9960" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.475146 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.488946 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fe05513-2eda-43de-b464-3731563f64a5-config-data" (OuterVolumeSpecName: "config-data") pod "7fe05513-2eda-43de-b464-3731563f64a5" (UID: "7fe05513-2eda-43de-b464-3731563f64a5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.498380 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"85809174-7801-455c-8ce6-82f34307147b","Type":"ContainerStarted","Data":"579f14e0e5433ae73314266cbd8b04477e967a614a61e6e5a8054f2ca507ccec"} Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.498449 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"85809174-7801-455c-8ce6-82f34307147b","Type":"ContainerStarted","Data":"64cae23229bbb5cf11dc272adaf704e564eca5d4ca1f71e875bf2af1e5d93f63"} Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.519131 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fe05513-2eda-43de-b464-3731563f64a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7fe05513-2eda-43de-b464-3731563f64a5" (UID: "7fe05513-2eda-43de-b464-3731563f64a5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.522106 4922 scope.go:117] "RemoveContainer" containerID="1cf35abef1b3cac7c7050a07cfe8d918b84546eff6314fd9c305423c8dd9c184" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.543122 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fe05513-2eda-43de-b464-3731563f64a5-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.543157 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fe05513-2eda-43de-b464-3731563f64a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.560023 4922 scope.go:117] "RemoveContainer" containerID="835bc98714a6e97552426e946c0ac423bbe9bf03720ab39f3367fa89f15aab76" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.582289 4922 scope.go:117] "RemoveContainer" containerID="2ae17157a2140b85df784a16c83050e950d8404eda7b113559c9efe223dfbe70" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.615129 4922 scope.go:117] "RemoveContainer" containerID="18603f57c476c794b60782a62c599416a12223cbfdc85f192b672b9863ff9960" Nov 22 03:11:15 crc kubenswrapper[4922]: E1122 03:11:15.615709 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18603f57c476c794b60782a62c599416a12223cbfdc85f192b672b9863ff9960\": container with ID starting with 18603f57c476c794b60782a62c599416a12223cbfdc85f192b672b9863ff9960 not found: ID does not exist" containerID="18603f57c476c794b60782a62c599416a12223cbfdc85f192b672b9863ff9960" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.615801 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18603f57c476c794b60782a62c599416a12223cbfdc85f192b672b9863ff9960"} err="failed to get container status \"18603f57c476c794b60782a62c599416a12223cbfdc85f192b672b9863ff9960\": rpc error: code = NotFound desc = could not find container \"18603f57c476c794b60782a62c599416a12223cbfdc85f192b672b9863ff9960\": container with ID starting with 18603f57c476c794b60782a62c599416a12223cbfdc85f192b672b9863ff9960 not found: ID does not exist" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.615932 4922 scope.go:117] "RemoveContainer" containerID="1cf35abef1b3cac7c7050a07cfe8d918b84546eff6314fd9c305423c8dd9c184" Nov 22 03:11:15 crc kubenswrapper[4922]: E1122 03:11:15.616546 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cf35abef1b3cac7c7050a07cfe8d918b84546eff6314fd9c305423c8dd9c184\": container with ID starting with 1cf35abef1b3cac7c7050a07cfe8d918b84546eff6314fd9c305423c8dd9c184 not found: ID does not exist" containerID="1cf35abef1b3cac7c7050a07cfe8d918b84546eff6314fd9c305423c8dd9c184" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.616590 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cf35abef1b3cac7c7050a07cfe8d918b84546eff6314fd9c305423c8dd9c184"} err="failed to get container status \"1cf35abef1b3cac7c7050a07cfe8d918b84546eff6314fd9c305423c8dd9c184\": rpc error: code = NotFound desc = could not find container \"1cf35abef1b3cac7c7050a07cfe8d918b84546eff6314fd9c305423c8dd9c184\": container with ID starting with 
1cf35abef1b3cac7c7050a07cfe8d918b84546eff6314fd9c305423c8dd9c184 not found: ID does not exist" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.616632 4922 scope.go:117] "RemoveContainer" containerID="835bc98714a6e97552426e946c0ac423bbe9bf03720ab39f3367fa89f15aab76" Nov 22 03:11:15 crc kubenswrapper[4922]: E1122 03:11:15.619791 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"835bc98714a6e97552426e946c0ac423bbe9bf03720ab39f3367fa89f15aab76\": container with ID starting with 835bc98714a6e97552426e946c0ac423bbe9bf03720ab39f3367fa89f15aab76 not found: ID does not exist" containerID="835bc98714a6e97552426e946c0ac423bbe9bf03720ab39f3367fa89f15aab76" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.619826 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"835bc98714a6e97552426e946c0ac423bbe9bf03720ab39f3367fa89f15aab76"} err="failed to get container status \"835bc98714a6e97552426e946c0ac423bbe9bf03720ab39f3367fa89f15aab76\": rpc error: code = NotFound desc = could not find container \"835bc98714a6e97552426e946c0ac423bbe9bf03720ab39f3367fa89f15aab76\": container with ID starting with 835bc98714a6e97552426e946c0ac423bbe9bf03720ab39f3367fa89f15aab76 not found: ID does not exist" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.619860 4922 scope.go:117] "RemoveContainer" containerID="2ae17157a2140b85df784a16c83050e950d8404eda7b113559c9efe223dfbe70" Nov 22 03:11:15 crc kubenswrapper[4922]: E1122 03:11:15.621153 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ae17157a2140b85df784a16c83050e950d8404eda7b113559c9efe223dfbe70\": container with ID starting with 2ae17157a2140b85df784a16c83050e950d8404eda7b113559c9efe223dfbe70 not found: ID does not exist" containerID="2ae17157a2140b85df784a16c83050e950d8404eda7b113559c9efe223dfbe70" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.621182 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ae17157a2140b85df784a16c83050e950d8404eda7b113559c9efe223dfbe70"} err="failed to get container status \"2ae17157a2140b85df784a16c83050e950d8404eda7b113559c9efe223dfbe70\": rpc error: code = NotFound desc = could not find container \"2ae17157a2140b85df784a16c83050e950d8404eda7b113559c9efe223dfbe70\": container with ID starting with 2ae17157a2140b85df784a16c83050e950d8404eda7b113559c9efe223dfbe70 not found: ID does not exist" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.621197 4922 scope.go:117] "RemoveContainer" containerID="18603f57c476c794b60782a62c599416a12223cbfdc85f192b672b9863ff9960" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.621714 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18603f57c476c794b60782a62c599416a12223cbfdc85f192b672b9863ff9960"} err="failed to get container status \"18603f57c476c794b60782a62c599416a12223cbfdc85f192b672b9863ff9960\": rpc error: code = NotFound desc = could not find container \"18603f57c476c794b60782a62c599416a12223cbfdc85f192b672b9863ff9960\": container with ID starting with 18603f57c476c794b60782a62c599416a12223cbfdc85f192b672b9863ff9960 not found: ID does not exist" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.622575 4922 scope.go:117] "RemoveContainer" containerID="1cf35abef1b3cac7c7050a07cfe8d918b84546eff6314fd9c305423c8dd9c184" Nov 22 03:11:15 crc 
kubenswrapper[4922]: I1122 03:11:15.623018 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cf35abef1b3cac7c7050a07cfe8d918b84546eff6314fd9c305423c8dd9c184"} err="failed to get container status \"1cf35abef1b3cac7c7050a07cfe8d918b84546eff6314fd9c305423c8dd9c184\": rpc error: code = NotFound desc = could not find container \"1cf35abef1b3cac7c7050a07cfe8d918b84546eff6314fd9c305423c8dd9c184\": container with ID starting with 1cf35abef1b3cac7c7050a07cfe8d918b84546eff6314fd9c305423c8dd9c184 not found: ID does not exist" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.623146 4922 scope.go:117] "RemoveContainer" containerID="835bc98714a6e97552426e946c0ac423bbe9bf03720ab39f3367fa89f15aab76" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.623618 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"835bc98714a6e97552426e946c0ac423bbe9bf03720ab39f3367fa89f15aab76"} err="failed to get container status \"835bc98714a6e97552426e946c0ac423bbe9bf03720ab39f3367fa89f15aab76\": rpc error: code = NotFound desc = could not find container \"835bc98714a6e97552426e946c0ac423bbe9bf03720ab39f3367fa89f15aab76\": container with ID starting with 835bc98714a6e97552426e946c0ac423bbe9bf03720ab39f3367fa89f15aab76 not found: ID does not exist" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.623701 4922 scope.go:117] "RemoveContainer" containerID="2ae17157a2140b85df784a16c83050e950d8404eda7b113559c9efe223dfbe70" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.624265 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ae17157a2140b85df784a16c83050e950d8404eda7b113559c9efe223dfbe70"} err="failed to get container status \"2ae17157a2140b85df784a16c83050e950d8404eda7b113559c9efe223dfbe70\": rpc error: code = NotFound desc = could not find container \"2ae17157a2140b85df784a16c83050e950d8404eda7b113559c9efe223dfbe70\": container with ID starting with 2ae17157a2140b85df784a16c83050e950d8404eda7b113559c9efe223dfbe70 not found: ID does not exist" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.624315 4922 scope.go:117] "RemoveContainer" containerID="18603f57c476c794b60782a62c599416a12223cbfdc85f192b672b9863ff9960" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.625132 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18603f57c476c794b60782a62c599416a12223cbfdc85f192b672b9863ff9960"} err="failed to get container status \"18603f57c476c794b60782a62c599416a12223cbfdc85f192b672b9863ff9960\": rpc error: code = NotFound desc = could not find container \"18603f57c476c794b60782a62c599416a12223cbfdc85f192b672b9863ff9960\": container with ID starting with 18603f57c476c794b60782a62c599416a12223cbfdc85f192b672b9863ff9960 not found: ID does not exist" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.625160 4922 scope.go:117] "RemoveContainer" containerID="1cf35abef1b3cac7c7050a07cfe8d918b84546eff6314fd9c305423c8dd9c184" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.625667 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cf35abef1b3cac7c7050a07cfe8d918b84546eff6314fd9c305423c8dd9c184"} err="failed to get container status \"1cf35abef1b3cac7c7050a07cfe8d918b84546eff6314fd9c305423c8dd9c184\": rpc error: code = NotFound desc = could not find container \"1cf35abef1b3cac7c7050a07cfe8d918b84546eff6314fd9c305423c8dd9c184\": container with ID 
starting with 1cf35abef1b3cac7c7050a07cfe8d918b84546eff6314fd9c305423c8dd9c184 not found: ID does not exist" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.625690 4922 scope.go:117] "RemoveContainer" containerID="835bc98714a6e97552426e946c0ac423bbe9bf03720ab39f3367fa89f15aab76" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.626106 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"835bc98714a6e97552426e946c0ac423bbe9bf03720ab39f3367fa89f15aab76"} err="failed to get container status \"835bc98714a6e97552426e946c0ac423bbe9bf03720ab39f3367fa89f15aab76\": rpc error: code = NotFound desc = could not find container \"835bc98714a6e97552426e946c0ac423bbe9bf03720ab39f3367fa89f15aab76\": container with ID starting with 835bc98714a6e97552426e946c0ac423bbe9bf03720ab39f3367fa89f15aab76 not found: ID does not exist" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.626129 4922 scope.go:117] "RemoveContainer" containerID="2ae17157a2140b85df784a16c83050e950d8404eda7b113559c9efe223dfbe70" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.627697 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ae17157a2140b85df784a16c83050e950d8404eda7b113559c9efe223dfbe70"} err="failed to get container status \"2ae17157a2140b85df784a16c83050e950d8404eda7b113559c9efe223dfbe70\": rpc error: code = NotFound desc = could not find container \"2ae17157a2140b85df784a16c83050e950d8404eda7b113559c9efe223dfbe70\": container with ID starting with 2ae17157a2140b85df784a16c83050e950d8404eda7b113559c9efe223dfbe70 not found: ID does not exist" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.627722 4922 scope.go:117] "RemoveContainer" containerID="18603f57c476c794b60782a62c599416a12223cbfdc85f192b672b9863ff9960" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.628016 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18603f57c476c794b60782a62c599416a12223cbfdc85f192b672b9863ff9960"} err="failed to get container status \"18603f57c476c794b60782a62c599416a12223cbfdc85f192b672b9863ff9960\": rpc error: code = NotFound desc = could not find container \"18603f57c476c794b60782a62c599416a12223cbfdc85f192b672b9863ff9960\": container with ID starting with 18603f57c476c794b60782a62c599416a12223cbfdc85f192b672b9863ff9960 not found: ID does not exist" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.628035 4922 scope.go:117] "RemoveContainer" containerID="1cf35abef1b3cac7c7050a07cfe8d918b84546eff6314fd9c305423c8dd9c184" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.631975 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cf35abef1b3cac7c7050a07cfe8d918b84546eff6314fd9c305423c8dd9c184"} err="failed to get container status \"1cf35abef1b3cac7c7050a07cfe8d918b84546eff6314fd9c305423c8dd9c184\": rpc error: code = NotFound desc = could not find container \"1cf35abef1b3cac7c7050a07cfe8d918b84546eff6314fd9c305423c8dd9c184\": container with ID starting with 1cf35abef1b3cac7c7050a07cfe8d918b84546eff6314fd9c305423c8dd9c184 not found: ID does not exist" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.632017 4922 scope.go:117] "RemoveContainer" containerID="835bc98714a6e97552426e946c0ac423bbe9bf03720ab39f3367fa89f15aab76" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.636681 4922 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"835bc98714a6e97552426e946c0ac423bbe9bf03720ab39f3367fa89f15aab76"} err="failed to get container status \"835bc98714a6e97552426e946c0ac423bbe9bf03720ab39f3367fa89f15aab76\": rpc error: code = NotFound desc = could not find container \"835bc98714a6e97552426e946c0ac423bbe9bf03720ab39f3367fa89f15aab76\": container with ID starting with 835bc98714a6e97552426e946c0ac423bbe9bf03720ab39f3367fa89f15aab76 not found: ID does not exist" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.636875 4922 scope.go:117] "RemoveContainer" containerID="2ae17157a2140b85df784a16c83050e950d8404eda7b113559c9efe223dfbe70" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.637468 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ae17157a2140b85df784a16c83050e950d8404eda7b113559c9efe223dfbe70"} err="failed to get container status \"2ae17157a2140b85df784a16c83050e950d8404eda7b113559c9efe223dfbe70\": rpc error: code = NotFound desc = could not find container \"2ae17157a2140b85df784a16c83050e950d8404eda7b113559c9efe223dfbe70\": container with ID starting with 2ae17157a2140b85df784a16c83050e950d8404eda7b113559c9efe223dfbe70 not found: ID does not exist" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.816356 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.827992 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.842644 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:11:15 crc kubenswrapper[4922]: E1122 03:11:15.843191 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fe05513-2eda-43de-b464-3731563f64a5" containerName="ceilometer-notification-agent" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.843208 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fe05513-2eda-43de-b464-3731563f64a5" containerName="ceilometer-notification-agent" Nov 22 03:11:15 crc kubenswrapper[4922]: E1122 03:11:15.843222 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fe05513-2eda-43de-b464-3731563f64a5" containerName="proxy-httpd" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.843228 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fe05513-2eda-43de-b464-3731563f64a5" containerName="proxy-httpd" Nov 22 03:11:15 crc kubenswrapper[4922]: E1122 03:11:15.843238 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fe05513-2eda-43de-b464-3731563f64a5" containerName="ceilometer-central-agent" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.843244 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fe05513-2eda-43de-b464-3731563f64a5" containerName="ceilometer-central-agent" Nov 22 03:11:15 crc kubenswrapper[4922]: E1122 03:11:15.843251 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fe05513-2eda-43de-b464-3731563f64a5" containerName="sg-core" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.843259 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fe05513-2eda-43de-b464-3731563f64a5" containerName="sg-core" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.843425 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fe05513-2eda-43de-b464-3731563f64a5" containerName="proxy-httpd" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.843440 4922 
memory_manager.go:354] "RemoveStaleState removing state" podUID="7fe05513-2eda-43de-b464-3731563f64a5" containerName="ceilometer-central-agent" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.843452 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fe05513-2eda-43de-b464-3731563f64a5" containerName="ceilometer-notification-agent" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.843461 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fe05513-2eda-43de-b464-3731563f64a5" containerName="sg-core" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.852507 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.856152 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.856754 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.864959 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.958357 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9f20fb6f-76af-41c5-99a1-e13510351578-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9f20fb6f-76af-41c5-99a1-e13510351578\") " pod="openstack/ceilometer-0" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.958411 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9f20fb6f-76af-41c5-99a1-e13510351578-log-httpd\") pod \"ceilometer-0\" (UID: \"9f20fb6f-76af-41c5-99a1-e13510351578\") " pod="openstack/ceilometer-0" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.958433 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f20fb6f-76af-41c5-99a1-e13510351578-config-data\") pod \"ceilometer-0\" (UID: \"9f20fb6f-76af-41c5-99a1-e13510351578\") " pod="openstack/ceilometer-0" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.958450 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f20fb6f-76af-41c5-99a1-e13510351578-scripts\") pod \"ceilometer-0\" (UID: \"9f20fb6f-76af-41c5-99a1-e13510351578\") " pod="openstack/ceilometer-0" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.959201 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f20fb6f-76af-41c5-99a1-e13510351578-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9f20fb6f-76af-41c5-99a1-e13510351578\") " pod="openstack/ceilometer-0" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.959883 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gk5x\" (UniqueName: \"kubernetes.io/projected/9f20fb6f-76af-41c5-99a1-e13510351578-kube-api-access-7gk5x\") pod \"ceilometer-0\" (UID: \"9f20fb6f-76af-41c5-99a1-e13510351578\") " pod="openstack/ceilometer-0" Nov 22 03:11:15 crc kubenswrapper[4922]: I1122 03:11:15.959940 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9f20fb6f-76af-41c5-99a1-e13510351578-run-httpd\") pod \"ceilometer-0\" (UID: \"9f20fb6f-76af-41c5-99a1-e13510351578\") " pod="openstack/ceilometer-0" Nov 22 03:11:16 crc kubenswrapper[4922]: I1122 03:11:16.062088 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9f20fb6f-76af-41c5-99a1-e13510351578-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9f20fb6f-76af-41c5-99a1-e13510351578\") " pod="openstack/ceilometer-0" Nov 22 03:11:16 crc kubenswrapper[4922]: I1122 03:11:16.062156 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9f20fb6f-76af-41c5-99a1-e13510351578-log-httpd\") pod \"ceilometer-0\" (UID: \"9f20fb6f-76af-41c5-99a1-e13510351578\") " pod="openstack/ceilometer-0" Nov 22 03:11:16 crc kubenswrapper[4922]: I1122 03:11:16.062871 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f20fb6f-76af-41c5-99a1-e13510351578-config-data\") pod \"ceilometer-0\" (UID: \"9f20fb6f-76af-41c5-99a1-e13510351578\") " pod="openstack/ceilometer-0" Nov 22 03:11:16 crc kubenswrapper[4922]: I1122 03:11:16.062902 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f20fb6f-76af-41c5-99a1-e13510351578-scripts\") pod \"ceilometer-0\" (UID: \"9f20fb6f-76af-41c5-99a1-e13510351578\") " pod="openstack/ceilometer-0" Nov 22 03:11:16 crc kubenswrapper[4922]: I1122 03:11:16.062954 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f20fb6f-76af-41c5-99a1-e13510351578-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9f20fb6f-76af-41c5-99a1-e13510351578\") " pod="openstack/ceilometer-0" Nov 22 03:11:16 crc kubenswrapper[4922]: I1122 03:11:16.062971 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gk5x\" (UniqueName: \"kubernetes.io/projected/9f20fb6f-76af-41c5-99a1-e13510351578-kube-api-access-7gk5x\") pod \"ceilometer-0\" (UID: \"9f20fb6f-76af-41c5-99a1-e13510351578\") " pod="openstack/ceilometer-0" Nov 22 03:11:16 crc kubenswrapper[4922]: I1122 03:11:16.062999 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9f20fb6f-76af-41c5-99a1-e13510351578-run-httpd\") pod \"ceilometer-0\" (UID: \"9f20fb6f-76af-41c5-99a1-e13510351578\") " pod="openstack/ceilometer-0" Nov 22 03:11:16 crc kubenswrapper[4922]: I1122 03:11:16.063164 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9f20fb6f-76af-41c5-99a1-e13510351578-log-httpd\") pod \"ceilometer-0\" (UID: \"9f20fb6f-76af-41c5-99a1-e13510351578\") " pod="openstack/ceilometer-0" Nov 22 03:11:16 crc kubenswrapper[4922]: I1122 03:11:16.064090 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9f20fb6f-76af-41c5-99a1-e13510351578-run-httpd\") pod \"ceilometer-0\" (UID: \"9f20fb6f-76af-41c5-99a1-e13510351578\") " pod="openstack/ceilometer-0" Nov 22 03:11:16 crc kubenswrapper[4922]: I1122 03:11:16.070147 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f20fb6f-76af-41c5-99a1-e13510351578-scripts\") pod \"ceilometer-0\" (UID: \"9f20fb6f-76af-41c5-99a1-e13510351578\") " pod="openstack/ceilometer-0" Nov 22 03:11:16 crc kubenswrapper[4922]: I1122 03:11:16.070228 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f20fb6f-76af-41c5-99a1-e13510351578-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9f20fb6f-76af-41c5-99a1-e13510351578\") " pod="openstack/ceilometer-0" Nov 22 03:11:16 crc kubenswrapper[4922]: I1122 03:11:16.070470 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f20fb6f-76af-41c5-99a1-e13510351578-config-data\") pod \"ceilometer-0\" (UID: \"9f20fb6f-76af-41c5-99a1-e13510351578\") " pod="openstack/ceilometer-0" Nov 22 03:11:16 crc kubenswrapper[4922]: I1122 03:11:16.070478 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9f20fb6f-76af-41c5-99a1-e13510351578-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9f20fb6f-76af-41c5-99a1-e13510351578\") " pod="openstack/ceilometer-0" Nov 22 03:11:16 crc kubenswrapper[4922]: I1122 03:11:16.090346 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gk5x\" (UniqueName: \"kubernetes.io/projected/9f20fb6f-76af-41c5-99a1-e13510351578-kube-api-access-7gk5x\") pod \"ceilometer-0\" (UID: \"9f20fb6f-76af-41c5-99a1-e13510351578\") " pod="openstack/ceilometer-0" Nov 22 03:11:16 crc kubenswrapper[4922]: I1122 03:11:16.189690 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 03:11:16 crc kubenswrapper[4922]: I1122 03:11:16.512289 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"85809174-7801-455c-8ce6-82f34307147b","Type":"ContainerStarted","Data":"403d62484d08c063d2a39f5b18fd24e1591e748f551423ae27516bccf0ed990d"} Nov 22 03:11:16 crc kubenswrapper[4922]: I1122 03:11:16.537397 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.53737969 podStartE2EDuration="3.53737969s" podCreationTimestamp="2025-11-22 03:11:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:11:16.52780777 +0000 UTC m=+1112.566329672" watchObservedRunningTime="2025-11-22 03:11:16.53737969 +0000 UTC m=+1112.575901582" Nov 22 03:11:16 crc kubenswrapper[4922]: I1122 03:11:16.623557 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Nov 22 03:11:16 crc kubenswrapper[4922]: I1122 03:11:16.683759 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:11:16 crc kubenswrapper[4922]: I1122 03:11:16.789739 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:11:17 crc kubenswrapper[4922]: I1122 03:11:17.317281 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fe05513-2eda-43de-b464-3731563f64a5" path="/var/lib/kubelet/pods/7fe05513-2eda-43de-b464-3731563f64a5/volumes" Nov 22 03:11:19 crc kubenswrapper[4922]: I1122 03:11:19.119647 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Nov 22 03:11:19 crc 
kubenswrapper[4922]: I1122 03:11:19.577392 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-f7d8bcbcb-r2n98" Nov 22 03:11:21 crc kubenswrapper[4922]: I1122 03:11:21.567971 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9f20fb6f-76af-41c5-99a1-e13510351578","Type":"ContainerStarted","Data":"7060a877d83463cbbec742ee344c5ca62d6180bc5b18807a85292b5dd9c58d73"} Nov 22 03:11:21 crc kubenswrapper[4922]: I1122 03:11:21.568409 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9f20fb6f-76af-41c5-99a1-e13510351578","Type":"ContainerStarted","Data":"a779b7afabf25979f1f3b038a752a46b6c2eccc1852065f45047a794d76010e9"} Nov 22 03:11:21 crc kubenswrapper[4922]: I1122 03:11:21.569958 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-h8fkq" event={"ID":"5b0138cd-b295-44c3-9f0b-1d5de2c8d144","Type":"ContainerStarted","Data":"4d9b604e74096d115ac988a940295dc538891acaadf8c4dca4ccb3e2d3e1f9c6"} Nov 22 03:11:21 crc kubenswrapper[4922]: I1122 03:11:21.599597 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-h8fkq" podStartSLOduration=1.986492961 podStartE2EDuration="11.599579315s" podCreationTimestamp="2025-11-22 03:11:10 +0000 UTC" firstStartedPulling="2025-11-22 03:11:11.322883874 +0000 UTC m=+1107.361405766" lastFinishedPulling="2025-11-22 03:11:20.935970238 +0000 UTC m=+1116.974492120" observedRunningTime="2025-11-22 03:11:21.59560767 +0000 UTC m=+1117.634129602" watchObservedRunningTime="2025-11-22 03:11:21.599579315 +0000 UTC m=+1117.638101207" Nov 22 03:11:22 crc kubenswrapper[4922]: I1122 03:11:22.328368 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-655bcfccf7-54vbt" Nov 22 03:11:22 crc kubenswrapper[4922]: I1122 03:11:22.408578 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-f7d8bcbcb-r2n98"] Nov 22 03:11:22 crc kubenswrapper[4922]: I1122 03:11:22.409011 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-f7d8bcbcb-r2n98" podUID="36feb788-6145-4d72-b9b2-93a3557704b4" containerName="neutron-httpd" containerID="cri-o://4539b8421ec795772c4e11556c6d2b3032d0278527c4976eb1c1092a96a32a4e" gracePeriod=30 Nov 22 03:11:22 crc kubenswrapper[4922]: I1122 03:11:22.409390 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-f7d8bcbcb-r2n98" podUID="36feb788-6145-4d72-b9b2-93a3557704b4" containerName="neutron-api" containerID="cri-o://cf91d2200e85eb82afcdd14f2637345a4eea69b297682bfff35a58a992b5c444" gracePeriod=30 Nov 22 03:11:22 crc kubenswrapper[4922]: I1122 03:11:22.580560 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9f20fb6f-76af-41c5-99a1-e13510351578","Type":"ContainerStarted","Data":"75e44f87204fa2d0ffe9f8b72e5e835ffc930074cf59c83d32187c57029a669b"} Nov 22 03:11:23 crc kubenswrapper[4922]: I1122 03:11:23.593084 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9f20fb6f-76af-41c5-99a1-e13510351578","Type":"ContainerStarted","Data":"ed5d0b3a64c855b675bbdf15fee85a44af7aed9c8c020dcf8fe44696828eab9d"} Nov 22 03:11:23 crc kubenswrapper[4922]: I1122 03:11:23.595708 4922 generic.go:334] "Generic (PLEG): container finished" podID="36feb788-6145-4d72-b9b2-93a3557704b4" 
containerID="4539b8421ec795772c4e11556c6d2b3032d0278527c4976eb1c1092a96a32a4e" exitCode=0 Nov 22 03:11:23 crc kubenswrapper[4922]: I1122 03:11:23.595744 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f7d8bcbcb-r2n98" event={"ID":"36feb788-6145-4d72-b9b2-93a3557704b4","Type":"ContainerDied","Data":"4539b8421ec795772c4e11556c6d2b3032d0278527c4976eb1c1092a96a32a4e"} Nov 22 03:11:24 crc kubenswrapper[4922]: I1122 03:11:24.322331 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Nov 22 03:11:25 crc kubenswrapper[4922]: I1122 03:11:25.618582 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9f20fb6f-76af-41c5-99a1-e13510351578","Type":"ContainerStarted","Data":"ce5834d4e05bd2340fe758cc347f29e803dee33b1791b6d604b0f4f729e56f3c"} Nov 22 03:11:25 crc kubenswrapper[4922]: I1122 03:11:25.619212 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 22 03:11:25 crc kubenswrapper[4922]: I1122 03:11:25.619043 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9f20fb6f-76af-41c5-99a1-e13510351578" containerName="ceilometer-central-agent" containerID="cri-o://7060a877d83463cbbec742ee344c5ca62d6180bc5b18807a85292b5dd9c58d73" gracePeriod=30 Nov 22 03:11:25 crc kubenswrapper[4922]: I1122 03:11:25.619395 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9f20fb6f-76af-41c5-99a1-e13510351578" containerName="proxy-httpd" containerID="cri-o://ce5834d4e05bd2340fe758cc347f29e803dee33b1791b6d604b0f4f729e56f3c" gracePeriod=30 Nov 22 03:11:25 crc kubenswrapper[4922]: I1122 03:11:25.619600 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9f20fb6f-76af-41c5-99a1-e13510351578" containerName="sg-core" containerID="cri-o://ed5d0b3a64c855b675bbdf15fee85a44af7aed9c8c020dcf8fe44696828eab9d" gracePeriod=30 Nov 22 03:11:25 crc kubenswrapper[4922]: I1122 03:11:25.619584 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9f20fb6f-76af-41c5-99a1-e13510351578" containerName="ceilometer-notification-agent" containerID="cri-o://75e44f87204fa2d0ffe9f8b72e5e835ffc930074cf59c83d32187c57029a669b" gracePeriod=30 Nov 22 03:11:25 crc kubenswrapper[4922]: I1122 03:11:25.660624 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=6.8353206459999996 podStartE2EDuration="10.660602631s" podCreationTimestamp="2025-11-22 03:11:15 +0000 UTC" firstStartedPulling="2025-11-22 03:11:20.868796528 +0000 UTC m=+1116.907318460" lastFinishedPulling="2025-11-22 03:11:24.694078533 +0000 UTC m=+1120.732600445" observedRunningTime="2025-11-22 03:11:25.652734582 +0000 UTC m=+1121.691256514" watchObservedRunningTime="2025-11-22 03:11:25.660602631 +0000 UTC m=+1121.699124523" Nov 22 03:11:26 crc kubenswrapper[4922]: I1122 03:11:26.633908 4922 generic.go:334] "Generic (PLEG): container finished" podID="9f20fb6f-76af-41c5-99a1-e13510351578" containerID="ce5834d4e05bd2340fe758cc347f29e803dee33b1791b6d604b0f4f729e56f3c" exitCode=0 Nov 22 03:11:26 crc kubenswrapper[4922]: I1122 03:11:26.633955 4922 generic.go:334] "Generic (PLEG): container finished" podID="9f20fb6f-76af-41c5-99a1-e13510351578" containerID="ed5d0b3a64c855b675bbdf15fee85a44af7aed9c8c020dcf8fe44696828eab9d" 
exitCode=2 Nov 22 03:11:26 crc kubenswrapper[4922]: I1122 03:11:26.633974 4922 generic.go:334] "Generic (PLEG): container finished" podID="9f20fb6f-76af-41c5-99a1-e13510351578" containerID="75e44f87204fa2d0ffe9f8b72e5e835ffc930074cf59c83d32187c57029a669b" exitCode=0 Nov 22 03:11:26 crc kubenswrapper[4922]: I1122 03:11:26.634004 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9f20fb6f-76af-41c5-99a1-e13510351578","Type":"ContainerDied","Data":"ce5834d4e05bd2340fe758cc347f29e803dee33b1791b6d604b0f4f729e56f3c"} Nov 22 03:11:26 crc kubenswrapper[4922]: I1122 03:11:26.634043 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9f20fb6f-76af-41c5-99a1-e13510351578","Type":"ContainerDied","Data":"ed5d0b3a64c855b675bbdf15fee85a44af7aed9c8c020dcf8fe44696828eab9d"} Nov 22 03:11:26 crc kubenswrapper[4922]: I1122 03:11:26.634062 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9f20fb6f-76af-41c5-99a1-e13510351578","Type":"ContainerDied","Data":"75e44f87204fa2d0ffe9f8b72e5e835ffc930074cf59c83d32187c57029a669b"} Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.225262 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.377468 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9f20fb6f-76af-41c5-99a1-e13510351578-log-httpd\") pod \"9f20fb6f-76af-41c5-99a1-e13510351578\" (UID: \"9f20fb6f-76af-41c5-99a1-e13510351578\") " Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.377511 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f20fb6f-76af-41c5-99a1-e13510351578-combined-ca-bundle\") pod \"9f20fb6f-76af-41c5-99a1-e13510351578\" (UID: \"9f20fb6f-76af-41c5-99a1-e13510351578\") " Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.377533 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gk5x\" (UniqueName: \"kubernetes.io/projected/9f20fb6f-76af-41c5-99a1-e13510351578-kube-api-access-7gk5x\") pod \"9f20fb6f-76af-41c5-99a1-e13510351578\" (UID: \"9f20fb6f-76af-41c5-99a1-e13510351578\") " Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.377554 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f20fb6f-76af-41c5-99a1-e13510351578-scripts\") pod \"9f20fb6f-76af-41c5-99a1-e13510351578\" (UID: \"9f20fb6f-76af-41c5-99a1-e13510351578\") " Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.377579 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9f20fb6f-76af-41c5-99a1-e13510351578-sg-core-conf-yaml\") pod \"9f20fb6f-76af-41c5-99a1-e13510351578\" (UID: \"9f20fb6f-76af-41c5-99a1-e13510351578\") " Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.377625 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9f20fb6f-76af-41c5-99a1-e13510351578-run-httpd\") pod \"9f20fb6f-76af-41c5-99a1-e13510351578\" (UID: \"9f20fb6f-76af-41c5-99a1-e13510351578\") " Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.377654 4922 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f20fb6f-76af-41c5-99a1-e13510351578-config-data\") pod \"9f20fb6f-76af-41c5-99a1-e13510351578\" (UID: \"9f20fb6f-76af-41c5-99a1-e13510351578\") " Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.379254 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f20fb6f-76af-41c5-99a1-e13510351578-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9f20fb6f-76af-41c5-99a1-e13510351578" (UID: "9f20fb6f-76af-41c5-99a1-e13510351578"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.379952 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f20fb6f-76af-41c5-99a1-e13510351578-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9f20fb6f-76af-41c5-99a1-e13510351578" (UID: "9f20fb6f-76af-41c5-99a1-e13510351578"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.384395 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f20fb6f-76af-41c5-99a1-e13510351578-scripts" (OuterVolumeSpecName: "scripts") pod "9f20fb6f-76af-41c5-99a1-e13510351578" (UID: "9f20fb6f-76af-41c5-99a1-e13510351578"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.385167 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f20fb6f-76af-41c5-99a1-e13510351578-kube-api-access-7gk5x" (OuterVolumeSpecName: "kube-api-access-7gk5x") pod "9f20fb6f-76af-41c5-99a1-e13510351578" (UID: "9f20fb6f-76af-41c5-99a1-e13510351578"). InnerVolumeSpecName "kube-api-access-7gk5x". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.424091 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f20fb6f-76af-41c5-99a1-e13510351578-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9f20fb6f-76af-41c5-99a1-e13510351578" (UID: "9f20fb6f-76af-41c5-99a1-e13510351578"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.479065 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f20fb6f-76af-41c5-99a1-e13510351578-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f20fb6f-76af-41c5-99a1-e13510351578" (UID: "9f20fb6f-76af-41c5-99a1-e13510351578"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.479588 4922 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9f20fb6f-76af-41c5-99a1-e13510351578-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.479625 4922 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9f20fb6f-76af-41c5-99a1-e13510351578-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.479643 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f20fb6f-76af-41c5-99a1-e13510351578-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.479663 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gk5x\" (UniqueName: \"kubernetes.io/projected/9f20fb6f-76af-41c5-99a1-e13510351578-kube-api-access-7gk5x\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.479682 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f20fb6f-76af-41c5-99a1-e13510351578-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.479698 4922 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9f20fb6f-76af-41c5-99a1-e13510351578-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.493237 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f20fb6f-76af-41c5-99a1-e13510351578-config-data" (OuterVolumeSpecName: "config-data") pod "9f20fb6f-76af-41c5-99a1-e13510351578" (UID: "9f20fb6f-76af-41c5-99a1-e13510351578"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.580613 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f20fb6f-76af-41c5-99a1-e13510351578-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.648492 4922 generic.go:334] "Generic (PLEG): container finished" podID="9f20fb6f-76af-41c5-99a1-e13510351578" containerID="7060a877d83463cbbec742ee344c5ca62d6180bc5b18807a85292b5dd9c58d73" exitCode=0 Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.648551 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9f20fb6f-76af-41c5-99a1-e13510351578","Type":"ContainerDied","Data":"7060a877d83463cbbec742ee344c5ca62d6180bc5b18807a85292b5dd9c58d73"} Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.648602 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.648628 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9f20fb6f-76af-41c5-99a1-e13510351578","Type":"ContainerDied","Data":"a779b7afabf25979f1f3b038a752a46b6c2eccc1852065f45047a794d76010e9"} Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.648662 4922 scope.go:117] "RemoveContainer" containerID="ce5834d4e05bd2340fe758cc347f29e803dee33b1791b6d604b0f4f729e56f3c" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.695031 4922 scope.go:117] "RemoveContainer" containerID="ed5d0b3a64c855b675bbdf15fee85a44af7aed9c8c020dcf8fe44696828eab9d" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.703191 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.726989 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.738446 4922 scope.go:117] "RemoveContainer" containerID="75e44f87204fa2d0ffe9f8b72e5e835ffc930074cf59c83d32187c57029a669b" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.740620 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:11:27 crc kubenswrapper[4922]: E1122 03:11:27.741264 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f20fb6f-76af-41c5-99a1-e13510351578" containerName="ceilometer-central-agent" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.741301 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f20fb6f-76af-41c5-99a1-e13510351578" containerName="ceilometer-central-agent" Nov 22 03:11:27 crc kubenswrapper[4922]: E1122 03:11:27.741324 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f20fb6f-76af-41c5-99a1-e13510351578" containerName="sg-core" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.741335 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f20fb6f-76af-41c5-99a1-e13510351578" containerName="sg-core" Nov 22 03:11:27 crc kubenswrapper[4922]: E1122 03:11:27.741363 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f20fb6f-76af-41c5-99a1-e13510351578" containerName="ceilometer-notification-agent" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.741374 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f20fb6f-76af-41c5-99a1-e13510351578" containerName="ceilometer-notification-agent" Nov 22 03:11:27 crc kubenswrapper[4922]: E1122 03:11:27.741413 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f20fb6f-76af-41c5-99a1-e13510351578" containerName="proxy-httpd" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.741455 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f20fb6f-76af-41c5-99a1-e13510351578" containerName="proxy-httpd" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.741708 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f20fb6f-76af-41c5-99a1-e13510351578" containerName="ceilometer-notification-agent" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.741740 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f20fb6f-76af-41c5-99a1-e13510351578" containerName="sg-core" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.741761 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f20fb6f-76af-41c5-99a1-e13510351578" 
containerName="ceilometer-central-agent" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.741773 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f20fb6f-76af-41c5-99a1-e13510351578" containerName="proxy-httpd" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.744218 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.748952 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.749294 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.771683 4922 scope.go:117] "RemoveContainer" containerID="7060a877d83463cbbec742ee344c5ca62d6180bc5b18807a85292b5dd9c58d73" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.771741 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.796590 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/07379147-ac76-4a20-bcdc-dc13fceaaabc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"07379147-ac76-4a20-bcdc-dc13fceaaabc\") " pod="openstack/ceilometer-0" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.796645 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07379147-ac76-4a20-bcdc-dc13fceaaabc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"07379147-ac76-4a20-bcdc-dc13fceaaabc\") " pod="openstack/ceilometer-0" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.796678 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07379147-ac76-4a20-bcdc-dc13fceaaabc-scripts\") pod \"ceilometer-0\" (UID: \"07379147-ac76-4a20-bcdc-dc13fceaaabc\") " pod="openstack/ceilometer-0" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.796704 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07379147-ac76-4a20-bcdc-dc13fceaaabc-run-httpd\") pod \"ceilometer-0\" (UID: \"07379147-ac76-4a20-bcdc-dc13fceaaabc\") " pod="openstack/ceilometer-0" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.796739 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07379147-ac76-4a20-bcdc-dc13fceaaabc-log-httpd\") pod \"ceilometer-0\" (UID: \"07379147-ac76-4a20-bcdc-dc13fceaaabc\") " pod="openstack/ceilometer-0" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.796802 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07379147-ac76-4a20-bcdc-dc13fceaaabc-config-data\") pod \"ceilometer-0\" (UID: \"07379147-ac76-4a20-bcdc-dc13fceaaabc\") " pod="openstack/ceilometer-0" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.796826 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6tlc\" (UniqueName: 
\"kubernetes.io/projected/07379147-ac76-4a20-bcdc-dc13fceaaabc-kube-api-access-w6tlc\") pod \"ceilometer-0\" (UID: \"07379147-ac76-4a20-bcdc-dc13fceaaabc\") " pod="openstack/ceilometer-0" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.833838 4922 scope.go:117] "RemoveContainer" containerID="ce5834d4e05bd2340fe758cc347f29e803dee33b1791b6d604b0f4f729e56f3c" Nov 22 03:11:27 crc kubenswrapper[4922]: E1122 03:11:27.834433 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce5834d4e05bd2340fe758cc347f29e803dee33b1791b6d604b0f4f729e56f3c\": container with ID starting with ce5834d4e05bd2340fe758cc347f29e803dee33b1791b6d604b0f4f729e56f3c not found: ID does not exist" containerID="ce5834d4e05bd2340fe758cc347f29e803dee33b1791b6d604b0f4f729e56f3c" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.834498 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce5834d4e05bd2340fe758cc347f29e803dee33b1791b6d604b0f4f729e56f3c"} err="failed to get container status \"ce5834d4e05bd2340fe758cc347f29e803dee33b1791b6d604b0f4f729e56f3c\": rpc error: code = NotFound desc = could not find container \"ce5834d4e05bd2340fe758cc347f29e803dee33b1791b6d604b0f4f729e56f3c\": container with ID starting with ce5834d4e05bd2340fe758cc347f29e803dee33b1791b6d604b0f4f729e56f3c not found: ID does not exist" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.834538 4922 scope.go:117] "RemoveContainer" containerID="ed5d0b3a64c855b675bbdf15fee85a44af7aed9c8c020dcf8fe44696828eab9d" Nov 22 03:11:27 crc kubenswrapper[4922]: E1122 03:11:27.835215 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed5d0b3a64c855b675bbdf15fee85a44af7aed9c8c020dcf8fe44696828eab9d\": container with ID starting with ed5d0b3a64c855b675bbdf15fee85a44af7aed9c8c020dcf8fe44696828eab9d not found: ID does not exist" containerID="ed5d0b3a64c855b675bbdf15fee85a44af7aed9c8c020dcf8fe44696828eab9d" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.835305 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed5d0b3a64c855b675bbdf15fee85a44af7aed9c8c020dcf8fe44696828eab9d"} err="failed to get container status \"ed5d0b3a64c855b675bbdf15fee85a44af7aed9c8c020dcf8fe44696828eab9d\": rpc error: code = NotFound desc = could not find container \"ed5d0b3a64c855b675bbdf15fee85a44af7aed9c8c020dcf8fe44696828eab9d\": container with ID starting with ed5d0b3a64c855b675bbdf15fee85a44af7aed9c8c020dcf8fe44696828eab9d not found: ID does not exist" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.835365 4922 scope.go:117] "RemoveContainer" containerID="75e44f87204fa2d0ffe9f8b72e5e835ffc930074cf59c83d32187c57029a669b" Nov 22 03:11:27 crc kubenswrapper[4922]: E1122 03:11:27.836007 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75e44f87204fa2d0ffe9f8b72e5e835ffc930074cf59c83d32187c57029a669b\": container with ID starting with 75e44f87204fa2d0ffe9f8b72e5e835ffc930074cf59c83d32187c57029a669b not found: ID does not exist" containerID="75e44f87204fa2d0ffe9f8b72e5e835ffc930074cf59c83d32187c57029a669b" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.836051 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75e44f87204fa2d0ffe9f8b72e5e835ffc930074cf59c83d32187c57029a669b"} err="failed to 
get container status \"75e44f87204fa2d0ffe9f8b72e5e835ffc930074cf59c83d32187c57029a669b\": rpc error: code = NotFound desc = could not find container \"75e44f87204fa2d0ffe9f8b72e5e835ffc930074cf59c83d32187c57029a669b\": container with ID starting with 75e44f87204fa2d0ffe9f8b72e5e835ffc930074cf59c83d32187c57029a669b not found: ID does not exist" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.836161 4922 scope.go:117] "RemoveContainer" containerID="7060a877d83463cbbec742ee344c5ca62d6180bc5b18807a85292b5dd9c58d73" Nov 22 03:11:27 crc kubenswrapper[4922]: E1122 03:11:27.836735 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7060a877d83463cbbec742ee344c5ca62d6180bc5b18807a85292b5dd9c58d73\": container with ID starting with 7060a877d83463cbbec742ee344c5ca62d6180bc5b18807a85292b5dd9c58d73 not found: ID does not exist" containerID="7060a877d83463cbbec742ee344c5ca62d6180bc5b18807a85292b5dd9c58d73" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.836808 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7060a877d83463cbbec742ee344c5ca62d6180bc5b18807a85292b5dd9c58d73"} err="failed to get container status \"7060a877d83463cbbec742ee344c5ca62d6180bc5b18807a85292b5dd9c58d73\": rpc error: code = NotFound desc = could not find container \"7060a877d83463cbbec742ee344c5ca62d6180bc5b18807a85292b5dd9c58d73\": container with ID starting with 7060a877d83463cbbec742ee344c5ca62d6180bc5b18807a85292b5dd9c58d73 not found: ID does not exist" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.898975 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07379147-ac76-4a20-bcdc-dc13fceaaabc-config-data\") pod \"ceilometer-0\" (UID: \"07379147-ac76-4a20-bcdc-dc13fceaaabc\") " pod="openstack/ceilometer-0" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.899026 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6tlc\" (UniqueName: \"kubernetes.io/projected/07379147-ac76-4a20-bcdc-dc13fceaaabc-kube-api-access-w6tlc\") pod \"ceilometer-0\" (UID: \"07379147-ac76-4a20-bcdc-dc13fceaaabc\") " pod="openstack/ceilometer-0" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.899121 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/07379147-ac76-4a20-bcdc-dc13fceaaabc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"07379147-ac76-4a20-bcdc-dc13fceaaabc\") " pod="openstack/ceilometer-0" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.899159 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07379147-ac76-4a20-bcdc-dc13fceaaabc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"07379147-ac76-4a20-bcdc-dc13fceaaabc\") " pod="openstack/ceilometer-0" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.899193 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07379147-ac76-4a20-bcdc-dc13fceaaabc-scripts\") pod \"ceilometer-0\" (UID: \"07379147-ac76-4a20-bcdc-dc13fceaaabc\") " pod="openstack/ceilometer-0" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.899230 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/07379147-ac76-4a20-bcdc-dc13fceaaabc-run-httpd\") pod \"ceilometer-0\" (UID: \"07379147-ac76-4a20-bcdc-dc13fceaaabc\") " pod="openstack/ceilometer-0" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.899269 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07379147-ac76-4a20-bcdc-dc13fceaaabc-log-httpd\") pod \"ceilometer-0\" (UID: \"07379147-ac76-4a20-bcdc-dc13fceaaabc\") " pod="openstack/ceilometer-0" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.899724 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07379147-ac76-4a20-bcdc-dc13fceaaabc-log-httpd\") pod \"ceilometer-0\" (UID: \"07379147-ac76-4a20-bcdc-dc13fceaaabc\") " pod="openstack/ceilometer-0" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.900454 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07379147-ac76-4a20-bcdc-dc13fceaaabc-run-httpd\") pod \"ceilometer-0\" (UID: \"07379147-ac76-4a20-bcdc-dc13fceaaabc\") " pod="openstack/ceilometer-0" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.903618 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/07379147-ac76-4a20-bcdc-dc13fceaaabc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"07379147-ac76-4a20-bcdc-dc13fceaaabc\") " pod="openstack/ceilometer-0" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.904335 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07379147-ac76-4a20-bcdc-dc13fceaaabc-config-data\") pod \"ceilometer-0\" (UID: \"07379147-ac76-4a20-bcdc-dc13fceaaabc\") " pod="openstack/ceilometer-0" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.904529 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07379147-ac76-4a20-bcdc-dc13fceaaabc-scripts\") pod \"ceilometer-0\" (UID: \"07379147-ac76-4a20-bcdc-dc13fceaaabc\") " pod="openstack/ceilometer-0" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.904565 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07379147-ac76-4a20-bcdc-dc13fceaaabc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"07379147-ac76-4a20-bcdc-dc13fceaaabc\") " pod="openstack/ceilometer-0" Nov 22 03:11:27 crc kubenswrapper[4922]: I1122 03:11:27.914353 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6tlc\" (UniqueName: \"kubernetes.io/projected/07379147-ac76-4a20-bcdc-dc13fceaaabc-kube-api-access-w6tlc\") pod \"ceilometer-0\" (UID: \"07379147-ac76-4a20-bcdc-dc13fceaaabc\") " pod="openstack/ceilometer-0" Nov 22 03:11:28 crc kubenswrapper[4922]: I1122 03:11:28.074924 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 22 03:11:28 crc kubenswrapper[4922]: I1122 03:11:28.601455 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:11:28 crc kubenswrapper[4922]: I1122 03:11:28.663564 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"07379147-ac76-4a20-bcdc-dc13fceaaabc","Type":"ContainerStarted","Data":"8a119571f5a89bd335b8d7b6704849ff683df123c4584a99e927a0208f612562"} Nov 22 03:11:29 crc kubenswrapper[4922]: I1122 03:11:29.312687 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f20fb6f-76af-41c5-99a1-e13510351578" path="/var/lib/kubelet/pods/9f20fb6f-76af-41c5-99a1-e13510351578/volumes" Nov 22 03:11:29 crc kubenswrapper[4922]: I1122 03:11:29.678275 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"07379147-ac76-4a20-bcdc-dc13fceaaabc","Type":"ContainerStarted","Data":"35bc425ba87ca523eb68fd449264576ca529b7063a2662d0cceea92c76b0a9d9"} Nov 22 03:11:30 crc kubenswrapper[4922]: I1122 03:11:30.687403 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"07379147-ac76-4a20-bcdc-dc13fceaaabc","Type":"ContainerStarted","Data":"2e5495a2cec9e3011bb6865e8f88624777bd0b56e6edf7452eab3590d846420b"} Nov 22 03:11:30 crc kubenswrapper[4922]: I1122 03:11:30.687704 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"07379147-ac76-4a20-bcdc-dc13fceaaabc","Type":"ContainerStarted","Data":"788589c18f9d24d3d0050d80f0262085550508caa9bdfde6126b1dea8af83c4c"} Nov 22 03:11:32 crc kubenswrapper[4922]: I1122 03:11:32.709106 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"07379147-ac76-4a20-bcdc-dc13fceaaabc","Type":"ContainerStarted","Data":"c01af40e744cf71456441bd5823419e79045d9e7d1127d39435a58abece52700"} Nov 22 03:11:32 crc kubenswrapper[4922]: I1122 03:11:32.709592 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 22 03:11:32 crc kubenswrapper[4922]: I1122 03:11:32.712469 4922 generic.go:334] "Generic (PLEG): container finished" podID="36feb788-6145-4d72-b9b2-93a3557704b4" containerID="cf91d2200e85eb82afcdd14f2637345a4eea69b297682bfff35a58a992b5c444" exitCode=0 Nov 22 03:11:32 crc kubenswrapper[4922]: I1122 03:11:32.712524 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f7d8bcbcb-r2n98" event={"ID":"36feb788-6145-4d72-b9b2-93a3557704b4","Type":"ContainerDied","Data":"cf91d2200e85eb82afcdd14f2637345a4eea69b297682bfff35a58a992b5c444"} Nov 22 03:11:32 crc kubenswrapper[4922]: I1122 03:11:32.716629 4922 generic.go:334] "Generic (PLEG): container finished" podID="5b0138cd-b295-44c3-9f0b-1d5de2c8d144" containerID="4d9b604e74096d115ac988a940295dc538891acaadf8c4dca4ccb3e2d3e1f9c6" exitCode=0 Nov 22 03:11:32 crc kubenswrapper[4922]: I1122 03:11:32.716692 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-h8fkq" event={"ID":"5b0138cd-b295-44c3-9f0b-1d5de2c8d144","Type":"ContainerDied","Data":"4d9b604e74096d115ac988a940295dc538891acaadf8c4dca4ccb3e2d3e1f9c6"} Nov 22 03:11:32 crc kubenswrapper[4922]: I1122 03:11:32.747709 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.763075131 podStartE2EDuration="5.747678384s" podCreationTimestamp="2025-11-22 03:11:27 +0000 UTC" 
firstStartedPulling="2025-11-22 03:11:28.607744586 +0000 UTC m=+1124.646266518" lastFinishedPulling="2025-11-22 03:11:31.592347869 +0000 UTC m=+1127.630869771" observedRunningTime="2025-11-22 03:11:32.73915988 +0000 UTC m=+1128.777681812" watchObservedRunningTime="2025-11-22 03:11:32.747678384 +0000 UTC m=+1128.786200316" Nov 22 03:11:33 crc kubenswrapper[4922]: I1122 03:11:33.224870 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f7d8bcbcb-r2n98" Nov 22 03:11:33 crc kubenswrapper[4922]: I1122 03:11:33.401657 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/36feb788-6145-4d72-b9b2-93a3557704b4-httpd-config\") pod \"36feb788-6145-4d72-b9b2-93a3557704b4\" (UID: \"36feb788-6145-4d72-b9b2-93a3557704b4\") " Nov 22 03:11:33 crc kubenswrapper[4922]: I1122 03:11:33.401745 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/36feb788-6145-4d72-b9b2-93a3557704b4-config\") pod \"36feb788-6145-4d72-b9b2-93a3557704b4\" (UID: \"36feb788-6145-4d72-b9b2-93a3557704b4\") " Nov 22 03:11:33 crc kubenswrapper[4922]: I1122 03:11:33.401772 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36feb788-6145-4d72-b9b2-93a3557704b4-combined-ca-bundle\") pod \"36feb788-6145-4d72-b9b2-93a3557704b4\" (UID: \"36feb788-6145-4d72-b9b2-93a3557704b4\") " Nov 22 03:11:33 crc kubenswrapper[4922]: I1122 03:11:33.401879 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/36feb788-6145-4d72-b9b2-93a3557704b4-ovndb-tls-certs\") pod \"36feb788-6145-4d72-b9b2-93a3557704b4\" (UID: \"36feb788-6145-4d72-b9b2-93a3557704b4\") " Nov 22 03:11:33 crc kubenswrapper[4922]: I1122 03:11:33.402000 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcs28\" (UniqueName: \"kubernetes.io/projected/36feb788-6145-4d72-b9b2-93a3557704b4-kube-api-access-bcs28\") pod \"36feb788-6145-4d72-b9b2-93a3557704b4\" (UID: \"36feb788-6145-4d72-b9b2-93a3557704b4\") " Nov 22 03:11:33 crc kubenswrapper[4922]: I1122 03:11:33.410175 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36feb788-6145-4d72-b9b2-93a3557704b4-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "36feb788-6145-4d72-b9b2-93a3557704b4" (UID: "36feb788-6145-4d72-b9b2-93a3557704b4"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:33 crc kubenswrapper[4922]: I1122 03:11:33.412164 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36feb788-6145-4d72-b9b2-93a3557704b4-kube-api-access-bcs28" (OuterVolumeSpecName: "kube-api-access-bcs28") pod "36feb788-6145-4d72-b9b2-93a3557704b4" (UID: "36feb788-6145-4d72-b9b2-93a3557704b4"). InnerVolumeSpecName "kube-api-access-bcs28". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:11:33 crc kubenswrapper[4922]: I1122 03:11:33.454261 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36feb788-6145-4d72-b9b2-93a3557704b4-config" (OuterVolumeSpecName: "config") pod "36feb788-6145-4d72-b9b2-93a3557704b4" (UID: "36feb788-6145-4d72-b9b2-93a3557704b4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:33 crc kubenswrapper[4922]: I1122 03:11:33.467255 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36feb788-6145-4d72-b9b2-93a3557704b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "36feb788-6145-4d72-b9b2-93a3557704b4" (UID: "36feb788-6145-4d72-b9b2-93a3557704b4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:33 crc kubenswrapper[4922]: I1122 03:11:33.482453 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36feb788-6145-4d72-b9b2-93a3557704b4-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "36feb788-6145-4d72-b9b2-93a3557704b4" (UID: "36feb788-6145-4d72-b9b2-93a3557704b4"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:33 crc kubenswrapper[4922]: I1122 03:11:33.503622 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcs28\" (UniqueName: \"kubernetes.io/projected/36feb788-6145-4d72-b9b2-93a3557704b4-kube-api-access-bcs28\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:33 crc kubenswrapper[4922]: I1122 03:11:33.503664 4922 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/36feb788-6145-4d72-b9b2-93a3557704b4-httpd-config\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:33 crc kubenswrapper[4922]: I1122 03:11:33.503677 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/36feb788-6145-4d72-b9b2-93a3557704b4-config\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:33 crc kubenswrapper[4922]: I1122 03:11:33.503713 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36feb788-6145-4d72-b9b2-93a3557704b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:33 crc kubenswrapper[4922]: I1122 03:11:33.503721 4922 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/36feb788-6145-4d72-b9b2-93a3557704b4-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:33 crc kubenswrapper[4922]: I1122 03:11:33.733713 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f7d8bcbcb-r2n98" event={"ID":"36feb788-6145-4d72-b9b2-93a3557704b4","Type":"ContainerDied","Data":"8e66c307ac52c33adef1f5de4a52b8e87c210833524826b874fec3fa011179f8"} Nov 22 03:11:33 crc kubenswrapper[4922]: I1122 03:11:33.733765 4922 scope.go:117] "RemoveContainer" containerID="4539b8421ec795772c4e11556c6d2b3032d0278527c4976eb1c1092a96a32a4e" Nov 22 03:11:33 crc kubenswrapper[4922]: I1122 03:11:33.733892 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-f7d8bcbcb-r2n98" Nov 22 03:11:33 crc kubenswrapper[4922]: I1122 03:11:33.771017 4922 scope.go:117] "RemoveContainer" containerID="cf91d2200e85eb82afcdd14f2637345a4eea69b297682bfff35a58a992b5c444" Nov 22 03:11:33 crc kubenswrapper[4922]: I1122 03:11:33.775030 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-f7d8bcbcb-r2n98"] Nov 22 03:11:33 crc kubenswrapper[4922]: I1122 03:11:33.783566 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-f7d8bcbcb-r2n98"] Nov 22 03:11:34 crc kubenswrapper[4922]: I1122 03:11:34.082239 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-h8fkq" Nov 22 03:11:34 crc kubenswrapper[4922]: I1122 03:11:34.213214 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b0138cd-b295-44c3-9f0b-1d5de2c8d144-scripts\") pod \"5b0138cd-b295-44c3-9f0b-1d5de2c8d144\" (UID: \"5b0138cd-b295-44c3-9f0b-1d5de2c8d144\") " Nov 22 03:11:34 crc kubenswrapper[4922]: I1122 03:11:34.213301 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gc4k8\" (UniqueName: \"kubernetes.io/projected/5b0138cd-b295-44c3-9f0b-1d5de2c8d144-kube-api-access-gc4k8\") pod \"5b0138cd-b295-44c3-9f0b-1d5de2c8d144\" (UID: \"5b0138cd-b295-44c3-9f0b-1d5de2c8d144\") " Nov 22 03:11:34 crc kubenswrapper[4922]: I1122 03:11:34.213341 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b0138cd-b295-44c3-9f0b-1d5de2c8d144-config-data\") pod \"5b0138cd-b295-44c3-9f0b-1d5de2c8d144\" (UID: \"5b0138cd-b295-44c3-9f0b-1d5de2c8d144\") " Nov 22 03:11:34 crc kubenswrapper[4922]: I1122 03:11:34.213443 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b0138cd-b295-44c3-9f0b-1d5de2c8d144-combined-ca-bundle\") pod \"5b0138cd-b295-44c3-9f0b-1d5de2c8d144\" (UID: \"5b0138cd-b295-44c3-9f0b-1d5de2c8d144\") " Nov 22 03:11:34 crc kubenswrapper[4922]: I1122 03:11:34.220232 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b0138cd-b295-44c3-9f0b-1d5de2c8d144-kube-api-access-gc4k8" (OuterVolumeSpecName: "kube-api-access-gc4k8") pod "5b0138cd-b295-44c3-9f0b-1d5de2c8d144" (UID: "5b0138cd-b295-44c3-9f0b-1d5de2c8d144"). InnerVolumeSpecName "kube-api-access-gc4k8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:11:34 crc kubenswrapper[4922]: I1122 03:11:34.226026 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b0138cd-b295-44c3-9f0b-1d5de2c8d144-scripts" (OuterVolumeSpecName: "scripts") pod "5b0138cd-b295-44c3-9f0b-1d5de2c8d144" (UID: "5b0138cd-b295-44c3-9f0b-1d5de2c8d144"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:34 crc kubenswrapper[4922]: I1122 03:11:34.244218 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b0138cd-b295-44c3-9f0b-1d5de2c8d144-config-data" (OuterVolumeSpecName: "config-data") pod "5b0138cd-b295-44c3-9f0b-1d5de2c8d144" (UID: "5b0138cd-b295-44c3-9f0b-1d5de2c8d144"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:34 crc kubenswrapper[4922]: I1122 03:11:34.261045 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b0138cd-b295-44c3-9f0b-1d5de2c8d144-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5b0138cd-b295-44c3-9f0b-1d5de2c8d144" (UID: "5b0138cd-b295-44c3-9f0b-1d5de2c8d144"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:34 crc kubenswrapper[4922]: I1122 03:11:34.315607 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b0138cd-b295-44c3-9f0b-1d5de2c8d144-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:34 crc kubenswrapper[4922]: I1122 03:11:34.315638 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b0138cd-b295-44c3-9f0b-1d5de2c8d144-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:34 crc kubenswrapper[4922]: I1122 03:11:34.315649 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gc4k8\" (UniqueName: \"kubernetes.io/projected/5b0138cd-b295-44c3-9f0b-1d5de2c8d144-kube-api-access-gc4k8\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:34 crc kubenswrapper[4922]: I1122 03:11:34.315659 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b0138cd-b295-44c3-9f0b-1d5de2c8d144-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:34 crc kubenswrapper[4922]: I1122 03:11:34.745004 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-h8fkq" event={"ID":"5b0138cd-b295-44c3-9f0b-1d5de2c8d144","Type":"ContainerDied","Data":"f9cceb699a929561bdf17b9e26f55141af404694267998f7d3c190730abd656d"} Nov 22 03:11:34 crc kubenswrapper[4922]: I1122 03:11:34.745361 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9cceb699a929561bdf17b9e26f55141af404694267998f7d3c190730abd656d" Nov 22 03:11:34 crc kubenswrapper[4922]: I1122 03:11:34.745067 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-h8fkq" Nov 22 03:11:34 crc kubenswrapper[4922]: I1122 03:11:34.896418 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 22 03:11:34 crc kubenswrapper[4922]: E1122 03:11:34.896914 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b0138cd-b295-44c3-9f0b-1d5de2c8d144" containerName="nova-cell0-conductor-db-sync" Nov 22 03:11:34 crc kubenswrapper[4922]: I1122 03:11:34.896945 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b0138cd-b295-44c3-9f0b-1d5de2c8d144" containerName="nova-cell0-conductor-db-sync" Nov 22 03:11:34 crc kubenswrapper[4922]: E1122 03:11:34.896966 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36feb788-6145-4d72-b9b2-93a3557704b4" containerName="neutron-httpd" Nov 22 03:11:34 crc kubenswrapper[4922]: I1122 03:11:34.896977 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="36feb788-6145-4d72-b9b2-93a3557704b4" containerName="neutron-httpd" Nov 22 03:11:34 crc kubenswrapper[4922]: E1122 03:11:34.897039 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36feb788-6145-4d72-b9b2-93a3557704b4" containerName="neutron-api" Nov 22 03:11:34 crc kubenswrapper[4922]: I1122 03:11:34.897052 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="36feb788-6145-4d72-b9b2-93a3557704b4" containerName="neutron-api" Nov 22 03:11:34 crc kubenswrapper[4922]: I1122 03:11:34.897301 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="36feb788-6145-4d72-b9b2-93a3557704b4" containerName="neutron-api" Nov 22 03:11:34 crc kubenswrapper[4922]: I1122 03:11:34.897343 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b0138cd-b295-44c3-9f0b-1d5de2c8d144" containerName="nova-cell0-conductor-db-sync" Nov 22 03:11:34 crc kubenswrapper[4922]: I1122 03:11:34.897393 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="36feb788-6145-4d72-b9b2-93a3557704b4" containerName="neutron-httpd" Nov 22 03:11:34 crc kubenswrapper[4922]: I1122 03:11:34.903126 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 22 03:11:34 crc kubenswrapper[4922]: I1122 03:11:34.905752 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-564hd" Nov 22 03:11:34 crc kubenswrapper[4922]: I1122 03:11:34.906165 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Nov 22 03:11:34 crc kubenswrapper[4922]: I1122 03:11:34.909835 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 22 03:11:35 crc kubenswrapper[4922]: I1122 03:11:35.029600 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de63ff2e-da3d-4eb5-97a2-4cf6c0b272d3-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"de63ff2e-da3d-4eb5-97a2-4cf6c0b272d3\") " pod="openstack/nova-cell0-conductor-0" Nov 22 03:11:35 crc kubenswrapper[4922]: I1122 03:11:35.030163 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnbnj\" (UniqueName: \"kubernetes.io/projected/de63ff2e-da3d-4eb5-97a2-4cf6c0b272d3-kube-api-access-hnbnj\") pod \"nova-cell0-conductor-0\" (UID: \"de63ff2e-da3d-4eb5-97a2-4cf6c0b272d3\") " pod="openstack/nova-cell0-conductor-0" Nov 22 03:11:35 crc kubenswrapper[4922]: I1122 03:11:35.030441 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de63ff2e-da3d-4eb5-97a2-4cf6c0b272d3-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"de63ff2e-da3d-4eb5-97a2-4cf6c0b272d3\") " pod="openstack/nova-cell0-conductor-0" Nov 22 03:11:35 crc kubenswrapper[4922]: I1122 03:11:35.132394 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de63ff2e-da3d-4eb5-97a2-4cf6c0b272d3-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"de63ff2e-da3d-4eb5-97a2-4cf6c0b272d3\") " pod="openstack/nova-cell0-conductor-0" Nov 22 03:11:35 crc kubenswrapper[4922]: I1122 03:11:35.132537 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnbnj\" (UniqueName: \"kubernetes.io/projected/de63ff2e-da3d-4eb5-97a2-4cf6c0b272d3-kube-api-access-hnbnj\") pod \"nova-cell0-conductor-0\" (UID: \"de63ff2e-da3d-4eb5-97a2-4cf6c0b272d3\") " pod="openstack/nova-cell0-conductor-0" Nov 22 03:11:35 crc kubenswrapper[4922]: I1122 03:11:35.132589 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de63ff2e-da3d-4eb5-97a2-4cf6c0b272d3-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"de63ff2e-da3d-4eb5-97a2-4cf6c0b272d3\") " pod="openstack/nova-cell0-conductor-0" Nov 22 03:11:35 crc kubenswrapper[4922]: I1122 03:11:35.144754 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de63ff2e-da3d-4eb5-97a2-4cf6c0b272d3-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"de63ff2e-da3d-4eb5-97a2-4cf6c0b272d3\") " pod="openstack/nova-cell0-conductor-0" Nov 22 03:11:35 crc kubenswrapper[4922]: I1122 03:11:35.147872 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de63ff2e-da3d-4eb5-97a2-4cf6c0b272d3-config-data\") pod \"nova-cell0-conductor-0\" 
(UID: \"de63ff2e-da3d-4eb5-97a2-4cf6c0b272d3\") " pod="openstack/nova-cell0-conductor-0" Nov 22 03:11:35 crc kubenswrapper[4922]: I1122 03:11:35.163329 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnbnj\" (UniqueName: \"kubernetes.io/projected/de63ff2e-da3d-4eb5-97a2-4cf6c0b272d3-kube-api-access-hnbnj\") pod \"nova-cell0-conductor-0\" (UID: \"de63ff2e-da3d-4eb5-97a2-4cf6c0b272d3\") " pod="openstack/nova-cell0-conductor-0" Nov 22 03:11:35 crc kubenswrapper[4922]: I1122 03:11:35.238310 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 22 03:11:35 crc kubenswrapper[4922]: I1122 03:11:35.320127 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36feb788-6145-4d72-b9b2-93a3557704b4" path="/var/lib/kubelet/pods/36feb788-6145-4d72-b9b2-93a3557704b4/volumes" Nov 22 03:11:35 crc kubenswrapper[4922]: I1122 03:11:35.710781 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 22 03:11:35 crc kubenswrapper[4922]: W1122 03:11:35.718180 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde63ff2e_da3d_4eb5_97a2_4cf6c0b272d3.slice/crio-82b13b264e7a0525eef516a718215ea1849c184bf552e81b86b64c770b201928 WatchSource:0}: Error finding container 82b13b264e7a0525eef516a718215ea1849c184bf552e81b86b64c770b201928: Status 404 returned error can't find the container with id 82b13b264e7a0525eef516a718215ea1849c184bf552e81b86b64c770b201928 Nov 22 03:11:35 crc kubenswrapper[4922]: I1122 03:11:35.762829 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"de63ff2e-da3d-4eb5-97a2-4cf6c0b272d3","Type":"ContainerStarted","Data":"82b13b264e7a0525eef516a718215ea1849c184bf552e81b86b64c770b201928"} Nov 22 03:11:36 crc kubenswrapper[4922]: I1122 03:11:36.771347 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"de63ff2e-da3d-4eb5-97a2-4cf6c0b272d3","Type":"ContainerStarted","Data":"ad1f52b365a3087de9dd4cc7eac95d94106aeb21bd6fb07ba0bcaa741f46628a"} Nov 22 03:11:36 crc kubenswrapper[4922]: I1122 03:11:36.771641 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Nov 22 03:11:36 crc kubenswrapper[4922]: I1122 03:11:36.790884 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.790831541 podStartE2EDuration="2.790831541s" podCreationTimestamp="2025-11-22 03:11:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:11:36.78492287 +0000 UTC m=+1132.823444762" watchObservedRunningTime="2025-11-22 03:11:36.790831541 +0000 UTC m=+1132.829353463" Nov 22 03:11:40 crc kubenswrapper[4922]: I1122 03:11:40.297678 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Nov 22 03:11:40 crc kubenswrapper[4922]: I1122 03:11:40.825198 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-pcvkr"] Nov 22 03:11:40 crc kubenswrapper[4922]: I1122 03:11:40.826577 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pcvkr" Nov 22 03:11:40 crc kubenswrapper[4922]: I1122 03:11:40.830894 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Nov 22 03:11:40 crc kubenswrapper[4922]: I1122 03:11:40.842473 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Nov 22 03:11:40 crc kubenswrapper[4922]: I1122 03:11:40.850640 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-pcvkr"] Nov 22 03:11:40 crc kubenswrapper[4922]: I1122 03:11:40.959804 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rvx2\" (UniqueName: \"kubernetes.io/projected/29a6e29f-a80d-4f7d-8d32-c5a18f26da69-kube-api-access-8rvx2\") pod \"nova-cell0-cell-mapping-pcvkr\" (UID: \"29a6e29f-a80d-4f7d-8d32-c5a18f26da69\") " pod="openstack/nova-cell0-cell-mapping-pcvkr" Nov 22 03:11:40 crc kubenswrapper[4922]: I1122 03:11:40.960107 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29a6e29f-a80d-4f7d-8d32-c5a18f26da69-config-data\") pod \"nova-cell0-cell-mapping-pcvkr\" (UID: \"29a6e29f-a80d-4f7d-8d32-c5a18f26da69\") " pod="openstack/nova-cell0-cell-mapping-pcvkr" Nov 22 03:11:40 crc kubenswrapper[4922]: I1122 03:11:40.960271 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29a6e29f-a80d-4f7d-8d32-c5a18f26da69-scripts\") pod \"nova-cell0-cell-mapping-pcvkr\" (UID: \"29a6e29f-a80d-4f7d-8d32-c5a18f26da69\") " pod="openstack/nova-cell0-cell-mapping-pcvkr" Nov 22 03:11:40 crc kubenswrapper[4922]: I1122 03:11:40.960355 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29a6e29f-a80d-4f7d-8d32-c5a18f26da69-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-pcvkr\" (UID: \"29a6e29f-a80d-4f7d-8d32-c5a18f26da69\") " pod="openstack/nova-cell0-cell-mapping-pcvkr" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.041134 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.042185 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 22 03:11:41 crc kubenswrapper[4922]: W1122 03:11:41.044265 4922 reflector.go:561] object-"openstack"/"nova-scheduler-config-data": failed to list *v1.Secret: secrets "nova-scheduler-config-data" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Nov 22 03:11:41 crc kubenswrapper[4922]: E1122 03:11:41.044300 4922 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"nova-scheduler-config-data\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"nova-scheduler-config-data\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.062307 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29a6e29f-a80d-4f7d-8d32-c5a18f26da69-config-data\") pod \"nova-cell0-cell-mapping-pcvkr\" (UID: \"29a6e29f-a80d-4f7d-8d32-c5a18f26da69\") " pod="openstack/nova-cell0-cell-mapping-pcvkr" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.062363 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29a6e29f-a80d-4f7d-8d32-c5a18f26da69-scripts\") pod \"nova-cell0-cell-mapping-pcvkr\" (UID: \"29a6e29f-a80d-4f7d-8d32-c5a18f26da69\") " pod="openstack/nova-cell0-cell-mapping-pcvkr" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.062386 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29a6e29f-a80d-4f7d-8d32-c5a18f26da69-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-pcvkr\" (UID: \"29a6e29f-a80d-4f7d-8d32-c5a18f26da69\") " pod="openstack/nova-cell0-cell-mapping-pcvkr" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.062451 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rvx2\" (UniqueName: \"kubernetes.io/projected/29a6e29f-a80d-4f7d-8d32-c5a18f26da69-kube-api-access-8rvx2\") pod \"nova-cell0-cell-mapping-pcvkr\" (UID: \"29a6e29f-a80d-4f7d-8d32-c5a18f26da69\") " pod="openstack/nova-cell0-cell-mapping-pcvkr" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.070058 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.071165 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29a6e29f-a80d-4f7d-8d32-c5a18f26da69-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-pcvkr\" (UID: \"29a6e29f-a80d-4f7d-8d32-c5a18f26da69\") " pod="openstack/nova-cell0-cell-mapping-pcvkr" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.077550 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29a6e29f-a80d-4f7d-8d32-c5a18f26da69-config-data\") pod \"nova-cell0-cell-mapping-pcvkr\" (UID: \"29a6e29f-a80d-4f7d-8d32-c5a18f26da69\") " pod="openstack/nova-cell0-cell-mapping-pcvkr" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.082081 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/29a6e29f-a80d-4f7d-8d32-c5a18f26da69-scripts\") pod \"nova-cell0-cell-mapping-pcvkr\" (UID: \"29a6e29f-a80d-4f7d-8d32-c5a18f26da69\") " pod="openstack/nova-cell0-cell-mapping-pcvkr" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.091781 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rvx2\" (UniqueName: \"kubernetes.io/projected/29a6e29f-a80d-4f7d-8d32-c5a18f26da69-kube-api-access-8rvx2\") pod \"nova-cell0-cell-mapping-pcvkr\" (UID: \"29a6e29f-a80d-4f7d-8d32-c5a18f26da69\") " pod="openstack/nova-cell0-cell-mapping-pcvkr" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.109478 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.109520 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.109559 4922 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.110185 4922 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a719d1a6509057cf085f1c2768ec28cb47fdbdd817caffc2f3d5d452e6b5e16a"} pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.110233 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" containerID="cri-o://a719d1a6509057cf085f1c2768ec28cb47fdbdd817caffc2f3d5d452e6b5e16a" gracePeriod=600 Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.145572 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.147048 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.149751 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.162365 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pcvkr" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.163427 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jl4m\" (UniqueName: \"kubernetes.io/projected/b62f1703-3090-40d6-86e2-3afef486e933-kube-api-access-2jl4m\") pod \"nova-scheduler-0\" (UID: \"b62f1703-3090-40d6-86e2-3afef486e933\") " pod="openstack/nova-scheduler-0" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.163579 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b62f1703-3090-40d6-86e2-3afef486e933-config-data\") pod \"nova-scheduler-0\" (UID: \"b62f1703-3090-40d6-86e2-3afef486e933\") " pod="openstack/nova-scheduler-0" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.163610 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b62f1703-3090-40d6-86e2-3afef486e933-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b62f1703-3090-40d6-86e2-3afef486e933\") " pod="openstack/nova-scheduler-0" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.163676 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.165787 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.187712 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.201025 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.218088 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.240672 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.254296 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.257484 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.266410 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b62f1703-3090-40d6-86e2-3afef486e933-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b62f1703-3090-40d6-86e2-3afef486e933\") " pod="openstack/nova-scheduler-0" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.266454 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lspg9\" (UniqueName: \"kubernetes.io/projected/2b2ab3b2-f030-4140-9b7c-94004fd07915-kube-api-access-lspg9\") pod \"nova-metadata-0\" (UID: \"2b2ab3b2-f030-4140-9b7c-94004fd07915\") " pod="openstack/nova-metadata-0" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.266480 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b2ab3b2-f030-4140-9b7c-94004fd07915-logs\") pod \"nova-metadata-0\" (UID: \"2b2ab3b2-f030-4140-9b7c-94004fd07915\") " pod="openstack/nova-metadata-0" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.266509 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jl4m\" (UniqueName: \"kubernetes.io/projected/b62f1703-3090-40d6-86e2-3afef486e933-kube-api-access-2jl4m\") pod \"nova-scheduler-0\" (UID: \"b62f1703-3090-40d6-86e2-3afef486e933\") " pod="openstack/nova-scheduler-0" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.266557 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64048346-bcee-40da-af93-b7fbb844c1f9-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"64048346-bcee-40da-af93-b7fbb844c1f9\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.266580 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b2ab3b2-f030-4140-9b7c-94004fd07915-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2b2ab3b2-f030-4140-9b7c-94004fd07915\") " pod="openstack/nova-metadata-0" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.266609 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64048346-bcee-40da-af93-b7fbb844c1f9-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"64048346-bcee-40da-af93-b7fbb844c1f9\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.266625 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b2ab3b2-f030-4140-9b7c-94004fd07915-config-data\") pod \"nova-metadata-0\" (UID: \"2b2ab3b2-f030-4140-9b7c-94004fd07915\") " pod="openstack/nova-metadata-0" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.266666 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf5lq\" (UniqueName: 
\"kubernetes.io/projected/64048346-bcee-40da-af93-b7fbb844c1f9-kube-api-access-nf5lq\") pod \"nova-cell1-novncproxy-0\" (UID: \"64048346-bcee-40da-af93-b7fbb844c1f9\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.266708 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b62f1703-3090-40d6-86e2-3afef486e933-config-data\") pod \"nova-scheduler-0\" (UID: \"b62f1703-3090-40d6-86e2-3afef486e933\") " pod="openstack/nova-scheduler-0" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.268576 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.281347 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b62f1703-3090-40d6-86e2-3afef486e933-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b62f1703-3090-40d6-86e2-3afef486e933\") " pod="openstack/nova-scheduler-0" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.324459 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jl4m\" (UniqueName: \"kubernetes.io/projected/b62f1703-3090-40d6-86e2-3afef486e933-kube-api-access-2jl4m\") pod \"nova-scheduler-0\" (UID: \"b62f1703-3090-40d6-86e2-3afef486e933\") " pod="openstack/nova-scheduler-0" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.367706 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nf5lq\" (UniqueName: \"kubernetes.io/projected/64048346-bcee-40da-af93-b7fbb844c1f9-kube-api-access-nf5lq\") pod \"nova-cell1-novncproxy-0\" (UID: \"64048346-bcee-40da-af93-b7fbb844c1f9\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.367807 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lspg9\" (UniqueName: \"kubernetes.io/projected/2b2ab3b2-f030-4140-9b7c-94004fd07915-kube-api-access-lspg9\") pod \"nova-metadata-0\" (UID: \"2b2ab3b2-f030-4140-9b7c-94004fd07915\") " pod="openstack/nova-metadata-0" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.367826 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b2ab3b2-f030-4140-9b7c-94004fd07915-logs\") pod \"nova-metadata-0\" (UID: \"2b2ab3b2-f030-4140-9b7c-94004fd07915\") " pod="openstack/nova-metadata-0" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.367877 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3-logs\") pod \"nova-api-0\" (UID: \"c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3\") " pod="openstack/nova-api-0" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.367898 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3\") " pod="openstack/nova-api-0" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.367915 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64048346-bcee-40da-af93-b7fbb844c1f9-config-data\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"64048346-bcee-40da-af93-b7fbb844c1f9\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.367929 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3-config-data\") pod \"nova-api-0\" (UID: \"c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3\") " pod="openstack/nova-api-0" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.367955 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b2ab3b2-f030-4140-9b7c-94004fd07915-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2b2ab3b2-f030-4140-9b7c-94004fd07915\") " pod="openstack/nova-metadata-0" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.367985 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64048346-bcee-40da-af93-b7fbb844c1f9-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"64048346-bcee-40da-af93-b7fbb844c1f9\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.368001 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b2ab3b2-f030-4140-9b7c-94004fd07915-config-data\") pod \"nova-metadata-0\" (UID: \"2b2ab3b2-f030-4140-9b7c-94004fd07915\") " pod="openstack/nova-metadata-0" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.368018 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6rgx\" (UniqueName: \"kubernetes.io/projected/c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3-kube-api-access-r6rgx\") pod \"nova-api-0\" (UID: \"c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3\") " pod="openstack/nova-api-0" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.369004 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b2ab3b2-f030-4140-9b7c-94004fd07915-logs\") pod \"nova-metadata-0\" (UID: \"2b2ab3b2-f030-4140-9b7c-94004fd07915\") " pod="openstack/nova-metadata-0" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.369261 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-gzqf2"] Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.370700 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-gzqf2" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.384319 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b2ab3b2-f030-4140-9b7c-94004fd07915-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2b2ab3b2-f030-4140-9b7c-94004fd07915\") " pod="openstack/nova-metadata-0" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.386699 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64048346-bcee-40da-af93-b7fbb844c1f9-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"64048346-bcee-40da-af93-b7fbb844c1f9\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.389425 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lspg9\" (UniqueName: \"kubernetes.io/projected/2b2ab3b2-f030-4140-9b7c-94004fd07915-kube-api-access-lspg9\") pod \"nova-metadata-0\" (UID: \"2b2ab3b2-f030-4140-9b7c-94004fd07915\") " pod="openstack/nova-metadata-0" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.391435 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64048346-bcee-40da-af93-b7fbb844c1f9-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"64048346-bcee-40da-af93-b7fbb844c1f9\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.392115 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b2ab3b2-f030-4140-9b7c-94004fd07915-config-data\") pod \"nova-metadata-0\" (UID: \"2b2ab3b2-f030-4140-9b7c-94004fd07915\") " pod="openstack/nova-metadata-0" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.398754 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf5lq\" (UniqueName: \"kubernetes.io/projected/64048346-bcee-40da-af93-b7fbb844c1f9-kube-api-access-nf5lq\") pod \"nova-cell1-novncproxy-0\" (UID: \"64048346-bcee-40da-af93-b7fbb844c1f9\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.411234 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-gzqf2"] Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.469642 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3-logs\") pod \"nova-api-0\" (UID: \"c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3\") " pod="openstack/nova-api-0" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.469686 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3\") " pod="openstack/nova-api-0" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.469706 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3-config-data\") pod \"nova-api-0\" (UID: \"c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3\") " pod="openstack/nova-api-0" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.469734 4922 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g74k6\" (UniqueName: \"kubernetes.io/projected/1a37b1ca-22a8-46f0-93a8-d41ac6fa407e-kube-api-access-g74k6\") pod \"dnsmasq-dns-566b5b7845-gzqf2\" (UID: \"1a37b1ca-22a8-46f0-93a8-d41ac6fa407e\") " pod="openstack/dnsmasq-dns-566b5b7845-gzqf2" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.469759 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a37b1ca-22a8-46f0-93a8-d41ac6fa407e-config\") pod \"dnsmasq-dns-566b5b7845-gzqf2\" (UID: \"1a37b1ca-22a8-46f0-93a8-d41ac6fa407e\") " pod="openstack/dnsmasq-dns-566b5b7845-gzqf2" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.469803 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1a37b1ca-22a8-46f0-93a8-d41ac6fa407e-ovsdbserver-sb\") pod \"dnsmasq-dns-566b5b7845-gzqf2\" (UID: \"1a37b1ca-22a8-46f0-93a8-d41ac6fa407e\") " pod="openstack/dnsmasq-dns-566b5b7845-gzqf2" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.469821 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6rgx\" (UniqueName: \"kubernetes.io/projected/c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3-kube-api-access-r6rgx\") pod \"nova-api-0\" (UID: \"c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3\") " pod="openstack/nova-api-0" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.469923 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a37b1ca-22a8-46f0-93a8-d41ac6fa407e-dns-svc\") pod \"dnsmasq-dns-566b5b7845-gzqf2\" (UID: \"1a37b1ca-22a8-46f0-93a8-d41ac6fa407e\") " pod="openstack/dnsmasq-dns-566b5b7845-gzqf2" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.469953 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1a37b1ca-22a8-46f0-93a8-d41ac6fa407e-ovsdbserver-nb\") pod \"dnsmasq-dns-566b5b7845-gzqf2\" (UID: \"1a37b1ca-22a8-46f0-93a8-d41ac6fa407e\") " pod="openstack/dnsmasq-dns-566b5b7845-gzqf2" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.471027 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3-logs\") pod \"nova-api-0\" (UID: \"c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3\") " pod="openstack/nova-api-0" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.478080 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3-config-data\") pod \"nova-api-0\" (UID: \"c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3\") " pod="openstack/nova-api-0" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.478555 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3\") " pod="openstack/nova-api-0" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.491393 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6rgx\" (UniqueName: 
\"kubernetes.io/projected/c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3-kube-api-access-r6rgx\") pod \"nova-api-0\" (UID: \"c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3\") " pod="openstack/nova-api-0" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.504956 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.516449 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.571127 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a37b1ca-22a8-46f0-93a8-d41ac6fa407e-dns-svc\") pod \"dnsmasq-dns-566b5b7845-gzqf2\" (UID: \"1a37b1ca-22a8-46f0-93a8-d41ac6fa407e\") " pod="openstack/dnsmasq-dns-566b5b7845-gzqf2" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.571515 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1a37b1ca-22a8-46f0-93a8-d41ac6fa407e-ovsdbserver-nb\") pod \"dnsmasq-dns-566b5b7845-gzqf2\" (UID: \"1a37b1ca-22a8-46f0-93a8-d41ac6fa407e\") " pod="openstack/dnsmasq-dns-566b5b7845-gzqf2" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.571991 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g74k6\" (UniqueName: \"kubernetes.io/projected/1a37b1ca-22a8-46f0-93a8-d41ac6fa407e-kube-api-access-g74k6\") pod \"dnsmasq-dns-566b5b7845-gzqf2\" (UID: \"1a37b1ca-22a8-46f0-93a8-d41ac6fa407e\") " pod="openstack/dnsmasq-dns-566b5b7845-gzqf2" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.572006 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a37b1ca-22a8-46f0-93a8-d41ac6fa407e-dns-svc\") pod \"dnsmasq-dns-566b5b7845-gzqf2\" (UID: \"1a37b1ca-22a8-46f0-93a8-d41ac6fa407e\") " pod="openstack/dnsmasq-dns-566b5b7845-gzqf2" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.572022 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a37b1ca-22a8-46f0-93a8-d41ac6fa407e-config\") pod \"dnsmasq-dns-566b5b7845-gzqf2\" (UID: \"1a37b1ca-22a8-46f0-93a8-d41ac6fa407e\") " pod="openstack/dnsmasq-dns-566b5b7845-gzqf2" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.572133 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1a37b1ca-22a8-46f0-93a8-d41ac6fa407e-ovsdbserver-sb\") pod \"dnsmasq-dns-566b5b7845-gzqf2\" (UID: \"1a37b1ca-22a8-46f0-93a8-d41ac6fa407e\") " pod="openstack/dnsmasq-dns-566b5b7845-gzqf2" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.572704 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a37b1ca-22a8-46f0-93a8-d41ac6fa407e-config\") pod \"dnsmasq-dns-566b5b7845-gzqf2\" (UID: \"1a37b1ca-22a8-46f0-93a8-d41ac6fa407e\") " pod="openstack/dnsmasq-dns-566b5b7845-gzqf2" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.572853 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1a37b1ca-22a8-46f0-93a8-d41ac6fa407e-ovsdbserver-sb\") pod \"dnsmasq-dns-566b5b7845-gzqf2\" (UID: \"1a37b1ca-22a8-46f0-93a8-d41ac6fa407e\") " 
pod="openstack/dnsmasq-dns-566b5b7845-gzqf2" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.573289 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1a37b1ca-22a8-46f0-93a8-d41ac6fa407e-ovsdbserver-nb\") pod \"dnsmasq-dns-566b5b7845-gzqf2\" (UID: \"1a37b1ca-22a8-46f0-93a8-d41ac6fa407e\") " pod="openstack/dnsmasq-dns-566b5b7845-gzqf2" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.595743 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g74k6\" (UniqueName: \"kubernetes.io/projected/1a37b1ca-22a8-46f0-93a8-d41ac6fa407e-kube-api-access-g74k6\") pod \"dnsmasq-dns-566b5b7845-gzqf2\" (UID: \"1a37b1ca-22a8-46f0-93a8-d41ac6fa407e\") " pod="openstack/dnsmasq-dns-566b5b7845-gzqf2" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.660092 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.715245 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-gzqf2" Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.757068 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-pcvkr"] Nov 22 03:11:41 crc kubenswrapper[4922]: W1122 03:11:41.774736 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29a6e29f_a80d_4f7d_8d32_c5a18f26da69.slice/crio-2d94e83987c3fb50e88f6f7ad9a9ab03aaddb9fc16c6af4d120518ce80374a03 WatchSource:0}: Error finding container 2d94e83987c3fb50e88f6f7ad9a9ab03aaddb9fc16c6af4d120518ce80374a03: Status 404 returned error can't find the container with id 2d94e83987c3fb50e88f6f7ad9a9ab03aaddb9fc16c6af4d120518ce80374a03 Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.889827 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pcvkr" event={"ID":"29a6e29f-a80d-4f7d-8d32-c5a18f26da69","Type":"ContainerStarted","Data":"2d94e83987c3fb50e88f6f7ad9a9ab03aaddb9fc16c6af4d120518ce80374a03"} Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.894804 4922 generic.go:334] "Generic (PLEG): container finished" podID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerID="a719d1a6509057cf085f1c2768ec28cb47fdbdd817caffc2f3d5d452e6b5e16a" exitCode=0 Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.894849 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" event={"ID":"402683b1-a29f-4a79-a36c-daf6e8068d0d","Type":"ContainerDied","Data":"a719d1a6509057cf085f1c2768ec28cb47fdbdd817caffc2f3d5d452e6b5e16a"} Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.894893 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" event={"ID":"402683b1-a29f-4a79-a36c-daf6e8068d0d","Type":"ContainerStarted","Data":"40d76ae6e06f9784b7400ca69b8cc35e187346fc7004ab985de1150eb16fab3d"} Nov 22 03:11:41 crc kubenswrapper[4922]: I1122 03:11:41.894912 4922 scope.go:117] "RemoveContainer" containerID="e544a691b867a653faf01710d7aaa2e43b953be45a75dda707c262af6da21812" Nov 22 03:11:42 crc kubenswrapper[4922]: I1122 03:11:42.054138 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-q9xlg"] Nov 22 03:11:42 crc kubenswrapper[4922]: I1122 03:11:42.055295 4922 util.go:30] "No 
Nov 22 03:11:42 crc kubenswrapper[4922]: I1122 03:11:42.059410 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Nov 22 03:11:42 crc kubenswrapper[4922]: I1122 03:11:42.059523 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Nov 22 03:11:42 crc kubenswrapper[4922]: I1122 03:11:42.061314 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-q9xlg"] Nov 22 03:11:42 crc kubenswrapper[4922]: I1122 03:11:42.158285 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 03:11:42 crc kubenswrapper[4922]: I1122 03:11:42.170581 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 22 03:11:42 crc kubenswrapper[4922]: I1122 03:11:42.191968 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn9hq\" (UniqueName: \"kubernetes.io/projected/255e57e6-0d59-4288-9070-373e6ce3d77c-kube-api-access-gn9hq\") pod \"nova-cell1-conductor-db-sync-q9xlg\" (UID: \"255e57e6-0d59-4288-9070-373e6ce3d77c\") " pod="openstack/nova-cell1-conductor-db-sync-q9xlg" Nov 22 03:11:42 crc kubenswrapper[4922]: I1122 03:11:42.192024 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/255e57e6-0d59-4288-9070-373e6ce3d77c-config-data\") pod \"nova-cell1-conductor-db-sync-q9xlg\" (UID: \"255e57e6-0d59-4288-9070-373e6ce3d77c\") " pod="openstack/nova-cell1-conductor-db-sync-q9xlg" Nov 22 03:11:42 crc kubenswrapper[4922]: I1122 03:11:42.192045 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/255e57e6-0d59-4288-9070-373e6ce3d77c-scripts\") pod \"nova-cell1-conductor-db-sync-q9xlg\" (UID: \"255e57e6-0d59-4288-9070-373e6ce3d77c\") " pod="openstack/nova-cell1-conductor-db-sync-q9xlg" Nov 22 03:11:42 crc kubenswrapper[4922]: I1122 03:11:42.192321 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/255e57e6-0d59-4288-9070-373e6ce3d77c-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-q9xlg\" (UID: \"255e57e6-0d59-4288-9070-373e6ce3d77c\") " pod="openstack/nova-cell1-conductor-db-sync-q9xlg" Nov 22 03:11:42 crc kubenswrapper[4922]: E1122 03:11:42.267146 4922 secret.go:188] Couldn't get secret openstack/nova-scheduler-config-data: failed to sync secret cache: timed out waiting for the condition Nov 22 03:11:42 crc kubenswrapper[4922]: E1122 03:11:42.267241 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b62f1703-3090-40d6-86e2-3afef486e933-config-data podName:b62f1703-3090-40d6-86e2-3afef486e933 nodeName:}" failed. No retries permitted until 2025-11-22 03:11:42.767220706 +0000 UTC m=+1138.805742598 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/b62f1703-3090-40d6-86e2-3afef486e933-config-data") pod "nova-scheduler-0" (UID: "b62f1703-3090-40d6-86e2-3afef486e933") : failed to sync secret cache: timed out waiting for the condition
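
The two E-level entries just above show why a mount can fail transiently right after a pod is scheduled: the informer cache for the nova-scheduler-config-data secret had not synced yet, so MountVolume.SetUp timed out and nestedpendingoperations scheduled a retry 500ms later. The "Caches populated" and "MountVolume.SetUp succeeded" entries a few lines down show that retry succeeding. Here is a sketch of that retry shape; the 500ms initial delay comes from the log itself, while the doubling and the cap paraphrase kubelet's exponential-backoff behavior for volume operations rather than copying its source:

```go
package main

import (
	"errors"
	"fmt"
	"time"
)

// retryWithBackoff retries op, doubling the delay after each failure up to a
// cap, mirroring the "No retries permitted until ... (durationBeforeRetry
// 500ms)" message above. Hypothetical helper, not kubelet's implementation.
func retryWithBackoff(op func() error, initial, max time.Duration) error {
	delay := initial
	for {
		err := op()
		if err == nil {
			return nil
		}
		if delay > max {
			return fmt.Errorf("giving up: %w", err)
		}
		fmt.Printf("operation failed. No retries permitted until %s (durationBeforeRetry %s)\n",
			time.Now().Add(delay).Format(time.RFC3339Nano), delay)
		time.Sleep(delay)
		delay *= 2
	}
}

func main() {
	attempts := 0
	_ = retryWithBackoff(func() error {
		attempts++
		if attempts < 3 { // the secret cache syncs while we wait, as in the log
			return errors.New("failed to sync secret cache: timed out waiting for the condition")
		}
		return nil
	}, 500*time.Millisecond, 2*time.Minute)
}
```
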
Nov 22 03:11:42 crc kubenswrapper[4922]: I1122 03:11:42.296471 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 22 03:11:42 crc kubenswrapper[4922]: I1122 03:11:42.313733 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/255e57e6-0d59-4288-9070-373e6ce3d77c-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-q9xlg\" (UID: \"255e57e6-0d59-4288-9070-373e6ce3d77c\") " pod="openstack/nova-cell1-conductor-db-sync-q9xlg" Nov 22 03:11:42 crc kubenswrapper[4922]: I1122 03:11:42.313869 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gn9hq\" (UniqueName: \"kubernetes.io/projected/255e57e6-0d59-4288-9070-373e6ce3d77c-kube-api-access-gn9hq\") pod \"nova-cell1-conductor-db-sync-q9xlg\" (UID: \"255e57e6-0d59-4288-9070-373e6ce3d77c\") " pod="openstack/nova-cell1-conductor-db-sync-q9xlg" Nov 22 03:11:42 crc kubenswrapper[4922]: I1122 03:11:42.313901 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/255e57e6-0d59-4288-9070-373e6ce3d77c-config-data\") pod \"nova-cell1-conductor-db-sync-q9xlg\" (UID: \"255e57e6-0d59-4288-9070-373e6ce3d77c\") " pod="openstack/nova-cell1-conductor-db-sync-q9xlg" Nov 22 03:11:42 crc kubenswrapper[4922]: I1122 03:11:42.313918 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/255e57e6-0d59-4288-9070-373e6ce3d77c-scripts\") pod \"nova-cell1-conductor-db-sync-q9xlg\" (UID: \"255e57e6-0d59-4288-9070-373e6ce3d77c\") " pod="openstack/nova-cell1-conductor-db-sync-q9xlg" Nov 22 03:11:42 crc kubenswrapper[4922]: I1122 03:11:42.316783 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 22 03:11:42 crc kubenswrapper[4922]: I1122 03:11:42.321689 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/255e57e6-0d59-4288-9070-373e6ce3d77c-config-data\") pod \"nova-cell1-conductor-db-sync-q9xlg\" (UID: \"255e57e6-0d59-4288-9070-373e6ce3d77c\") " pod="openstack/nova-cell1-conductor-db-sync-q9xlg" Nov 22 03:11:42 crc kubenswrapper[4922]: I1122 03:11:42.321715 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/255e57e6-0d59-4288-9070-373e6ce3d77c-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-q9xlg\" (UID: \"255e57e6-0d59-4288-9070-373e6ce3d77c\") " pod="openstack/nova-cell1-conductor-db-sync-q9xlg" Nov 22 03:11:42 crc kubenswrapper[4922]: I1122 03:11:42.322010 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/255e57e6-0d59-4288-9070-373e6ce3d77c-scripts\") pod \"nova-cell1-conductor-db-sync-q9xlg\" (UID: \"255e57e6-0d59-4288-9070-373e6ce3d77c\") " pod="openstack/nova-cell1-conductor-db-sync-q9xlg" Nov 22 03:11:42 crc kubenswrapper[4922]: I1122 03:11:42.345688 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gn9hq\" (UniqueName:
\"kubernetes.io/projected/255e57e6-0d59-4288-9070-373e6ce3d77c-kube-api-access-gn9hq\") pod \"nova-cell1-conductor-db-sync-q9xlg\" (UID: \"255e57e6-0d59-4288-9070-373e6ce3d77c\") " pod="openstack/nova-cell1-conductor-db-sync-q9xlg" Nov 22 03:11:42 crc kubenswrapper[4922]: I1122 03:11:42.373300 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-q9xlg" Nov 22 03:11:42 crc kubenswrapper[4922]: I1122 03:11:42.439320 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-gzqf2"] Nov 22 03:11:42 crc kubenswrapper[4922]: I1122 03:11:42.631621 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-q9xlg"] Nov 22 03:11:42 crc kubenswrapper[4922]: W1122 03:11:42.637009 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod255e57e6_0d59_4288_9070_373e6ce3d77c.slice/crio-f96937392a675156ff1816241ca0ad89c3667c2b434cb27996d53a13a56f6630 WatchSource:0}: Error finding container f96937392a675156ff1816241ca0ad89c3667c2b434cb27996d53a13a56f6630: Status 404 returned error can't find the container with id f96937392a675156ff1816241ca0ad89c3667c2b434cb27996d53a13a56f6630 Nov 22 03:11:42 crc kubenswrapper[4922]: I1122 03:11:42.826935 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b62f1703-3090-40d6-86e2-3afef486e933-config-data\") pod \"nova-scheduler-0\" (UID: \"b62f1703-3090-40d6-86e2-3afef486e933\") " pod="openstack/nova-scheduler-0" Nov 22 03:11:42 crc kubenswrapper[4922]: I1122 03:11:42.832612 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b62f1703-3090-40d6-86e2-3afef486e933-config-data\") pod \"nova-scheduler-0\" (UID: \"b62f1703-3090-40d6-86e2-3afef486e933\") " pod="openstack/nova-scheduler-0" Nov 22 03:11:42 crc kubenswrapper[4922]: I1122 03:11:42.856052 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 22 03:11:42 crc kubenswrapper[4922]: I1122 03:11:42.907456 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-q9xlg" event={"ID":"255e57e6-0d59-4288-9070-373e6ce3d77c","Type":"ContainerStarted","Data":"a51596700e921c5bf64f9df1f6bec6eaf8aa3514a2ec071a771ec3ee7801dd91"} Nov 22 03:11:42 crc kubenswrapper[4922]: I1122 03:11:42.907501 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-q9xlg" event={"ID":"255e57e6-0d59-4288-9070-373e6ce3d77c","Type":"ContainerStarted","Data":"f96937392a675156ff1816241ca0ad89c3667c2b434cb27996d53a13a56f6630"} Nov 22 03:11:42 crc kubenswrapper[4922]: I1122 03:11:42.911769 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3","Type":"ContainerStarted","Data":"613ceb0774fe10d43cdb1db486269d4b0f2d1b2b0d125c9023f8ccac07985142"} Nov 22 03:11:42 crc kubenswrapper[4922]: I1122 03:11:42.916448 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2b2ab3b2-f030-4140-9b7c-94004fd07915","Type":"ContainerStarted","Data":"9dc5c64ac7eaea9639a1bce21460b9403557b9a4f9a1ea38f5f17bcf4bf078c7"} Nov 22 03:11:42 crc kubenswrapper[4922]: I1122 03:11:42.917939 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pcvkr" event={"ID":"29a6e29f-a80d-4f7d-8d32-c5a18f26da69","Type":"ContainerStarted","Data":"59f0c8464dddcc2b9c2e1d38c8cb1d2d0a5a31fff9e78685924efb86600644f7"} Nov 22 03:11:42 crc kubenswrapper[4922]: I1122 03:11:42.919947 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"64048346-bcee-40da-af93-b7fbb844c1f9","Type":"ContainerStarted","Data":"5d36520fab4067c9e11201540da4557b5f32b4d8534e8ca458c73353883f40dd"} Nov 22 03:11:42 crc kubenswrapper[4922]: I1122 03:11:42.929302 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-q9xlg" podStartSLOduration=0.929289456 podStartE2EDuration="929.289456ms" podCreationTimestamp="2025-11-22 03:11:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:11:42.927864571 +0000 UTC m=+1138.966386473" watchObservedRunningTime="2025-11-22 03:11:42.929289456 +0000 UTC m=+1138.967811348" Nov 22 03:11:42 crc kubenswrapper[4922]: I1122 03:11:42.964976 4922 generic.go:334] "Generic (PLEG): container finished" podID="1a37b1ca-22a8-46f0-93a8-d41ac6fa407e" containerID="8a11c92c72f011f0b87b1f41c4fc0313bbdf529cd1fc438ed6cea3c7c37f5fce" exitCode=0 Nov 22 03:11:42 crc kubenswrapper[4922]: I1122 03:11:42.965078 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-gzqf2" event={"ID":"1a37b1ca-22a8-46f0-93a8-d41ac6fa407e","Type":"ContainerDied","Data":"8a11c92c72f011f0b87b1f41c4fc0313bbdf529cd1fc438ed6cea3c7c37f5fce"} Nov 22 03:11:42 crc kubenswrapper[4922]: I1122 03:11:42.965126 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-gzqf2" event={"ID":"1a37b1ca-22a8-46f0-93a8-d41ac6fa407e","Type":"ContainerStarted","Data":"4e821fe3f0f765edc8677799770547e537cccf511d02423653e9e4fb0bacbe55"} Nov 22 03:11:42 crc kubenswrapper[4922]: I1122 03:11:42.969243 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-cell0-cell-mapping-pcvkr" podStartSLOduration=2.969225673 podStartE2EDuration="2.969225673s" podCreationTimestamp="2025-11-22 03:11:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:11:42.95155883 +0000 UTC m=+1138.990080712" watchObservedRunningTime="2025-11-22 03:11:42.969225673 +0000 UTC m=+1139.007747565" Nov 22 03:11:43 crc kubenswrapper[4922]: I1122 03:11:43.326187 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 03:11:43 crc kubenswrapper[4922]: I1122 03:11:43.991752 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-gzqf2" event={"ID":"1a37b1ca-22a8-46f0-93a8-d41ac6fa407e","Type":"ContainerStarted","Data":"5e3fe0173c097deab6955e184404673175c9385dacd5472b97e131f0d2081f0b"} Nov 22 03:11:43 crc kubenswrapper[4922]: I1122 03:11:43.992155 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-566b5b7845-gzqf2" Nov 22 03:11:43 crc kubenswrapper[4922]: I1122 03:11:43.994214 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b62f1703-3090-40d6-86e2-3afef486e933","Type":"ContainerStarted","Data":"b682c2f4e87a6945fdd76402a88eacdf79d3b40979a4e1128cb94eeb3a59e683"} Nov 22 03:11:44 crc kubenswrapper[4922]: I1122 03:11:44.019041 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-566b5b7845-gzqf2" podStartSLOduration=3.018991588 podStartE2EDuration="3.018991588s" podCreationTimestamp="2025-11-22 03:11:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:11:44.010264579 +0000 UTC m=+1140.048786471" watchObservedRunningTime="2025-11-22 03:11:44.018991588 +0000 UTC m=+1140.057513480" Nov 22 03:11:44 crc kubenswrapper[4922]: I1122 03:11:44.775136 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 22 03:11:44 crc kubenswrapper[4922]: I1122 03:11:44.786129 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 03:11:48 crc kubenswrapper[4922]: I1122 03:11:48.035386 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2b2ab3b2-f030-4140-9b7c-94004fd07915","Type":"ContainerStarted","Data":"7b96846caf77ceedd027c1924a3214103e87993cce202f703353dbae00e03c54"} Nov 22 03:11:48 crc kubenswrapper[4922]: I1122 03:11:48.036003 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2b2ab3b2-f030-4140-9b7c-94004fd07915","Type":"ContainerStarted","Data":"644cb7fabd42a18aeb29fab106869a09e692d070a38ff364d3bf06ea5cd50de1"} Nov 22 03:11:48 crc kubenswrapper[4922]: I1122 03:11:48.035775 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2b2ab3b2-f030-4140-9b7c-94004fd07915" containerName="nova-metadata-metadata" containerID="cri-o://7b96846caf77ceedd027c1924a3214103e87993cce202f703353dbae00e03c54" gracePeriod=30 Nov 22 03:11:48 crc kubenswrapper[4922]: I1122 03:11:48.036011 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2b2ab3b2-f030-4140-9b7c-94004fd07915" containerName="nova-metadata-log" 
containerID="cri-o://644cb7fabd42a18aeb29fab106869a09e692d070a38ff364d3bf06ea5cd50de1" gracePeriod=30 Nov 22 03:11:48 crc kubenswrapper[4922]: I1122 03:11:48.037901 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"64048346-bcee-40da-af93-b7fbb844c1f9","Type":"ContainerStarted","Data":"2407c19e18d8987997a9fb965adc8f6148e8e8b87da71df2007cc8bea77bdef9"} Nov 22 03:11:48 crc kubenswrapper[4922]: I1122 03:11:48.037991 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="64048346-bcee-40da-af93-b7fbb844c1f9" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://2407c19e18d8987997a9fb965adc8f6148e8e8b87da71df2007cc8bea77bdef9" gracePeriod=30 Nov 22 03:11:48 crc kubenswrapper[4922]: I1122 03:11:48.040490 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3","Type":"ContainerStarted","Data":"63f36b09488e80a426974c06f4d75c5c4350bdf9ba2db9cae6d2f0cd428e35b3"} Nov 22 03:11:48 crc kubenswrapper[4922]: I1122 03:11:48.040516 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3","Type":"ContainerStarted","Data":"edf1f45c4c7a39d56c112459a60be6eb8a9405e7a13c239d23e58e2a92d6b9dd"} Nov 22 03:11:48 crc kubenswrapper[4922]: I1122 03:11:48.044813 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b62f1703-3090-40d6-86e2-3afef486e933","Type":"ContainerStarted","Data":"77a7f2f5736b7d0ccebc7d4fca0df6139a2128545d7fbb3c45272e5c2be17aad"} Nov 22 03:11:48 crc kubenswrapper[4922]: I1122 03:11:48.055151 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.492188198 podStartE2EDuration="7.055131546s" podCreationTimestamp="2025-11-22 03:11:41 +0000 UTC" firstStartedPulling="2025-11-22 03:11:42.147855924 +0000 UTC m=+1138.186377816" lastFinishedPulling="2025-11-22 03:11:46.710799272 +0000 UTC m=+1142.749321164" observedRunningTime="2025-11-22 03:11:48.054644805 +0000 UTC m=+1144.093166697" watchObservedRunningTime="2025-11-22 03:11:48.055131546 +0000 UTC m=+1144.093653438" Nov 22 03:11:48 crc kubenswrapper[4922]: I1122 03:11:48.085203 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.522562037 podStartE2EDuration="7.085187997s" podCreationTimestamp="2025-11-22 03:11:41 +0000 UTC" firstStartedPulling="2025-11-22 03:11:42.148293165 +0000 UTC m=+1138.186815057" lastFinishedPulling="2025-11-22 03:11:46.710919115 +0000 UTC m=+1142.749441017" observedRunningTime="2025-11-22 03:11:48.077750159 +0000 UTC m=+1144.116272051" watchObservedRunningTime="2025-11-22 03:11:48.085187997 +0000 UTC m=+1144.123709889" Nov 22 03:11:48 crc kubenswrapper[4922]: I1122 03:11:48.097355 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.676898498 podStartE2EDuration="7.097326999s" podCreationTimestamp="2025-11-22 03:11:41 +0000 UTC" firstStartedPulling="2025-11-22 03:11:43.37081331 +0000 UTC m=+1139.409335202" lastFinishedPulling="2025-11-22 03:11:46.791241811 +0000 UTC m=+1142.829763703" observedRunningTime="2025-11-22 03:11:48.093972078 +0000 UTC m=+1144.132493980" watchObservedRunningTime="2025-11-22 03:11:48.097326999 +0000 UTC m=+1144.135848891" Nov 22 03:11:48 crc 
kubenswrapper[4922]: I1122 03:11:48.112541 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.7047889339999998 podStartE2EDuration="7.112523462s" podCreationTimestamp="2025-11-22 03:11:41 +0000 UTC" firstStartedPulling="2025-11-22 03:11:42.312262685 +0000 UTC m=+1138.350784577" lastFinishedPulling="2025-11-22 03:11:46.719997213 +0000 UTC m=+1142.758519105" observedRunningTime="2025-11-22 03:11:48.108918336 +0000 UTC m=+1144.147440248" watchObservedRunningTime="2025-11-22 03:11:48.112523462 +0000 UTC m=+1144.151045354" Nov 22 03:11:48 crc kubenswrapper[4922]: I1122 03:11:48.598427 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 03:11:48 crc kubenswrapper[4922]: I1122 03:11:48.774282 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lspg9\" (UniqueName: \"kubernetes.io/projected/2b2ab3b2-f030-4140-9b7c-94004fd07915-kube-api-access-lspg9\") pod \"2b2ab3b2-f030-4140-9b7c-94004fd07915\" (UID: \"2b2ab3b2-f030-4140-9b7c-94004fd07915\") " Nov 22 03:11:48 crc kubenswrapper[4922]: I1122 03:11:48.774564 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b2ab3b2-f030-4140-9b7c-94004fd07915-logs\") pod \"2b2ab3b2-f030-4140-9b7c-94004fd07915\" (UID: \"2b2ab3b2-f030-4140-9b7c-94004fd07915\") " Nov 22 03:11:48 crc kubenswrapper[4922]: I1122 03:11:48.774707 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b2ab3b2-f030-4140-9b7c-94004fd07915-config-data\") pod \"2b2ab3b2-f030-4140-9b7c-94004fd07915\" (UID: \"2b2ab3b2-f030-4140-9b7c-94004fd07915\") " Nov 22 03:11:48 crc kubenswrapper[4922]: I1122 03:11:48.774727 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b2ab3b2-f030-4140-9b7c-94004fd07915-combined-ca-bundle\") pod \"2b2ab3b2-f030-4140-9b7c-94004fd07915\" (UID: \"2b2ab3b2-f030-4140-9b7c-94004fd07915\") " Nov 22 03:11:48 crc kubenswrapper[4922]: I1122 03:11:48.774866 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b2ab3b2-f030-4140-9b7c-94004fd07915-logs" (OuterVolumeSpecName: "logs") pod "2b2ab3b2-f030-4140-9b7c-94004fd07915" (UID: "2b2ab3b2-f030-4140-9b7c-94004fd07915"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:11:48 crc kubenswrapper[4922]: I1122 03:11:48.775939 4922 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b2ab3b2-f030-4140-9b7c-94004fd07915-logs\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:48 crc kubenswrapper[4922]: I1122 03:11:48.781044 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b2ab3b2-f030-4140-9b7c-94004fd07915-kube-api-access-lspg9" (OuterVolumeSpecName: "kube-api-access-lspg9") pod "2b2ab3b2-f030-4140-9b7c-94004fd07915" (UID: "2b2ab3b2-f030-4140-9b7c-94004fd07915"). InnerVolumeSpecName "kube-api-access-lspg9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:11:48 crc kubenswrapper[4922]: I1122 03:11:48.808353 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b2ab3b2-f030-4140-9b7c-94004fd07915-config-data" (OuterVolumeSpecName: "config-data") pod "2b2ab3b2-f030-4140-9b7c-94004fd07915" (UID: "2b2ab3b2-f030-4140-9b7c-94004fd07915"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:48 crc kubenswrapper[4922]: I1122 03:11:48.809844 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b2ab3b2-f030-4140-9b7c-94004fd07915-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2b2ab3b2-f030-4140-9b7c-94004fd07915" (UID: "2b2ab3b2-f030-4140-9b7c-94004fd07915"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:48 crc kubenswrapper[4922]: I1122 03:11:48.877748 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b2ab3b2-f030-4140-9b7c-94004fd07915-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:48 crc kubenswrapper[4922]: I1122 03:11:48.877787 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b2ab3b2-f030-4140-9b7c-94004fd07915-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:48 crc kubenswrapper[4922]: I1122 03:11:48.877804 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lspg9\" (UniqueName: \"kubernetes.io/projected/2b2ab3b2-f030-4140-9b7c-94004fd07915-kube-api-access-lspg9\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:49 crc kubenswrapper[4922]: I1122 03:11:49.071739 4922 generic.go:334] "Generic (PLEG): container finished" podID="2b2ab3b2-f030-4140-9b7c-94004fd07915" containerID="7b96846caf77ceedd027c1924a3214103e87993cce202f703353dbae00e03c54" exitCode=0 Nov 22 03:11:49 crc kubenswrapper[4922]: I1122 03:11:49.071775 4922 generic.go:334] "Generic (PLEG): container finished" podID="2b2ab3b2-f030-4140-9b7c-94004fd07915" containerID="644cb7fabd42a18aeb29fab106869a09e692d070a38ff364d3bf06ea5cd50de1" exitCode=143 Nov 22 03:11:49 crc kubenswrapper[4922]: I1122 03:11:49.071802 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 03:11:49 crc kubenswrapper[4922]: I1122 03:11:49.071926 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2b2ab3b2-f030-4140-9b7c-94004fd07915","Type":"ContainerDied","Data":"7b96846caf77ceedd027c1924a3214103e87993cce202f703353dbae00e03c54"} Nov 22 03:11:49 crc kubenswrapper[4922]: I1122 03:11:49.071990 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2b2ab3b2-f030-4140-9b7c-94004fd07915","Type":"ContainerDied","Data":"644cb7fabd42a18aeb29fab106869a09e692d070a38ff364d3bf06ea5cd50de1"} Nov 22 03:11:49 crc kubenswrapper[4922]: I1122 03:11:49.072021 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2b2ab3b2-f030-4140-9b7c-94004fd07915","Type":"ContainerDied","Data":"9dc5c64ac7eaea9639a1bce21460b9403557b9a4f9a1ea38f5f17bcf4bf078c7"} Nov 22 03:11:49 crc kubenswrapper[4922]: I1122 03:11:49.072065 4922 scope.go:117] "RemoveContainer" containerID="7b96846caf77ceedd027c1924a3214103e87993cce202f703353dbae00e03c54" Nov 22 03:11:49 crc kubenswrapper[4922]: I1122 03:11:49.110208 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 03:11:49 crc kubenswrapper[4922]: I1122 03:11:49.112686 4922 scope.go:117] "RemoveContainer" containerID="644cb7fabd42a18aeb29fab106869a09e692d070a38ff364d3bf06ea5cd50de1" Nov 22 03:11:49 crc kubenswrapper[4922]: I1122 03:11:49.126730 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 03:11:49 crc kubenswrapper[4922]: I1122 03:11:49.189206 4922 scope.go:117] "RemoveContainer" containerID="7b96846caf77ceedd027c1924a3214103e87993cce202f703353dbae00e03c54" Nov 22 03:11:49 crc kubenswrapper[4922]: E1122 03:11:49.192108 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b96846caf77ceedd027c1924a3214103e87993cce202f703353dbae00e03c54\": container with ID starting with 7b96846caf77ceedd027c1924a3214103e87993cce202f703353dbae00e03c54 not found: ID does not exist" containerID="7b96846caf77ceedd027c1924a3214103e87993cce202f703353dbae00e03c54" Nov 22 03:11:49 crc kubenswrapper[4922]: I1122 03:11:49.192158 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b96846caf77ceedd027c1924a3214103e87993cce202f703353dbae00e03c54"} err="failed to get container status \"7b96846caf77ceedd027c1924a3214103e87993cce202f703353dbae00e03c54\": rpc error: code = NotFound desc = could not find container \"7b96846caf77ceedd027c1924a3214103e87993cce202f703353dbae00e03c54\": container with ID starting with 7b96846caf77ceedd027c1924a3214103e87993cce202f703353dbae00e03c54 not found: ID does not exist" Nov 22 03:11:49 crc kubenswrapper[4922]: I1122 03:11:49.192221 4922 scope.go:117] "RemoveContainer" containerID="644cb7fabd42a18aeb29fab106869a09e692d070a38ff364d3bf06ea5cd50de1" Nov 22 03:11:49 crc kubenswrapper[4922]: I1122 03:11:49.192535 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 22 03:11:49 crc kubenswrapper[4922]: E1122 03:11:49.192978 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b2ab3b2-f030-4140-9b7c-94004fd07915" containerName="nova-metadata-log" Nov 22 03:11:49 crc kubenswrapper[4922]: I1122 03:11:49.192990 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b2ab3b2-f030-4140-9b7c-94004fd07915" 
containerName="nova-metadata-log" Nov 22 03:11:49 crc kubenswrapper[4922]: E1122 03:11:49.193087 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b2ab3b2-f030-4140-9b7c-94004fd07915" containerName="nova-metadata-metadata" Nov 22 03:11:49 crc kubenswrapper[4922]: I1122 03:11:49.193100 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b2ab3b2-f030-4140-9b7c-94004fd07915" containerName="nova-metadata-metadata" Nov 22 03:11:49 crc kubenswrapper[4922]: I1122 03:11:49.193282 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b2ab3b2-f030-4140-9b7c-94004fd07915" containerName="nova-metadata-metadata" Nov 22 03:11:49 crc kubenswrapper[4922]: I1122 03:11:49.193311 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b2ab3b2-f030-4140-9b7c-94004fd07915" containerName="nova-metadata-log" Nov 22 03:11:49 crc kubenswrapper[4922]: E1122 03:11:49.193886 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"644cb7fabd42a18aeb29fab106869a09e692d070a38ff364d3bf06ea5cd50de1\": container with ID starting with 644cb7fabd42a18aeb29fab106869a09e692d070a38ff364d3bf06ea5cd50de1 not found: ID does not exist" containerID="644cb7fabd42a18aeb29fab106869a09e692d070a38ff364d3bf06ea5cd50de1" Nov 22 03:11:49 crc kubenswrapper[4922]: I1122 03:11:49.193925 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"644cb7fabd42a18aeb29fab106869a09e692d070a38ff364d3bf06ea5cd50de1"} err="failed to get container status \"644cb7fabd42a18aeb29fab106869a09e692d070a38ff364d3bf06ea5cd50de1\": rpc error: code = NotFound desc = could not find container \"644cb7fabd42a18aeb29fab106869a09e692d070a38ff364d3bf06ea5cd50de1\": container with ID starting with 644cb7fabd42a18aeb29fab106869a09e692d070a38ff364d3bf06ea5cd50de1 not found: ID does not exist" Nov 22 03:11:49 crc kubenswrapper[4922]: I1122 03:11:49.193947 4922 scope.go:117] "RemoveContainer" containerID="7b96846caf77ceedd027c1924a3214103e87993cce202f703353dbae00e03c54" Nov 22 03:11:49 crc kubenswrapper[4922]: I1122 03:11:49.194274 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 03:11:49 crc kubenswrapper[4922]: I1122 03:11:49.195120 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b96846caf77ceedd027c1924a3214103e87993cce202f703353dbae00e03c54"} err="failed to get container status \"7b96846caf77ceedd027c1924a3214103e87993cce202f703353dbae00e03c54\": rpc error: code = NotFound desc = could not find container \"7b96846caf77ceedd027c1924a3214103e87993cce202f703353dbae00e03c54\": container with ID starting with 7b96846caf77ceedd027c1924a3214103e87993cce202f703353dbae00e03c54 not found: ID does not exist" Nov 22 03:11:49 crc kubenswrapper[4922]: I1122 03:11:49.195142 4922 scope.go:117] "RemoveContainer" containerID="644cb7fabd42a18aeb29fab106869a09e692d070a38ff364d3bf06ea5cd50de1" Nov 22 03:11:49 crc kubenswrapper[4922]: I1122 03:11:49.197298 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"644cb7fabd42a18aeb29fab106869a09e692d070a38ff364d3bf06ea5cd50de1"} err="failed to get container status \"644cb7fabd42a18aeb29fab106869a09e692d070a38ff364d3bf06ea5cd50de1\": rpc error: code = NotFound desc = could not find container \"644cb7fabd42a18aeb29fab106869a09e692d070a38ff364d3bf06ea5cd50de1\": container with ID starting with 644cb7fabd42a18aeb29fab106869a09e692d070a38ff364d3bf06ea5cd50de1 not found: ID does not exist" Nov 22 03:11:49 crc kubenswrapper[4922]: I1122 03:11:49.197640 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 22 03:11:49 crc kubenswrapper[4922]: I1122 03:11:49.197871 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Nov 22 03:11:49 crc kubenswrapper[4922]: I1122 03:11:49.199815 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 03:11:49 crc kubenswrapper[4922]: I1122 03:11:49.291574 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07c814dd-d5e4-4d19-9843-2092f75e6b1e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"07c814dd-d5e4-4d19-9843-2092f75e6b1e\") " pod="openstack/nova-metadata-0" Nov 22 03:11:49 crc kubenswrapper[4922]: I1122 03:11:49.291633 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/07c814dd-d5e4-4d19-9843-2092f75e6b1e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"07c814dd-d5e4-4d19-9843-2092f75e6b1e\") " pod="openstack/nova-metadata-0" Nov 22 03:11:49 crc kubenswrapper[4922]: I1122 03:11:49.291699 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07c814dd-d5e4-4d19-9843-2092f75e6b1e-logs\") pod \"nova-metadata-0\" (UID: \"07c814dd-d5e4-4d19-9843-2092f75e6b1e\") " pod="openstack/nova-metadata-0" Nov 22 03:11:49 crc kubenswrapper[4922]: I1122 03:11:49.291718 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg7zk\" (UniqueName: \"kubernetes.io/projected/07c814dd-d5e4-4d19-9843-2092f75e6b1e-kube-api-access-xg7zk\") pod \"nova-metadata-0\" (UID: \"07c814dd-d5e4-4d19-9843-2092f75e6b1e\") " pod="openstack/nova-metadata-0" Nov 22 03:11:49 crc kubenswrapper[4922]: I1122 03:11:49.291736 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07c814dd-d5e4-4d19-9843-2092f75e6b1e-config-data\") pod \"nova-metadata-0\" (UID: \"07c814dd-d5e4-4d19-9843-2092f75e6b1e\") " pod="openstack/nova-metadata-0" Nov 22 03:11:49 crc kubenswrapper[4922]: I1122 03:11:49.309252 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b2ab3b2-f030-4140-9b7c-94004fd07915" path="/var/lib/kubelet/pods/2b2ab3b2-f030-4140-9b7c-94004fd07915/volumes" Nov 22 03:11:49 crc kubenswrapper[4922]: I1122 03:11:49.393819 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07c814dd-d5e4-4d19-9843-2092f75e6b1e-logs\") pod \"nova-metadata-0\" (UID: \"07c814dd-d5e4-4d19-9843-2092f75e6b1e\") " pod="openstack/nova-metadata-0" Nov 22 03:11:49 crc kubenswrapper[4922]: I1122 03:11:49.393896 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xg7zk\" (UniqueName: \"kubernetes.io/projected/07c814dd-d5e4-4d19-9843-2092f75e6b1e-kube-api-access-xg7zk\") pod \"nova-metadata-0\" (UID: \"07c814dd-d5e4-4d19-9843-2092f75e6b1e\") " pod="openstack/nova-metadata-0" Nov 22 03:11:49 crc kubenswrapper[4922]: I1122 03:11:49.393932 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07c814dd-d5e4-4d19-9843-2092f75e6b1e-config-data\") pod \"nova-metadata-0\" (UID: \"07c814dd-d5e4-4d19-9843-2092f75e6b1e\") " pod="openstack/nova-metadata-0" Nov 22 03:11:49 crc kubenswrapper[4922]: I1122 03:11:49.394476 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07c814dd-d5e4-4d19-9843-2092f75e6b1e-logs\") pod \"nova-metadata-0\" (UID: \"07c814dd-d5e4-4d19-9843-2092f75e6b1e\") " pod="openstack/nova-metadata-0" Nov 22 03:11:49 crc kubenswrapper[4922]: I1122 03:11:49.395083 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07c814dd-d5e4-4d19-9843-2092f75e6b1e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"07c814dd-d5e4-4d19-9843-2092f75e6b1e\") " pod="openstack/nova-metadata-0" Nov 22 03:11:49 crc kubenswrapper[4922]: I1122 03:11:49.395627 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/07c814dd-d5e4-4d19-9843-2092f75e6b1e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"07c814dd-d5e4-4d19-9843-2092f75e6b1e\") " pod="openstack/nova-metadata-0" Nov 22 03:11:49 crc kubenswrapper[4922]: I1122 03:11:49.397429 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07c814dd-d5e4-4d19-9843-2092f75e6b1e-config-data\") pod \"nova-metadata-0\" (UID: \"07c814dd-d5e4-4d19-9843-2092f75e6b1e\") " pod="openstack/nova-metadata-0" Nov 22 03:11:49 crc kubenswrapper[4922]: I1122 03:11:49.399305 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/07c814dd-d5e4-4d19-9843-2092f75e6b1e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"07c814dd-d5e4-4d19-9843-2092f75e6b1e\") " pod="openstack/nova-metadata-0" Nov 22 03:11:49 crc kubenswrapper[4922]: I1122 03:11:49.400054 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07c814dd-d5e4-4d19-9843-2092f75e6b1e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"07c814dd-d5e4-4d19-9843-2092f75e6b1e\") " pod="openstack/nova-metadata-0" Nov 22 03:11:49 crc kubenswrapper[4922]: I1122 03:11:49.409404 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg7zk\" (UniqueName: \"kubernetes.io/projected/07c814dd-d5e4-4d19-9843-2092f75e6b1e-kube-api-access-xg7zk\") pod \"nova-metadata-0\" (UID: \"07c814dd-d5e4-4d19-9843-2092f75e6b1e\") " pod="openstack/nova-metadata-0" Nov 22 03:11:49 crc kubenswrapper[4922]: I1122 03:11:49.564007 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 03:11:50 crc kubenswrapper[4922]: I1122 03:11:50.074581 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 03:11:50 crc kubenswrapper[4922]: W1122 03:11:50.078549 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07c814dd_d5e4_4d19_9843_2092f75e6b1e.slice/crio-2dab5dd668b52c0e3c936e04a79aba4ccb944dcaf9213e6b09358530ceb1d06d WatchSource:0}: Error finding container 2dab5dd668b52c0e3c936e04a79aba4ccb944dcaf9213e6b09358530ceb1d06d: Status 404 returned error can't find the container with id 2dab5dd668b52c0e3c936e04a79aba4ccb944dcaf9213e6b09358530ceb1d06d Nov 22 03:11:50 crc kubenswrapper[4922]: I1122 03:11:50.090212 4922 generic.go:334] "Generic (PLEG): container finished" podID="29a6e29f-a80d-4f7d-8d32-c5a18f26da69" containerID="59f0c8464dddcc2b9c2e1d38c8cb1d2d0a5a31fff9e78685924efb86600644f7" exitCode=0 Nov 22 03:11:50 crc kubenswrapper[4922]: I1122 03:11:50.090293 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pcvkr" event={"ID":"29a6e29f-a80d-4f7d-8d32-c5a18f26da69","Type":"ContainerDied","Data":"59f0c8464dddcc2b9c2e1d38c8cb1d2d0a5a31fff9e78685924efb86600644f7"} Nov 22 03:11:51 crc kubenswrapper[4922]: I1122 03:11:51.101624 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"07c814dd-d5e4-4d19-9843-2092f75e6b1e","Type":"ContainerStarted","Data":"f973b59b54fbccc79b4d69968a40bf6b87fb64a39fa9280f76f5a8633ceb1dd4"} Nov 22 03:11:51 crc kubenswrapper[4922]: I1122 03:11:51.101983 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"07c814dd-d5e4-4d19-9843-2092f75e6b1e","Type":"ContainerStarted","Data":"ad4f798da063bc92cdf57d4c76567a20e5793f3adff4d8277886efd46ff8c2cd"} Nov 22 03:11:51 crc kubenswrapper[4922]: I1122 03:11:51.102001 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"07c814dd-d5e4-4d19-9843-2092f75e6b1e","Type":"ContainerStarted","Data":"2dab5dd668b52c0e3c936e04a79aba4ccb944dcaf9213e6b09358530ceb1d06d"} Nov 22 03:11:51 crc kubenswrapper[4922]: I1122 03:11:51.138228 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.1382072 podStartE2EDuration="2.1382072s" podCreationTimestamp="2025-11-22 03:11:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:11:51.1202214 +0000 UTC m=+1147.158743302" watchObservedRunningTime="2025-11-22 03:11:51.1382072 +0000 UTC m=+1147.176729092" Nov 22 03:11:51 crc kubenswrapper[4922]: I1122 03:11:51.477167 4922 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pcvkr" Nov 22 03:11:51 crc kubenswrapper[4922]: I1122 03:11:51.552024 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Nov 22 03:11:51 crc kubenswrapper[4922]: I1122 03:11:51.653680 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29a6e29f-a80d-4f7d-8d32-c5a18f26da69-scripts\") pod \"29a6e29f-a80d-4f7d-8d32-c5a18f26da69\" (UID: \"29a6e29f-a80d-4f7d-8d32-c5a18f26da69\") " Nov 22 03:11:51 crc kubenswrapper[4922]: I1122 03:11:51.653728 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29a6e29f-a80d-4f7d-8d32-c5a18f26da69-config-data\") pod \"29a6e29f-a80d-4f7d-8d32-c5a18f26da69\" (UID: \"29a6e29f-a80d-4f7d-8d32-c5a18f26da69\") " Nov 22 03:11:51 crc kubenswrapper[4922]: I1122 03:11:51.653820 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29a6e29f-a80d-4f7d-8d32-c5a18f26da69-combined-ca-bundle\") pod \"29a6e29f-a80d-4f7d-8d32-c5a18f26da69\" (UID: \"29a6e29f-a80d-4f7d-8d32-c5a18f26da69\") " Nov 22 03:11:51 crc kubenswrapper[4922]: I1122 03:11:51.654612 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rvx2\" (UniqueName: \"kubernetes.io/projected/29a6e29f-a80d-4f7d-8d32-c5a18f26da69-kube-api-access-8rvx2\") pod \"29a6e29f-a80d-4f7d-8d32-c5a18f26da69\" (UID: \"29a6e29f-a80d-4f7d-8d32-c5a18f26da69\") " Nov 22 03:11:51 crc kubenswrapper[4922]: I1122 03:11:51.664995 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29a6e29f-a80d-4f7d-8d32-c5a18f26da69-scripts" (OuterVolumeSpecName: "scripts") pod "29a6e29f-a80d-4f7d-8d32-c5a18f26da69" (UID: "29a6e29f-a80d-4f7d-8d32-c5a18f26da69"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:51 crc kubenswrapper[4922]: I1122 03:11:51.665027 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29a6e29f-a80d-4f7d-8d32-c5a18f26da69-kube-api-access-8rvx2" (OuterVolumeSpecName: "kube-api-access-8rvx2") pod "29a6e29f-a80d-4f7d-8d32-c5a18f26da69" (UID: "29a6e29f-a80d-4f7d-8d32-c5a18f26da69"). InnerVolumeSpecName "kube-api-access-8rvx2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:11:51 crc kubenswrapper[4922]: I1122 03:11:51.665109 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 22 03:11:51 crc kubenswrapper[4922]: I1122 03:11:51.665151 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 22 03:11:51 crc kubenswrapper[4922]: I1122 03:11:51.681484 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29a6e29f-a80d-4f7d-8d32-c5a18f26da69-config-data" (OuterVolumeSpecName: "config-data") pod "29a6e29f-a80d-4f7d-8d32-c5a18f26da69" (UID: "29a6e29f-a80d-4f7d-8d32-c5a18f26da69"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:51 crc kubenswrapper[4922]: I1122 03:11:51.690112 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29a6e29f-a80d-4f7d-8d32-c5a18f26da69-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "29a6e29f-a80d-4f7d-8d32-c5a18f26da69" (UID: "29a6e29f-a80d-4f7d-8d32-c5a18f26da69"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:51 crc kubenswrapper[4922]: I1122 03:11:51.719055 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-566b5b7845-gzqf2" Nov 22 03:11:51 crc kubenswrapper[4922]: I1122 03:11:51.770456 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29a6e29f-a80d-4f7d-8d32-c5a18f26da69-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:51 crc kubenswrapper[4922]: I1122 03:11:51.770498 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29a6e29f-a80d-4f7d-8d32-c5a18f26da69-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:51 crc kubenswrapper[4922]: I1122 03:11:51.770512 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29a6e29f-a80d-4f7d-8d32-c5a18f26da69-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:51 crc kubenswrapper[4922]: I1122 03:11:51.770524 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rvx2\" (UniqueName: \"kubernetes.io/projected/29a6e29f-a80d-4f7d-8d32-c5a18f26da69-kube-api-access-8rvx2\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:51 crc kubenswrapper[4922]: I1122 03:11:51.783468 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-vnmhp"] Nov 22 03:11:51 crc kubenswrapper[4922]: I1122 03:11:51.783688 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d97fcdd8f-vnmhp" podUID="5bea70c1-286c-41fa-af9a-42bb8568af3e" containerName="dnsmasq-dns" containerID="cri-o://92888f6953488ada7ba19dff750a370e33f1d2e025ff5385b66595d73bfaea9c" gracePeriod=10 Nov 22 03:11:52 crc kubenswrapper[4922]: I1122 03:11:52.112397 4922 generic.go:334] "Generic (PLEG): container finished" podID="5bea70c1-286c-41fa-af9a-42bb8568af3e" containerID="92888f6953488ada7ba19dff750a370e33f1d2e025ff5385b66595d73bfaea9c" exitCode=0 Nov 22 03:11:52 crc kubenswrapper[4922]: I1122 03:11:52.112550 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-vnmhp" event={"ID":"5bea70c1-286c-41fa-af9a-42bb8568af3e","Type":"ContainerDied","Data":"92888f6953488ada7ba19dff750a370e33f1d2e025ff5385b66595d73bfaea9c"} Nov 22 03:11:52 crc kubenswrapper[4922]: I1122 03:11:52.114445 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pcvkr" Nov 22 03:11:52 crc kubenswrapper[4922]: I1122 03:11:52.114907 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pcvkr" event={"ID":"29a6e29f-a80d-4f7d-8d32-c5a18f26da69","Type":"ContainerDied","Data":"2d94e83987c3fb50e88f6f7ad9a9ab03aaddb9fc16c6af4d120518ce80374a03"} Nov 22 03:11:52 crc kubenswrapper[4922]: I1122 03:11:52.114934 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d94e83987c3fb50e88f6f7ad9a9ab03aaddb9fc16c6af4d120518ce80374a03" Nov 22 03:11:52 crc kubenswrapper[4922]: I1122 03:11:52.290044 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 22 03:11:52 crc kubenswrapper[4922]: I1122 03:11:52.290941 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3" containerName="nova-api-log" containerID="cri-o://edf1f45c4c7a39d56c112459a60be6eb8a9405e7a13c239d23e58e2a92d6b9dd" gracePeriod=30 Nov 22 03:11:52 crc kubenswrapper[4922]: I1122 03:11:52.291162 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3" containerName="nova-api-api" containerID="cri-o://63f36b09488e80a426974c06f4d75c5c4350bdf9ba2db9cae6d2f0cd428e35b3" gracePeriod=30 Nov 22 03:11:52 crc kubenswrapper[4922]: I1122 03:11:52.298790 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.172:8774/\": EOF" Nov 22 03:11:52 crc kubenswrapper[4922]: I1122 03:11:52.298895 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.172:8774/\": EOF" Nov 22 03:11:52 crc kubenswrapper[4922]: I1122 03:11:52.314962 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 03:11:52 crc kubenswrapper[4922]: I1122 03:11:52.315216 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="b62f1703-3090-40d6-86e2-3afef486e933" containerName="nova-scheduler-scheduler" containerID="cri-o://77a7f2f5736b7d0ccebc7d4fca0df6139a2128545d7fbb3c45272e5c2be17aad" gracePeriod=30 Nov 22 03:11:52 crc kubenswrapper[4922]: I1122 03:11:52.327761 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 03:11:52 crc kubenswrapper[4922]: I1122 03:11:52.327996 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-vnmhp" Nov 22 03:11:52 crc kubenswrapper[4922]: I1122 03:11:52.402060 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5bea70c1-286c-41fa-af9a-42bb8568af3e-ovsdbserver-sb\") pod \"5bea70c1-286c-41fa-af9a-42bb8568af3e\" (UID: \"5bea70c1-286c-41fa-af9a-42bb8568af3e\") " Nov 22 03:11:52 crc kubenswrapper[4922]: I1122 03:11:52.402125 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5bea70c1-286c-41fa-af9a-42bb8568af3e-ovsdbserver-nb\") pod \"5bea70c1-286c-41fa-af9a-42bb8568af3e\" (UID: \"5bea70c1-286c-41fa-af9a-42bb8568af3e\") " Nov 22 03:11:52 crc kubenswrapper[4922]: I1122 03:11:52.402207 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bea70c1-286c-41fa-af9a-42bb8568af3e-config\") pod \"5bea70c1-286c-41fa-af9a-42bb8568af3e\" (UID: \"5bea70c1-286c-41fa-af9a-42bb8568af3e\") " Nov 22 03:11:52 crc kubenswrapper[4922]: I1122 03:11:52.402234 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5bea70c1-286c-41fa-af9a-42bb8568af3e-dns-svc\") pod \"5bea70c1-286c-41fa-af9a-42bb8568af3e\" (UID: \"5bea70c1-286c-41fa-af9a-42bb8568af3e\") " Nov 22 03:11:52 crc kubenswrapper[4922]: I1122 03:11:52.402350 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-km5dn\" (UniqueName: \"kubernetes.io/projected/5bea70c1-286c-41fa-af9a-42bb8568af3e-kube-api-access-km5dn\") pod \"5bea70c1-286c-41fa-af9a-42bb8568af3e\" (UID: \"5bea70c1-286c-41fa-af9a-42bb8568af3e\") " Nov 22 03:11:52 crc kubenswrapper[4922]: I1122 03:11:52.407000 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bea70c1-286c-41fa-af9a-42bb8568af3e-kube-api-access-km5dn" (OuterVolumeSpecName: "kube-api-access-km5dn") pod "5bea70c1-286c-41fa-af9a-42bb8568af3e" (UID: "5bea70c1-286c-41fa-af9a-42bb8568af3e"). InnerVolumeSpecName "kube-api-access-km5dn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:11:52 crc kubenswrapper[4922]: I1122 03:11:52.459378 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bea70c1-286c-41fa-af9a-42bb8568af3e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5bea70c1-286c-41fa-af9a-42bb8568af3e" (UID: "5bea70c1-286c-41fa-af9a-42bb8568af3e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:11:52 crc kubenswrapper[4922]: I1122 03:11:52.472437 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bea70c1-286c-41fa-af9a-42bb8568af3e-config" (OuterVolumeSpecName: "config") pod "5bea70c1-286c-41fa-af9a-42bb8568af3e" (UID: "5bea70c1-286c-41fa-af9a-42bb8568af3e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:11:52 crc kubenswrapper[4922]: I1122 03:11:52.476320 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bea70c1-286c-41fa-af9a-42bb8568af3e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5bea70c1-286c-41fa-af9a-42bb8568af3e" (UID: "5bea70c1-286c-41fa-af9a-42bb8568af3e"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:11:52 crc kubenswrapper[4922]: I1122 03:11:52.491188 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bea70c1-286c-41fa-af9a-42bb8568af3e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5bea70c1-286c-41fa-af9a-42bb8568af3e" (UID: "5bea70c1-286c-41fa-af9a-42bb8568af3e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:11:52 crc kubenswrapper[4922]: I1122 03:11:52.505974 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-km5dn\" (UniqueName: \"kubernetes.io/projected/5bea70c1-286c-41fa-af9a-42bb8568af3e-kube-api-access-km5dn\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:52 crc kubenswrapper[4922]: I1122 03:11:52.506006 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5bea70c1-286c-41fa-af9a-42bb8568af3e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:52 crc kubenswrapper[4922]: I1122 03:11:52.506016 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5bea70c1-286c-41fa-af9a-42bb8568af3e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:52 crc kubenswrapper[4922]: I1122 03:11:52.506027 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bea70c1-286c-41fa-af9a-42bb8568af3e-config\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:52 crc kubenswrapper[4922]: I1122 03:11:52.506038 4922 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5bea70c1-286c-41fa-af9a-42bb8568af3e-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:52 crc kubenswrapper[4922]: I1122 03:11:52.857217 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 22 03:11:53 crc kubenswrapper[4922]: I1122 03:11:53.127613 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-vnmhp" event={"ID":"5bea70c1-286c-41fa-af9a-42bb8568af3e","Type":"ContainerDied","Data":"f26e5d274f05a6b149510f200397fabe8ebf0c366b78795ac7db98f68a7de1b2"} Nov 22 03:11:53 crc kubenswrapper[4922]: I1122 03:11:53.127677 4922 scope.go:117] "RemoveContainer" containerID="92888f6953488ada7ba19dff750a370e33f1d2e025ff5385b66595d73bfaea9c" Nov 22 03:11:53 crc kubenswrapper[4922]: I1122 03:11:53.127715 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-vnmhp" Nov 22 03:11:53 crc kubenswrapper[4922]: I1122 03:11:53.134185 4922 generic.go:334] "Generic (PLEG): container finished" podID="c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3" containerID="edf1f45c4c7a39d56c112459a60be6eb8a9405e7a13c239d23e58e2a92d6b9dd" exitCode=143 Nov 22 03:11:53 crc kubenswrapper[4922]: I1122 03:11:53.134253 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3","Type":"ContainerDied","Data":"edf1f45c4c7a39d56c112459a60be6eb8a9405e7a13c239d23e58e2a92d6b9dd"} Nov 22 03:11:53 crc kubenswrapper[4922]: I1122 03:11:53.134362 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="07c814dd-d5e4-4d19-9843-2092f75e6b1e" containerName="nova-metadata-log" containerID="cri-o://ad4f798da063bc92cdf57d4c76567a20e5793f3adff4d8277886efd46ff8c2cd" gracePeriod=30 Nov 22 03:11:53 crc kubenswrapper[4922]: I1122 03:11:53.134785 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="07c814dd-d5e4-4d19-9843-2092f75e6b1e" containerName="nova-metadata-metadata" containerID="cri-o://f973b59b54fbccc79b4d69968a40bf6b87fb64a39fa9280f76f5a8633ceb1dd4" gracePeriod=30 Nov 22 03:11:53 crc kubenswrapper[4922]: I1122 03:11:53.166840 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-vnmhp"] Nov 22 03:11:53 crc kubenswrapper[4922]: I1122 03:11:53.178252 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-vnmhp"] Nov 22 03:11:53 crc kubenswrapper[4922]: I1122 03:11:53.288398 4922 scope.go:117] "RemoveContainer" containerID="680949f8dfc31583d928406ee647a731c9fdb8d61efe1a880064c100f2ad4db9" Nov 22 03:11:53 crc kubenswrapper[4922]: I1122 03:11:53.449074 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bea70c1-286c-41fa-af9a-42bb8568af3e" path="/var/lib/kubelet/pods/5bea70c1-286c-41fa-af9a-42bb8568af3e/volumes" Nov 22 03:11:54 crc kubenswrapper[4922]: I1122 03:11:54.176270 4922 generic.go:334] "Generic (PLEG): container finished" podID="07c814dd-d5e4-4d19-9843-2092f75e6b1e" containerID="f973b59b54fbccc79b4d69968a40bf6b87fb64a39fa9280f76f5a8633ceb1dd4" exitCode=0 Nov 22 03:11:54 crc kubenswrapper[4922]: I1122 03:11:54.176658 4922 generic.go:334] "Generic (PLEG): container finished" podID="07c814dd-d5e4-4d19-9843-2092f75e6b1e" containerID="ad4f798da063bc92cdf57d4c76567a20e5793f3adff4d8277886efd46ff8c2cd" exitCode=143 Nov 22 03:11:54 crc kubenswrapper[4922]: I1122 03:11:54.176707 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"07c814dd-d5e4-4d19-9843-2092f75e6b1e","Type":"ContainerDied","Data":"f973b59b54fbccc79b4d69968a40bf6b87fb64a39fa9280f76f5a8633ceb1dd4"} Nov 22 03:11:54 crc kubenswrapper[4922]: I1122 03:11:54.176738 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"07c814dd-d5e4-4d19-9843-2092f75e6b1e","Type":"ContainerDied","Data":"ad4f798da063bc92cdf57d4c76567a20e5793f3adff4d8277886efd46ff8c2cd"} Nov 22 03:11:54 crc kubenswrapper[4922]: I1122 03:11:54.359240 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 03:11:54 crc kubenswrapper[4922]: I1122 03:11:54.461294 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07c814dd-d5e4-4d19-9843-2092f75e6b1e-logs\") pod \"07c814dd-d5e4-4d19-9843-2092f75e6b1e\" (UID: \"07c814dd-d5e4-4d19-9843-2092f75e6b1e\") " Nov 22 03:11:54 crc kubenswrapper[4922]: I1122 03:11:54.461347 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xg7zk\" (UniqueName: \"kubernetes.io/projected/07c814dd-d5e4-4d19-9843-2092f75e6b1e-kube-api-access-xg7zk\") pod \"07c814dd-d5e4-4d19-9843-2092f75e6b1e\" (UID: \"07c814dd-d5e4-4d19-9843-2092f75e6b1e\") " Nov 22 03:11:54 crc kubenswrapper[4922]: I1122 03:11:54.461370 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/07c814dd-d5e4-4d19-9843-2092f75e6b1e-nova-metadata-tls-certs\") pod \"07c814dd-d5e4-4d19-9843-2092f75e6b1e\" (UID: \"07c814dd-d5e4-4d19-9843-2092f75e6b1e\") " Nov 22 03:11:54 crc kubenswrapper[4922]: I1122 03:11:54.461417 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07c814dd-d5e4-4d19-9843-2092f75e6b1e-config-data\") pod \"07c814dd-d5e4-4d19-9843-2092f75e6b1e\" (UID: \"07c814dd-d5e4-4d19-9843-2092f75e6b1e\") " Nov 22 03:11:54 crc kubenswrapper[4922]: I1122 03:11:54.461519 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07c814dd-d5e4-4d19-9843-2092f75e6b1e-combined-ca-bundle\") pod \"07c814dd-d5e4-4d19-9843-2092f75e6b1e\" (UID: \"07c814dd-d5e4-4d19-9843-2092f75e6b1e\") " Nov 22 03:11:54 crc kubenswrapper[4922]: I1122 03:11:54.461676 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07c814dd-d5e4-4d19-9843-2092f75e6b1e-logs" (OuterVolumeSpecName: "logs") pod "07c814dd-d5e4-4d19-9843-2092f75e6b1e" (UID: "07c814dd-d5e4-4d19-9843-2092f75e6b1e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:11:54 crc kubenswrapper[4922]: I1122 03:11:54.461872 4922 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07c814dd-d5e4-4d19-9843-2092f75e6b1e-logs\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:54 crc kubenswrapper[4922]: I1122 03:11:54.468250 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07c814dd-d5e4-4d19-9843-2092f75e6b1e-kube-api-access-xg7zk" (OuterVolumeSpecName: "kube-api-access-xg7zk") pod "07c814dd-d5e4-4d19-9843-2092f75e6b1e" (UID: "07c814dd-d5e4-4d19-9843-2092f75e6b1e"). InnerVolumeSpecName "kube-api-access-xg7zk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:11:54 crc kubenswrapper[4922]: I1122 03:11:54.494612 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07c814dd-d5e4-4d19-9843-2092f75e6b1e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "07c814dd-d5e4-4d19-9843-2092f75e6b1e" (UID: "07c814dd-d5e4-4d19-9843-2092f75e6b1e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:54 crc kubenswrapper[4922]: I1122 03:11:54.495096 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07c814dd-d5e4-4d19-9843-2092f75e6b1e-config-data" (OuterVolumeSpecName: "config-data") pod "07c814dd-d5e4-4d19-9843-2092f75e6b1e" (UID: "07c814dd-d5e4-4d19-9843-2092f75e6b1e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:54 crc kubenswrapper[4922]: I1122 03:11:54.529047 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07c814dd-d5e4-4d19-9843-2092f75e6b1e-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "07c814dd-d5e4-4d19-9843-2092f75e6b1e" (UID: "07c814dd-d5e4-4d19-9843-2092f75e6b1e"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:54 crc kubenswrapper[4922]: I1122 03:11:54.556725 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 22 03:11:54 crc kubenswrapper[4922]: I1122 03:11:54.564161 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xg7zk\" (UniqueName: \"kubernetes.io/projected/07c814dd-d5e4-4d19-9843-2092f75e6b1e-kube-api-access-xg7zk\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:54 crc kubenswrapper[4922]: I1122 03:11:54.564209 4922 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/07c814dd-d5e4-4d19-9843-2092f75e6b1e-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:54 crc kubenswrapper[4922]: I1122 03:11:54.564233 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07c814dd-d5e4-4d19-9843-2092f75e6b1e-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:54 crc kubenswrapper[4922]: I1122 03:11:54.564253 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07c814dd-d5e4-4d19-9843-2092f75e6b1e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:54 crc kubenswrapper[4922]: I1122 03:11:54.665892 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jl4m\" (UniqueName: \"kubernetes.io/projected/b62f1703-3090-40d6-86e2-3afef486e933-kube-api-access-2jl4m\") pod \"b62f1703-3090-40d6-86e2-3afef486e933\" (UID: \"b62f1703-3090-40d6-86e2-3afef486e933\") " Nov 22 03:11:54 crc kubenswrapper[4922]: I1122 03:11:54.665949 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b62f1703-3090-40d6-86e2-3afef486e933-config-data\") pod \"b62f1703-3090-40d6-86e2-3afef486e933\" (UID: \"b62f1703-3090-40d6-86e2-3afef486e933\") " Nov 22 03:11:54 crc kubenswrapper[4922]: I1122 03:11:54.666032 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b62f1703-3090-40d6-86e2-3afef486e933-combined-ca-bundle\") pod \"b62f1703-3090-40d6-86e2-3afef486e933\" (UID: \"b62f1703-3090-40d6-86e2-3afef486e933\") " Nov 22 03:11:54 crc kubenswrapper[4922]: I1122 03:11:54.682326 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b62f1703-3090-40d6-86e2-3afef486e933-kube-api-access-2jl4m" (OuterVolumeSpecName: 
"kube-api-access-2jl4m") pod "b62f1703-3090-40d6-86e2-3afef486e933" (UID: "b62f1703-3090-40d6-86e2-3afef486e933"). InnerVolumeSpecName "kube-api-access-2jl4m". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:11:54 crc kubenswrapper[4922]: I1122 03:11:54.689644 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b62f1703-3090-40d6-86e2-3afef486e933-config-data" (OuterVolumeSpecName: "config-data") pod "b62f1703-3090-40d6-86e2-3afef486e933" (UID: "b62f1703-3090-40d6-86e2-3afef486e933"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:54 crc kubenswrapper[4922]: I1122 03:11:54.697315 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b62f1703-3090-40d6-86e2-3afef486e933-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b62f1703-3090-40d6-86e2-3afef486e933" (UID: "b62f1703-3090-40d6-86e2-3afef486e933"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:54 crc kubenswrapper[4922]: I1122 03:11:54.768103 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jl4m\" (UniqueName: \"kubernetes.io/projected/b62f1703-3090-40d6-86e2-3afef486e933-kube-api-access-2jl4m\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:54 crc kubenswrapper[4922]: I1122 03:11:54.768139 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b62f1703-3090-40d6-86e2-3afef486e933-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:54 crc kubenswrapper[4922]: I1122 03:11:54.768150 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b62f1703-3090-40d6-86e2-3afef486e933-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.203975 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"07c814dd-d5e4-4d19-9843-2092f75e6b1e","Type":"ContainerDied","Data":"2dab5dd668b52c0e3c936e04a79aba4ccb944dcaf9213e6b09358530ceb1d06d"} Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.204089 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.204660 4922 scope.go:117] "RemoveContainer" containerID="f973b59b54fbccc79b4d69968a40bf6b87fb64a39fa9280f76f5a8633ceb1dd4" Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.206804 4922 generic.go:334] "Generic (PLEG): container finished" podID="b62f1703-3090-40d6-86e2-3afef486e933" containerID="77a7f2f5736b7d0ccebc7d4fca0df6139a2128545d7fbb3c45272e5c2be17aad" exitCode=0 Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.206868 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b62f1703-3090-40d6-86e2-3afef486e933","Type":"ContainerDied","Data":"77a7f2f5736b7d0ccebc7d4fca0df6139a2128545d7fbb3c45272e5c2be17aad"} Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.206911 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b62f1703-3090-40d6-86e2-3afef486e933","Type":"ContainerDied","Data":"b682c2f4e87a6945fdd76402a88eacdf79d3b40979a4e1128cb94eeb3a59e683"} Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.206874 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.277076 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.292310 4922 scope.go:117] "RemoveContainer" containerID="ad4f798da063bc92cdf57d4c76567a20e5793f3adff4d8277886efd46ff8c2cd" Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.300482 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.335024 4922 scope.go:117] "RemoveContainer" containerID="77a7f2f5736b7d0ccebc7d4fca0df6139a2128545d7fbb3c45272e5c2be17aad" Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.341418 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b62f1703-3090-40d6-86e2-3afef486e933" path="/var/lib/kubelet/pods/b62f1703-3090-40d6-86e2-3afef486e933/volumes" Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.343960 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.344005 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.344025 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 03:11:55 crc kubenswrapper[4922]: E1122 03:11:55.344357 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bea70c1-286c-41fa-af9a-42bb8568af3e" containerName="dnsmasq-dns" Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.344377 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bea70c1-286c-41fa-af9a-42bb8568af3e" containerName="dnsmasq-dns" Nov 22 03:11:55 crc kubenswrapper[4922]: E1122 03:11:55.344396 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bea70c1-286c-41fa-af9a-42bb8568af3e" containerName="init" Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.344405 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bea70c1-286c-41fa-af9a-42bb8568af3e" containerName="init" Nov 22 03:11:55 crc kubenswrapper[4922]: E1122 03:11:55.344429 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07c814dd-d5e4-4d19-9843-2092f75e6b1e" containerName="nova-metadata-log" Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.344438 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="07c814dd-d5e4-4d19-9843-2092f75e6b1e" containerName="nova-metadata-log" Nov 22 03:11:55 crc kubenswrapper[4922]: E1122 03:11:55.344451 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b62f1703-3090-40d6-86e2-3afef486e933" containerName="nova-scheduler-scheduler" Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.344460 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="b62f1703-3090-40d6-86e2-3afef486e933" containerName="nova-scheduler-scheduler" Nov 22 03:11:55 crc kubenswrapper[4922]: E1122 03:11:55.344484 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07c814dd-d5e4-4d19-9843-2092f75e6b1e" containerName="nova-metadata-metadata" Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.344493 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="07c814dd-d5e4-4d19-9843-2092f75e6b1e" containerName="nova-metadata-metadata" Nov 22 03:11:55 crc kubenswrapper[4922]: E1122 03:11:55.344507 4922 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="29a6e29f-a80d-4f7d-8d32-c5a18f26da69" containerName="nova-manage" Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.344515 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="29a6e29f-a80d-4f7d-8d32-c5a18f26da69" containerName="nova-manage" Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.344722 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="b62f1703-3090-40d6-86e2-3afef486e933" containerName="nova-scheduler-scheduler" Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.344750 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bea70c1-286c-41fa-af9a-42bb8568af3e" containerName="dnsmasq-dns" Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.344768 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="07c814dd-d5e4-4d19-9843-2092f75e6b1e" containerName="nova-metadata-metadata" Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.344784 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="07c814dd-d5e4-4d19-9843-2092f75e6b1e" containerName="nova-metadata-log" Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.344801 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="29a6e29f-a80d-4f7d-8d32-c5a18f26da69" containerName="nova-manage" Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.345515 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.345646 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.348349 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.348959 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.351176 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.351414 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.352155 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.365593 4922 scope.go:117] "RemoveContainer" containerID="77a7f2f5736b7d0ccebc7d4fca0df6139a2128545d7fbb3c45272e5c2be17aad" Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.366031 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 03:11:55 crc kubenswrapper[4922]: E1122 03:11:55.366386 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77a7f2f5736b7d0ccebc7d4fca0df6139a2128545d7fbb3c45272e5c2be17aad\": container with ID starting with 77a7f2f5736b7d0ccebc7d4fca0df6139a2128545d7fbb3c45272e5c2be17aad not found: ID does not exist" containerID="77a7f2f5736b7d0ccebc7d4fca0df6139a2128545d7fbb3c45272e5c2be17aad" Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.366464 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77a7f2f5736b7d0ccebc7d4fca0df6139a2128545d7fbb3c45272e5c2be17aad"} err="failed to get container status 
\"77a7f2f5736b7d0ccebc7d4fca0df6139a2128545d7fbb3c45272e5c2be17aad\": rpc error: code = NotFound desc = could not find container \"77a7f2f5736b7d0ccebc7d4fca0df6139a2128545d7fbb3c45272e5c2be17aad\": container with ID starting with 77a7f2f5736b7d0ccebc7d4fca0df6139a2128545d7fbb3c45272e5c2be17aad not found: ID does not exist" Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.380641 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/696c4507-7b7c-4b17-8428-0ade9ffb17ed-config-data\") pod \"nova-metadata-0\" (UID: \"696c4507-7b7c-4b17-8428-0ade9ffb17ed\") " pod="openstack/nova-metadata-0" Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.380770 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/696c4507-7b7c-4b17-8428-0ade9ffb17ed-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"696c4507-7b7c-4b17-8428-0ade9ffb17ed\") " pod="openstack/nova-metadata-0" Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.380823 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vlvf\" (UniqueName: \"kubernetes.io/projected/696c4507-7b7c-4b17-8428-0ade9ffb17ed-kube-api-access-4vlvf\") pod \"nova-metadata-0\" (UID: \"696c4507-7b7c-4b17-8428-0ade9ffb17ed\") " pod="openstack/nova-metadata-0" Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.381951 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/696c4507-7b7c-4b17-8428-0ade9ffb17ed-logs\") pod \"nova-metadata-0\" (UID: \"696c4507-7b7c-4b17-8428-0ade9ffb17ed\") " pod="openstack/nova-metadata-0" Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.382132 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/696c4507-7b7c-4b17-8428-0ade9ffb17ed-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"696c4507-7b7c-4b17-8428-0ade9ffb17ed\") " pod="openstack/nova-metadata-0" Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.382195 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/361c95c5-7eba-4a1a-9341-4e4da5eb8848-config-data\") pod \"nova-scheduler-0\" (UID: \"361c95c5-7eba-4a1a-9341-4e4da5eb8848\") " pod="openstack/nova-scheduler-0" Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.382253 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361c95c5-7eba-4a1a-9341-4e4da5eb8848-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"361c95c5-7eba-4a1a-9341-4e4da5eb8848\") " pod="openstack/nova-scheduler-0" Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.382314 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcb5l\" (UniqueName: \"kubernetes.io/projected/361c95c5-7eba-4a1a-9341-4e4da5eb8848-kube-api-access-hcb5l\") pod \"nova-scheduler-0\" (UID: \"361c95c5-7eba-4a1a-9341-4e4da5eb8848\") " pod="openstack/nova-scheduler-0" Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.483531 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361c95c5-7eba-4a1a-9341-4e4da5eb8848-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"361c95c5-7eba-4a1a-9341-4e4da5eb8848\") " pod="openstack/nova-scheduler-0" Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.483596 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcb5l\" (UniqueName: \"kubernetes.io/projected/361c95c5-7eba-4a1a-9341-4e4da5eb8848-kube-api-access-hcb5l\") pod \"nova-scheduler-0\" (UID: \"361c95c5-7eba-4a1a-9341-4e4da5eb8848\") " pod="openstack/nova-scheduler-0" Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.483636 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/696c4507-7b7c-4b17-8428-0ade9ffb17ed-config-data\") pod \"nova-metadata-0\" (UID: \"696c4507-7b7c-4b17-8428-0ade9ffb17ed\") " pod="openstack/nova-metadata-0" Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.483706 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/696c4507-7b7c-4b17-8428-0ade9ffb17ed-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"696c4507-7b7c-4b17-8428-0ade9ffb17ed\") " pod="openstack/nova-metadata-0" Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.483740 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vlvf\" (UniqueName: \"kubernetes.io/projected/696c4507-7b7c-4b17-8428-0ade9ffb17ed-kube-api-access-4vlvf\") pod \"nova-metadata-0\" (UID: \"696c4507-7b7c-4b17-8428-0ade9ffb17ed\") " pod="openstack/nova-metadata-0" Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.483766 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/696c4507-7b7c-4b17-8428-0ade9ffb17ed-logs\") pod \"nova-metadata-0\" (UID: \"696c4507-7b7c-4b17-8428-0ade9ffb17ed\") " pod="openstack/nova-metadata-0" Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.483827 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/696c4507-7b7c-4b17-8428-0ade9ffb17ed-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"696c4507-7b7c-4b17-8428-0ade9ffb17ed\") " pod="openstack/nova-metadata-0" Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.483875 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/361c95c5-7eba-4a1a-9341-4e4da5eb8848-config-data\") pod \"nova-scheduler-0\" (UID: \"361c95c5-7eba-4a1a-9341-4e4da5eb8848\") " pod="openstack/nova-scheduler-0" Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.486968 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/696c4507-7b7c-4b17-8428-0ade9ffb17ed-logs\") pod \"nova-metadata-0\" (UID: \"696c4507-7b7c-4b17-8428-0ade9ffb17ed\") " pod="openstack/nova-metadata-0" Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.487438 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/361c95c5-7eba-4a1a-9341-4e4da5eb8848-config-data\") pod \"nova-scheduler-0\" (UID: \"361c95c5-7eba-4a1a-9341-4e4da5eb8848\") " pod="openstack/nova-scheduler-0" Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.492188 4922 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/696c4507-7b7c-4b17-8428-0ade9ffb17ed-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"696c4507-7b7c-4b17-8428-0ade9ffb17ed\") " pod="openstack/nova-metadata-0" Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.495473 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/696c4507-7b7c-4b17-8428-0ade9ffb17ed-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"696c4507-7b7c-4b17-8428-0ade9ffb17ed\") " pod="openstack/nova-metadata-0" Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.506430 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/696c4507-7b7c-4b17-8428-0ade9ffb17ed-config-data\") pod \"nova-metadata-0\" (UID: \"696c4507-7b7c-4b17-8428-0ade9ffb17ed\") " pod="openstack/nova-metadata-0" Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.506938 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361c95c5-7eba-4a1a-9341-4e4da5eb8848-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"361c95c5-7eba-4a1a-9341-4e4da5eb8848\") " pod="openstack/nova-scheduler-0" Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.515225 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcb5l\" (UniqueName: \"kubernetes.io/projected/361c95c5-7eba-4a1a-9341-4e4da5eb8848-kube-api-access-hcb5l\") pod \"nova-scheduler-0\" (UID: \"361c95c5-7eba-4a1a-9341-4e4da5eb8848\") " pod="openstack/nova-scheduler-0" Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.515288 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vlvf\" (UniqueName: \"kubernetes.io/projected/696c4507-7b7c-4b17-8428-0ade9ffb17ed-kube-api-access-4vlvf\") pod \"nova-metadata-0\" (UID: \"696c4507-7b7c-4b17-8428-0ade9ffb17ed\") " pod="openstack/nova-metadata-0" Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.678582 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 03:11:55 crc kubenswrapper[4922]: I1122 03:11:55.681801 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 22 03:11:56 crc kubenswrapper[4922]: W1122 03:11:56.178640 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod361c95c5_7eba_4a1a_9341_4e4da5eb8848.slice/crio-d463e2736be649724d39b1fb715dbfc729a5b241390c1ee4030c5fb8c42e83d6 WatchSource:0}: Error finding container d463e2736be649724d39b1fb715dbfc729a5b241390c1ee4030c5fb8c42e83d6: Status 404 returned error can't find the container with id d463e2736be649724d39b1fb715dbfc729a5b241390c1ee4030c5fb8c42e83d6 Nov 22 03:11:56 crc kubenswrapper[4922]: I1122 03:11:56.180820 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 03:11:56 crc kubenswrapper[4922]: I1122 03:11:56.224453 4922 generic.go:334] "Generic (PLEG): container finished" podID="255e57e6-0d59-4288-9070-373e6ce3d77c" containerID="a51596700e921c5bf64f9df1f6bec6eaf8aa3514a2ec071a771ec3ee7801dd91" exitCode=0 Nov 22 03:11:56 crc kubenswrapper[4922]: I1122 03:11:56.224515 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-q9xlg" event={"ID":"255e57e6-0d59-4288-9070-373e6ce3d77c","Type":"ContainerDied","Data":"a51596700e921c5bf64f9df1f6bec6eaf8aa3514a2ec071a771ec3ee7801dd91"} Nov 22 03:11:56 crc kubenswrapper[4922]: I1122 03:11:56.235170 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"361c95c5-7eba-4a1a-9341-4e4da5eb8848","Type":"ContainerStarted","Data":"d463e2736be649724d39b1fb715dbfc729a5b241390c1ee4030c5fb8c42e83d6"} Nov 22 03:11:56 crc kubenswrapper[4922]: I1122 03:11:56.245532 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 03:11:56 crc kubenswrapper[4922]: W1122 03:11:56.247150 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod696c4507_7b7c_4b17_8428_0ade9ffb17ed.slice/crio-dbf589d0852d4e8495ae2976c778d12088adcac341536f267173d798e7dbf815 WatchSource:0}: Error finding container dbf589d0852d4e8495ae2976c778d12088adcac341536f267173d798e7dbf815: Status 404 returned error can't find the container with id dbf589d0852d4e8495ae2976c778d12088adcac341536f267173d798e7dbf815 Nov 22 03:11:57 crc kubenswrapper[4922]: I1122 03:11:57.261876 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"696c4507-7b7c-4b17-8428-0ade9ffb17ed","Type":"ContainerStarted","Data":"d77fbf6725f3eaae7d31fc9019b32533f293e1dd190defb5563c01a36d37687c"} Nov 22 03:11:57 crc kubenswrapper[4922]: I1122 03:11:57.264105 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"696c4507-7b7c-4b17-8428-0ade9ffb17ed","Type":"ContainerStarted","Data":"c8dc6ebdde5299e8b9ae6461cb7344bc7a2f34f328a5df33c4d13432931e9ed5"} Nov 22 03:11:57 crc kubenswrapper[4922]: I1122 03:11:57.264152 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"696c4507-7b7c-4b17-8428-0ade9ffb17ed","Type":"ContainerStarted","Data":"dbf589d0852d4e8495ae2976c778d12088adcac341536f267173d798e7dbf815"} Nov 22 03:11:57 crc kubenswrapper[4922]: I1122 03:11:57.269282 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"361c95c5-7eba-4a1a-9341-4e4da5eb8848","Type":"ContainerStarted","Data":"ab9b717fe37a855875b201e1c34e6d3706553b19c461bef0244fce0ec1b35ce7"} Nov 22 03:11:57 crc 
kubenswrapper[4922]: I1122 03:11:57.293474 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.293441207 podStartE2EDuration="2.293441207s" podCreationTimestamp="2025-11-22 03:11:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:11:57.2893831 +0000 UTC m=+1153.327905022" watchObservedRunningTime="2025-11-22 03:11:57.293441207 +0000 UTC m=+1153.331963139" Nov 22 03:11:57 crc kubenswrapper[4922]: I1122 03:11:57.320827 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.320780212 podStartE2EDuration="2.320780212s" podCreationTimestamp="2025-11-22 03:11:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:11:57.314180893 +0000 UTC m=+1153.352702795" watchObservedRunningTime="2025-11-22 03:11:57.320780212 +0000 UTC m=+1153.359302114" Nov 22 03:11:57 crc kubenswrapper[4922]: I1122 03:11:57.321424 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07c814dd-d5e4-4d19-9843-2092f75e6b1e" path="/var/lib/kubelet/pods/07c814dd-d5e4-4d19-9843-2092f75e6b1e/volumes" Nov 22 03:11:57 crc kubenswrapper[4922]: I1122 03:11:57.671011 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-q9xlg" Nov 22 03:11:57 crc kubenswrapper[4922]: I1122 03:11:57.733181 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/255e57e6-0d59-4288-9070-373e6ce3d77c-scripts\") pod \"255e57e6-0d59-4288-9070-373e6ce3d77c\" (UID: \"255e57e6-0d59-4288-9070-373e6ce3d77c\") " Nov 22 03:11:57 crc kubenswrapper[4922]: I1122 03:11:57.733364 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/255e57e6-0d59-4288-9070-373e6ce3d77c-combined-ca-bundle\") pod \"255e57e6-0d59-4288-9070-373e6ce3d77c\" (UID: \"255e57e6-0d59-4288-9070-373e6ce3d77c\") " Nov 22 03:11:57 crc kubenswrapper[4922]: I1122 03:11:57.733520 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/255e57e6-0d59-4288-9070-373e6ce3d77c-config-data\") pod \"255e57e6-0d59-4288-9070-373e6ce3d77c\" (UID: \"255e57e6-0d59-4288-9070-373e6ce3d77c\") " Nov 22 03:11:57 crc kubenswrapper[4922]: I1122 03:11:57.733589 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gn9hq\" (UniqueName: \"kubernetes.io/projected/255e57e6-0d59-4288-9070-373e6ce3d77c-kube-api-access-gn9hq\") pod \"255e57e6-0d59-4288-9070-373e6ce3d77c\" (UID: \"255e57e6-0d59-4288-9070-373e6ce3d77c\") " Nov 22 03:11:57 crc kubenswrapper[4922]: I1122 03:11:57.740247 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/255e57e6-0d59-4288-9070-373e6ce3d77c-scripts" (OuterVolumeSpecName: "scripts") pod "255e57e6-0d59-4288-9070-373e6ce3d77c" (UID: "255e57e6-0d59-4288-9070-373e6ce3d77c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:57 crc kubenswrapper[4922]: I1122 03:11:57.740904 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/255e57e6-0d59-4288-9070-373e6ce3d77c-kube-api-access-gn9hq" (OuterVolumeSpecName: "kube-api-access-gn9hq") pod "255e57e6-0d59-4288-9070-373e6ce3d77c" (UID: "255e57e6-0d59-4288-9070-373e6ce3d77c"). InnerVolumeSpecName "kube-api-access-gn9hq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:11:57 crc kubenswrapper[4922]: E1122 03:11:57.774677 4922 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/255e57e6-0d59-4288-9070-373e6ce3d77c-config-data podName:255e57e6-0d59-4288-9070-373e6ce3d77c nodeName:}" failed. No retries permitted until 2025-11-22 03:11:58.274645041 +0000 UTC m=+1154.313166943 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "config-data" (UniqueName: "kubernetes.io/secret/255e57e6-0d59-4288-9070-373e6ce3d77c-config-data") pod "255e57e6-0d59-4288-9070-373e6ce3d77c" (UID: "255e57e6-0d59-4288-9070-373e6ce3d77c") : error deleting /var/lib/kubelet/pods/255e57e6-0d59-4288-9070-373e6ce3d77c/volume-subpaths: remove /var/lib/kubelet/pods/255e57e6-0d59-4288-9070-373e6ce3d77c/volume-subpaths: no such file or directory Nov 22 03:11:57 crc kubenswrapper[4922]: I1122 03:11:57.778281 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/255e57e6-0d59-4288-9070-373e6ce3d77c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "255e57e6-0d59-4288-9070-373e6ce3d77c" (UID: "255e57e6-0d59-4288-9070-373e6ce3d77c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:57 crc kubenswrapper[4922]: I1122 03:11:57.843839 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/255e57e6-0d59-4288-9070-373e6ce3d77c-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:57 crc kubenswrapper[4922]: I1122 03:11:57.843891 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/255e57e6-0d59-4288-9070-373e6ce3d77c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:57 crc kubenswrapper[4922]: I1122 03:11:57.843905 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gn9hq\" (UniqueName: \"kubernetes.io/projected/255e57e6-0d59-4288-9070-373e6ce3d77c-kube-api-access-gn9hq\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.079417 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.170496 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.251370 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6rgx\" (UniqueName: \"kubernetes.io/projected/c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3-kube-api-access-r6rgx\") pod \"c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3\" (UID: \"c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3\") " Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.251551 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3-config-data\") pod \"c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3\" (UID: \"c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3\") " Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.251598 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3-combined-ca-bundle\") pod \"c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3\" (UID: \"c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3\") " Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.251668 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3-logs\") pod \"c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3\" (UID: \"c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3\") " Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.252492 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3-logs" (OuterVolumeSpecName: "logs") pod "c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3" (UID: "c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.271797 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3-kube-api-access-r6rgx" (OuterVolumeSpecName: "kube-api-access-r6rgx") pod "c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3" (UID: "c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3"). InnerVolumeSpecName "kube-api-access-r6rgx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.308131 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-q9xlg" event={"ID":"255e57e6-0d59-4288-9070-373e6ce3d77c","Type":"ContainerDied","Data":"f96937392a675156ff1816241ca0ad89c3667c2b434cb27996d53a13a56f6630"} Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.308170 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f96937392a675156ff1816241ca0ad89c3667c2b434cb27996d53a13a56f6630" Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.308233 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-q9xlg" Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.311175 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3" (UID: "c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.348150 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3-config-data" (OuterVolumeSpecName: "config-data") pod "c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3" (UID: "c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.351297 4922 generic.go:334] "Generic (PLEG): container finished" podID="c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3" containerID="63f36b09488e80a426974c06f4d75c5c4350bdf9ba2db9cae6d2f0cd428e35b3" exitCode=0 Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.352114 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.356619 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3","Type":"ContainerDied","Data":"63f36b09488e80a426974c06f4d75c5c4350bdf9ba2db9cae6d2f0cd428e35b3"} Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.356666 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3","Type":"ContainerDied","Data":"613ceb0774fe10d43cdb1db486269d4b0f2d1b2b0d125c9023f8ccac07985142"} Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.356686 4922 scope.go:117] "RemoveContainer" containerID="63f36b09488e80a426974c06f4d75c5c4350bdf9ba2db9cae6d2f0cd428e35b3" Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.357264 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/255e57e6-0d59-4288-9070-373e6ce3d77c-config-data\") pod \"255e57e6-0d59-4288-9070-373e6ce3d77c\" (UID: \"255e57e6-0d59-4288-9070-373e6ce3d77c\") " Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.357922 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6rgx\" (UniqueName: \"kubernetes.io/projected/c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3-kube-api-access-r6rgx\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.357951 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.357965 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.357979 4922 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3-logs\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.369913 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 22 03:11:58 crc kubenswrapper[4922]: E1122 03:11:58.370278 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3" containerName="nova-api-api" Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.370293 4922 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3" containerName="nova-api-api" Nov 22 03:11:58 crc kubenswrapper[4922]: E1122 03:11:58.370307 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3" containerName="nova-api-log" Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.370315 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3" containerName="nova-api-log" Nov 22 03:11:58 crc kubenswrapper[4922]: E1122 03:11:58.370351 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="255e57e6-0d59-4288-9070-373e6ce3d77c" containerName="nova-cell1-conductor-db-sync" Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.370357 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="255e57e6-0d59-4288-9070-373e6ce3d77c" containerName="nova-cell1-conductor-db-sync" Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.370511 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="255e57e6-0d59-4288-9070-373e6ce3d77c" containerName="nova-cell1-conductor-db-sync" Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.370525 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3" containerName="nova-api-api" Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.370536 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3" containerName="nova-api-log" Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.371127 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.373830 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/255e57e6-0d59-4288-9070-373e6ce3d77c-config-data" (OuterVolumeSpecName: "config-data") pod "255e57e6-0d59-4288-9070-373e6ce3d77c" (UID: "255e57e6-0d59-4288-9070-373e6ce3d77c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.388784 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.423579 4922 scope.go:117] "RemoveContainer" containerID="edf1f45c4c7a39d56c112459a60be6eb8a9405e7a13c239d23e58e2a92d6b9dd" Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.437932 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.451384 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.460128 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxqwv\" (UniqueName: \"kubernetes.io/projected/18b44985-01b3-4806-9cc0-bec502d417e4-kube-api-access-lxqwv\") pod \"nova-cell1-conductor-0\" (UID: \"18b44985-01b3-4806-9cc0-bec502d417e4\") " pod="openstack/nova-cell1-conductor-0" Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.460465 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18b44985-01b3-4806-9cc0-bec502d417e4-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"18b44985-01b3-4806-9cc0-bec502d417e4\") " pod="openstack/nova-cell1-conductor-0" Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.460496 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18b44985-01b3-4806-9cc0-bec502d417e4-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"18b44985-01b3-4806-9cc0-bec502d417e4\") " pod="openstack/nova-cell1-conductor-0" Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.460573 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/255e57e6-0d59-4288-9070-373e6ce3d77c-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.466787 4922 scope.go:117] "RemoveContainer" containerID="63f36b09488e80a426974c06f4d75c5c4350bdf9ba2db9cae6d2f0cd428e35b3" Nov 22 03:11:58 crc kubenswrapper[4922]: E1122 03:11:58.467332 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63f36b09488e80a426974c06f4d75c5c4350bdf9ba2db9cae6d2f0cd428e35b3\": container with ID starting with 63f36b09488e80a426974c06f4d75c5c4350bdf9ba2db9cae6d2f0cd428e35b3 not found: ID does not exist" containerID="63f36b09488e80a426974c06f4d75c5c4350bdf9ba2db9cae6d2f0cd428e35b3" Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.467372 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63f36b09488e80a426974c06f4d75c5c4350bdf9ba2db9cae6d2f0cd428e35b3"} err="failed to get container status \"63f36b09488e80a426974c06f4d75c5c4350bdf9ba2db9cae6d2f0cd428e35b3\": rpc error: code = NotFound desc = could not find container \"63f36b09488e80a426974c06f4d75c5c4350bdf9ba2db9cae6d2f0cd428e35b3\": container with ID starting with 63f36b09488e80a426974c06f4d75c5c4350bdf9ba2db9cae6d2f0cd428e35b3 not found: ID does not exist" Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.467403 4922 scope.go:117] "RemoveContainer" containerID="edf1f45c4c7a39d56c112459a60be6eb8a9405e7a13c239d23e58e2a92d6b9dd" 
Nov 22 03:11:58 crc kubenswrapper[4922]: E1122 03:11:58.467793 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edf1f45c4c7a39d56c112459a60be6eb8a9405e7a13c239d23e58e2a92d6b9dd\": container with ID starting with edf1f45c4c7a39d56c112459a60be6eb8a9405e7a13c239d23e58e2a92d6b9dd not found: ID does not exist" containerID="edf1f45c4c7a39d56c112459a60be6eb8a9405e7a13c239d23e58e2a92d6b9dd" Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.467825 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edf1f45c4c7a39d56c112459a60be6eb8a9405e7a13c239d23e58e2a92d6b9dd"} err="failed to get container status \"edf1f45c4c7a39d56c112459a60be6eb8a9405e7a13c239d23e58e2a92d6b9dd\": rpc error: code = NotFound desc = could not find container \"edf1f45c4c7a39d56c112459a60be6eb8a9405e7a13c239d23e58e2a92d6b9dd\": container with ID starting with edf1f45c4c7a39d56c112459a60be6eb8a9405e7a13c239d23e58e2a92d6b9dd not found: ID does not exist" Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.469570 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.471237 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.475202 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.478678 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.561995 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/490ccaff-a6f3-4e90-853f-74530b743392-config-data\") pod \"nova-api-0\" (UID: \"490ccaff-a6f3-4e90-853f-74530b743392\") " pod="openstack/nova-api-0" Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.562075 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18b44985-01b3-4806-9cc0-bec502d417e4-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"18b44985-01b3-4806-9cc0-bec502d417e4\") " pod="openstack/nova-cell1-conductor-0" Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.562109 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18b44985-01b3-4806-9cc0-bec502d417e4-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"18b44985-01b3-4806-9cc0-bec502d417e4\") " pod="openstack/nova-cell1-conductor-0" Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.562154 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/490ccaff-a6f3-4e90-853f-74530b743392-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"490ccaff-a6f3-4e90-853f-74530b743392\") " pod="openstack/nova-api-0" Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.562199 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/490ccaff-a6f3-4e90-853f-74530b743392-logs\") pod \"nova-api-0\" (UID: \"490ccaff-a6f3-4e90-853f-74530b743392\") " pod="openstack/nova-api-0" Nov 22 03:11:58 crc 
kubenswrapper[4922]: I1122 03:11:58.562296 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnghc\" (UniqueName: \"kubernetes.io/projected/490ccaff-a6f3-4e90-853f-74530b743392-kube-api-access-pnghc\") pod \"nova-api-0\" (UID: \"490ccaff-a6f3-4e90-853f-74530b743392\") " pod="openstack/nova-api-0"
Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.562366 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxqwv\" (UniqueName: \"kubernetes.io/projected/18b44985-01b3-4806-9cc0-bec502d417e4-kube-api-access-lxqwv\") pod \"nova-cell1-conductor-0\" (UID: \"18b44985-01b3-4806-9cc0-bec502d417e4\") " pod="openstack/nova-cell1-conductor-0"
Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.566514 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18b44985-01b3-4806-9cc0-bec502d417e4-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"18b44985-01b3-4806-9cc0-bec502d417e4\") " pod="openstack/nova-cell1-conductor-0"
Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.566665 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18b44985-01b3-4806-9cc0-bec502d417e4-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"18b44985-01b3-4806-9cc0-bec502d417e4\") " pod="openstack/nova-cell1-conductor-0"
Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.579479 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxqwv\" (UniqueName: \"kubernetes.io/projected/18b44985-01b3-4806-9cc0-bec502d417e4-kube-api-access-lxqwv\") pod \"nova-cell1-conductor-0\" (UID: \"18b44985-01b3-4806-9cc0-bec502d417e4\") " pod="openstack/nova-cell1-conductor-0"
Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.663373 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/490ccaff-a6f3-4e90-853f-74530b743392-config-data\") pod \"nova-api-0\" (UID: \"490ccaff-a6f3-4e90-853f-74530b743392\") " pod="openstack/nova-api-0"
Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.663606 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/490ccaff-a6f3-4e90-853f-74530b743392-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"490ccaff-a6f3-4e90-853f-74530b743392\") " pod="openstack/nova-api-0"
Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.663697 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/490ccaff-a6f3-4e90-853f-74530b743392-logs\") pod \"nova-api-0\" (UID: \"490ccaff-a6f3-4e90-853f-74530b743392\") " pod="openstack/nova-api-0"
Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.663820 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnghc\" (UniqueName: \"kubernetes.io/projected/490ccaff-a6f3-4e90-853f-74530b743392-kube-api-access-pnghc\") pod \"nova-api-0\" (UID: \"490ccaff-a6f3-4e90-853f-74530b743392\") " pod="openstack/nova-api-0"
Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.664122 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/490ccaff-a6f3-4e90-853f-74530b743392-logs\") pod \"nova-api-0\" (UID: \"490ccaff-a6f3-4e90-853f-74530b743392\") " pod="openstack/nova-api-0"
Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.667758 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/490ccaff-a6f3-4e90-853f-74530b743392-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"490ccaff-a6f3-4e90-853f-74530b743392\") " pod="openstack/nova-api-0"
Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.668290 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/490ccaff-a6f3-4e90-853f-74530b743392-config-data\") pod \"nova-api-0\" (UID: \"490ccaff-a6f3-4e90-853f-74530b743392\") " pod="openstack/nova-api-0"
Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.680239 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnghc\" (UniqueName: \"kubernetes.io/projected/490ccaff-a6f3-4e90-853f-74530b743392-kube-api-access-pnghc\") pod \"nova-api-0\" (UID: \"490ccaff-a6f3-4e90-853f-74530b743392\") " pod="openstack/nova-api-0"
Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.725253 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Nov 22 03:11:58 crc kubenswrapper[4922]: I1122 03:11:58.795982 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Nov 22 03:11:59 crc kubenswrapper[4922]: I1122 03:11:59.188303 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Nov 22 03:11:59 crc kubenswrapper[4922]: W1122 03:11:59.190569 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18b44985_01b3_4806_9cc0_bec502d417e4.slice/crio-d48cd923c81381bcc906e9f89e33d27d5782e15bf9a7cc3de14dcc3331d277e4 WatchSource:0}: Error finding container d48cd923c81381bcc906e9f89e33d27d5782e15bf9a7cc3de14dcc3331d277e4: Status 404 returned error can't find the container with id d48cd923c81381bcc906e9f89e33d27d5782e15bf9a7cc3de14dcc3331d277e4
Nov 22 03:11:59 crc kubenswrapper[4922]: I1122 03:11:59.278367 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Nov 22 03:11:59 crc kubenswrapper[4922]: I1122 03:11:59.312794 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3" path="/var/lib/kubelet/pods/c778b5e0-f576-49cc-b9fb-a21b4ce0e2c3/volumes"
Nov 22 03:11:59 crc kubenswrapper[4922]: I1122 03:11:59.359668 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"490ccaff-a6f3-4e90-853f-74530b743392","Type":"ContainerStarted","Data":"51bdc21846a8915677eba898142c7324f0492e87a54eca721bc964748d1b45b7"}
Nov 22 03:11:59 crc kubenswrapper[4922]: I1122 03:11:59.361069 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"18b44985-01b3-4806-9cc0-bec502d417e4","Type":"ContainerStarted","Data":"d48cd923c81381bcc906e9f89e33d27d5782e15bf9a7cc3de14dcc3331d277e4"}
Nov 22 03:12:00 crc kubenswrapper[4922]: I1122 03:12:00.345120 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Nov 22 03:12:00 crc kubenswrapper[4922]: I1122 03:12:00.345391 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="1ee5b2bc-9aaf-4bb7-b5b2-f4f14c5e004b" containerName="kube-state-metrics" containerID="cri-o://5cd10f8f314ce908c68e99a0f82748187d57575bebad57e630ae6b1b18ff8c4e" gracePeriod=30
Nov 22 03:12:00 crc kubenswrapper[4922]: I1122 03:12:00.371823 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"18b44985-01b3-4806-9cc0-bec502d417e4","Type":"ContainerStarted","Data":"29068ab7b020f7bfb6a70eab9ba7df2e389df45b9d7d27d34d34022825d8cf0c"}
Nov 22 03:12:00 crc kubenswrapper[4922]: I1122 03:12:00.679668 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Nov 22 03:12:00 crc kubenswrapper[4922]: I1122 03:12:00.680135 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Nov 22 03:12:00 crc kubenswrapper[4922]: I1122 03:12:00.682765 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Nov 22 03:12:01 crc kubenswrapper[4922]: I1122 03:12:01.330443 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Nov 22 03:12:01 crc kubenswrapper[4922]: I1122 03:12:01.330942 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="07379147-ac76-4a20-bcdc-dc13fceaaabc" containerName="ceilometer-central-agent" containerID="cri-o://35bc425ba87ca523eb68fd449264576ca529b7063a2662d0cceea92c76b0a9d9" gracePeriod=30
Nov 22 03:12:01 crc kubenswrapper[4922]: I1122 03:12:01.331065 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="07379147-ac76-4a20-bcdc-dc13fceaaabc" containerName="proxy-httpd" containerID="cri-o://c01af40e744cf71456441bd5823419e79045d9e7d1127d39435a58abece52700" gracePeriod=30
Nov 22 03:12:01 crc kubenswrapper[4922]: I1122 03:12:01.331103 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="07379147-ac76-4a20-bcdc-dc13fceaaabc" containerName="sg-core" containerID="cri-o://2e5495a2cec9e3011bb6865e8f88624777bd0b56e6edf7452eab3590d846420b" gracePeriod=30
Nov 22 03:12:01 crc kubenswrapper[4922]: I1122 03:12:01.331135 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="07379147-ac76-4a20-bcdc-dc13fceaaabc" containerName="ceilometer-notification-agent" containerID="cri-o://788589c18f9d24d3d0050d80f0262085550508caa9bdfde6126b1dea8af83c4c" gracePeriod=30
Nov 22 03:12:01 crc kubenswrapper[4922]: I1122 03:12:01.356824 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Nov 22 03:12:01 crc kubenswrapper[4922]: I1122 03:12:01.381060 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"490ccaff-a6f3-4e90-853f-74530b743392","Type":"ContainerStarted","Data":"6cbf2331a59d6d5c3ab41fa885d2509280ebdbe670b1d6ed48bef969d28b1503"}
Nov 22 03:12:01 crc kubenswrapper[4922]: I1122 03:12:01.381101 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"490ccaff-a6f3-4e90-853f-74530b743392","Type":"ContainerStarted","Data":"be2921767a053915b15ff3badb022730be521c19eb1a3db9a09c59b82a9afd60"}
Nov 22 03:12:01 crc kubenswrapper[4922]: I1122 03:12:01.382761 4922 generic.go:334] "Generic (PLEG): container finished" podID="1ee5b2bc-9aaf-4bb7-b5b2-f4f14c5e004b" containerID="5cd10f8f314ce908c68e99a0f82748187d57575bebad57e630ae6b1b18ff8c4e" exitCode=2
Nov 22 03:12:01 crc kubenswrapper[4922]: I1122 03:12:01.383314 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Nov 22 03:12:01 crc kubenswrapper[4922]: I1122 03:12:01.383454 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1ee5b2bc-9aaf-4bb7-b5b2-f4f14c5e004b","Type":"ContainerDied","Data":"5cd10f8f314ce908c68e99a0f82748187d57575bebad57e630ae6b1b18ff8c4e"}
Nov 22 03:12:01 crc kubenswrapper[4922]: I1122 03:12:01.383477 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1ee5b2bc-9aaf-4bb7-b5b2-f4f14c5e004b","Type":"ContainerDied","Data":"2b88195d9184ae974eeed55c465caf1f27c2b46473d626a13a5dbd1c1b28e66d"}
Nov 22 03:12:01 crc kubenswrapper[4922]: I1122 03:12:01.383492 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Nov 22 03:12:01 crc kubenswrapper[4922]: I1122 03:12:01.383510 4922 scope.go:117] "RemoveContainer" containerID="5cd10f8f314ce908c68e99a0f82748187d57575bebad57e630ae6b1b18ff8c4e"
Nov 22 03:12:01 crc kubenswrapper[4922]: I1122 03:12:01.408056 4922 scope.go:117] "RemoveContainer" containerID="5cd10f8f314ce908c68e99a0f82748187d57575bebad57e630ae6b1b18ff8c4e"
Nov 22 03:12:01 crc kubenswrapper[4922]: I1122 03:12:01.409210 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.409191725 podStartE2EDuration="3.409191725s" podCreationTimestamp="2025-11-22 03:11:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:12:01.401606973 +0000 UTC m=+1157.440128885" watchObservedRunningTime="2025-11-22 03:12:01.409191725 +0000 UTC m=+1157.447713617"
Nov 22 03:12:01 crc kubenswrapper[4922]: E1122 03:12:01.410227 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cd10f8f314ce908c68e99a0f82748187d57575bebad57e630ae6b1b18ff8c4e\": container with ID starting with 5cd10f8f314ce908c68e99a0f82748187d57575bebad57e630ae6b1b18ff8c4e not found: ID does not exist" containerID="5cd10f8f314ce908c68e99a0f82748187d57575bebad57e630ae6b1b18ff8c4e"
Nov 22 03:12:01 crc kubenswrapper[4922]: I1122 03:12:01.410280 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cd10f8f314ce908c68e99a0f82748187d57575bebad57e630ae6b1b18ff8c4e"} err="failed to get container status \"5cd10f8f314ce908c68e99a0f82748187d57575bebad57e630ae6b1b18ff8c4e\": rpc error: code = NotFound desc = could not find container \"5cd10f8f314ce908c68e99a0f82748187d57575bebad57e630ae6b1b18ff8c4e\": container with ID starting with 5cd10f8f314ce908c68e99a0f82748187d57575bebad57e630ae6b1b18ff8c4e not found: ID does not exist"
Nov 22 03:12:01 crc kubenswrapper[4922]: I1122 03:12:01.425189 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=3.425150708 podStartE2EDuration="3.425150708s" podCreationTimestamp="2025-11-22 03:11:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:12:01.421555741 +0000 UTC m=+1157.460077653" watchObservedRunningTime="2025-11-22 03:12:01.425150708 +0000 UTC m=+1157.463672600"
Nov 22 03:12:01 crc kubenswrapper[4922]: I1122 03:12:01.515794 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwrr6\" (UniqueName: \"kubernetes.io/projected/1ee5b2bc-9aaf-4bb7-b5b2-f4f14c5e004b-kube-api-access-nwrr6\") pod \"1ee5b2bc-9aaf-4bb7-b5b2-f4f14c5e004b\" (UID: \"1ee5b2bc-9aaf-4bb7-b5b2-f4f14c5e004b\") "
Nov 22 03:12:01 crc kubenswrapper[4922]: I1122 03:12:01.522609 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ee5b2bc-9aaf-4bb7-b5b2-f4f14c5e004b-kube-api-access-nwrr6" (OuterVolumeSpecName: "kube-api-access-nwrr6") pod "1ee5b2bc-9aaf-4bb7-b5b2-f4f14c5e004b" (UID: "1ee5b2bc-9aaf-4bb7-b5b2-f4f14c5e004b"). InnerVolumeSpecName "kube-api-access-nwrr6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 03:12:01 crc kubenswrapper[4922]: I1122 03:12:01.618436 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwrr6\" (UniqueName: \"kubernetes.io/projected/1ee5b2bc-9aaf-4bb7-b5b2-f4f14c5e004b-kube-api-access-nwrr6\") on node \"crc\" DevicePath \"\""
Nov 22 03:12:01 crc kubenswrapper[4922]: I1122 03:12:01.717157 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Nov 22 03:12:01 crc kubenswrapper[4922]: I1122 03:12:01.727872 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"]
Nov 22 03:12:01 crc kubenswrapper[4922]: I1122 03:12:01.736056 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Nov 22 03:12:01 crc kubenswrapper[4922]: E1122 03:12:01.736584 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ee5b2bc-9aaf-4bb7-b5b2-f4f14c5e004b" containerName="kube-state-metrics"
Nov 22 03:12:01 crc kubenswrapper[4922]: I1122 03:12:01.736607 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ee5b2bc-9aaf-4bb7-b5b2-f4f14c5e004b" containerName="kube-state-metrics"
Nov 22 03:12:01 crc kubenswrapper[4922]: I1122 03:12:01.736872 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ee5b2bc-9aaf-4bb7-b5b2-f4f14c5e004b" containerName="kube-state-metrics"
Nov 22 03:12:01 crc kubenswrapper[4922]: I1122 03:12:01.737669 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Nov 22 03:12:01 crc kubenswrapper[4922]: I1122 03:12:01.739427 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc"
Nov 22 03:12:01 crc kubenswrapper[4922]: I1122 03:12:01.740454 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config"
Nov 22 03:12:01 crc kubenswrapper[4922]: I1122 03:12:01.750267 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Nov 22 03:12:01 crc kubenswrapper[4922]: I1122 03:12:01.821963 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1e5684d-314d-4ad1-940c-96696265b505-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"a1e5684d-314d-4ad1-940c-96696265b505\") " pod="openstack/kube-state-metrics-0"
Nov 22 03:12:01 crc kubenswrapper[4922]: I1122 03:12:01.822055 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rt9qx\" (UniqueName: \"kubernetes.io/projected/a1e5684d-314d-4ad1-940c-96696265b505-kube-api-access-rt9qx\") pod \"kube-state-metrics-0\" (UID: \"a1e5684d-314d-4ad1-940c-96696265b505\") " pod="openstack/kube-state-metrics-0"
Nov 22 03:12:01 crc kubenswrapper[4922]: I1122 03:12:01.822111 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1e5684d-314d-4ad1-940c-96696265b505-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"a1e5684d-314d-4ad1-940c-96696265b505\") " pod="openstack/kube-state-metrics-0"
Nov 22 03:12:01 crc kubenswrapper[4922]: I1122 03:12:01.822130 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/a1e5684d-314d-4ad1-940c-96696265b505-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"a1e5684d-314d-4ad1-940c-96696265b505\") " pod="openstack/kube-state-metrics-0"
Nov 22 03:12:01 crc kubenswrapper[4922]: I1122 03:12:01.924305 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1e5684d-314d-4ad1-940c-96696265b505-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"a1e5684d-314d-4ad1-940c-96696265b505\") " pod="openstack/kube-state-metrics-0"
Nov 22 03:12:01 crc kubenswrapper[4922]: I1122 03:12:01.924894 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rt9qx\" (UniqueName: \"kubernetes.io/projected/a1e5684d-314d-4ad1-940c-96696265b505-kube-api-access-rt9qx\") pod \"kube-state-metrics-0\" (UID: \"a1e5684d-314d-4ad1-940c-96696265b505\") " pod="openstack/kube-state-metrics-0"
Nov 22 03:12:01 crc kubenswrapper[4922]: I1122 03:12:01.925277 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1e5684d-314d-4ad1-940c-96696265b505-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"a1e5684d-314d-4ad1-940c-96696265b505\") " pod="openstack/kube-state-metrics-0"
Nov 22 03:12:01 crc kubenswrapper[4922]: I1122 03:12:01.925690 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/a1e5684d-314d-4ad1-940c-96696265b505-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"a1e5684d-314d-4ad1-940c-96696265b505\") " pod="openstack/kube-state-metrics-0"
Nov 22 03:12:01 crc kubenswrapper[4922]: I1122 03:12:01.931010 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1e5684d-314d-4ad1-940c-96696265b505-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"a1e5684d-314d-4ad1-940c-96696265b505\") " pod="openstack/kube-state-metrics-0"
Nov 22 03:12:01 crc kubenswrapper[4922]: I1122 03:12:01.931091 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1e5684d-314d-4ad1-940c-96696265b505-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"a1e5684d-314d-4ad1-940c-96696265b505\") " pod="openstack/kube-state-metrics-0"
Nov 22 03:12:01 crc kubenswrapper[4922]: I1122 03:12:01.946538 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/a1e5684d-314d-4ad1-940c-96696265b505-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"a1e5684d-314d-4ad1-940c-96696265b505\") " pod="openstack/kube-state-metrics-0"
Nov 22 03:12:01 crc kubenswrapper[4922]: I1122 03:12:01.952345 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rt9qx\" (UniqueName: \"kubernetes.io/projected/a1e5684d-314d-4ad1-940c-96696265b505-kube-api-access-rt9qx\") pod \"kube-state-metrics-0\" (UID: \"a1e5684d-314d-4ad1-940c-96696265b505\") " pod="openstack/kube-state-metrics-0"
Nov 22 03:12:02 crc kubenswrapper[4922]: I1122 03:12:02.062614 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Nov 22 03:12:02 crc kubenswrapper[4922]: I1122 03:12:02.413754 4922 generic.go:334] "Generic (PLEG): container finished" podID="07379147-ac76-4a20-bcdc-dc13fceaaabc" containerID="c01af40e744cf71456441bd5823419e79045d9e7d1127d39435a58abece52700" exitCode=0
Nov 22 03:12:02 crc kubenswrapper[4922]: I1122 03:12:02.414960 4922 generic.go:334] "Generic (PLEG): container finished" podID="07379147-ac76-4a20-bcdc-dc13fceaaabc" containerID="2e5495a2cec9e3011bb6865e8f88624777bd0b56e6edf7452eab3590d846420b" exitCode=2
Nov 22 03:12:02 crc kubenswrapper[4922]: I1122 03:12:02.414983 4922 generic.go:334] "Generic (PLEG): container finished" podID="07379147-ac76-4a20-bcdc-dc13fceaaabc" containerID="35bc425ba87ca523eb68fd449264576ca529b7063a2662d0cceea92c76b0a9d9" exitCode=0
Nov 22 03:12:02 crc kubenswrapper[4922]: I1122 03:12:02.413859 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"07379147-ac76-4a20-bcdc-dc13fceaaabc","Type":"ContainerDied","Data":"c01af40e744cf71456441bd5823419e79045d9e7d1127d39435a58abece52700"}
Nov 22 03:12:02 crc kubenswrapper[4922]: I1122 03:12:02.415065 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"07379147-ac76-4a20-bcdc-dc13fceaaabc","Type":"ContainerDied","Data":"2e5495a2cec9e3011bb6865e8f88624777bd0b56e6edf7452eab3590d846420b"}
Nov 22 03:12:02 crc kubenswrapper[4922]: I1122 03:12:02.415103 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"07379147-ac76-4a20-bcdc-dc13fceaaabc","Type":"ContainerDied","Data":"35bc425ba87ca523eb68fd449264576ca529b7063a2662d0cceea92c76b0a9d9"}
Nov 22 03:12:02 crc kubenswrapper[4922]: I1122 03:12:02.606530 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Nov 22 03:12:03 crc kubenswrapper[4922]: I1122 03:12:03.314487 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ee5b2bc-9aaf-4bb7-b5b2-f4f14c5e004b" path="/var/lib/kubelet/pods/1ee5b2bc-9aaf-4bb7-b5b2-f4f14c5e004b/volumes"
Nov 22 03:12:03 crc kubenswrapper[4922]: I1122 03:12:03.424887 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a1e5684d-314d-4ad1-940c-96696265b505","Type":"ContainerStarted","Data":"d863e092600398a7ba8f9dea88e61f6ddd0c02d38e9c0c0869f17662398deca1"}
Nov 22 03:12:04 crc kubenswrapper[4922]: I1122 03:12:04.437953 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a1e5684d-314d-4ad1-940c-96696265b505","Type":"ContainerStarted","Data":"718d46b1f497cec7e0054f4c3c5ae08a889930b4db93636a38227b375f7af59a"}
Nov 22 03:12:04 crc kubenswrapper[4922]: I1122 03:12:04.438934 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Nov 22 03:12:04 crc kubenswrapper[4922]: I1122 03:12:04.453191 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.970333949 podStartE2EDuration="3.453163934s" podCreationTimestamp="2025-11-22 03:12:01 +0000 UTC" firstStartedPulling="2025-11-22 03:12:02.615639682 +0000 UTC m=+1158.654161584" lastFinishedPulling="2025-11-22 03:12:03.098469677 +0000 UTC m=+1159.136991569" observedRunningTime="2025-11-22 03:12:04.452200551 +0000 UTC m=+1160.490722483" watchObservedRunningTime="2025-11-22 03:12:04.453163934 +0000 UTC m=+1160.491685836"
Nov 22 03:12:05 crc kubenswrapper[4922]: I1122 03:12:05.679100 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Nov 22 03:12:05 crc kubenswrapper[4922]: I1122 03:12:05.679484 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Nov 22 03:12:05 crc kubenswrapper[4922]: I1122 03:12:05.682175 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Nov 22 03:12:05 crc kubenswrapper[4922]: I1122 03:12:05.720785 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Nov 22 03:12:06 crc kubenswrapper[4922]: I1122 03:12:06.464083 4922 generic.go:334] "Generic (PLEG): container finished" podID="07379147-ac76-4a20-bcdc-dc13fceaaabc" containerID="788589c18f9d24d3d0050d80f0262085550508caa9bdfde6126b1dea8af83c4c" exitCode=0
Nov 22 03:12:06 crc kubenswrapper[4922]: I1122 03:12:06.464299 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"07379147-ac76-4a20-bcdc-dc13fceaaabc","Type":"ContainerDied","Data":"788589c18f9d24d3d0050d80f0262085550508caa9bdfde6126b1dea8af83c4c"}
Nov 22 03:12:06 crc kubenswrapper[4922]: I1122 03:12:06.523474 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Nov 22 03:12:06 crc kubenswrapper[4922]: I1122 03:12:06.693058 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="696c4507-7b7c-4b17-8428-0ade9ffb17ed" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.177:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Nov 22 03:12:06 crc kubenswrapper[4922]: I1122 03:12:06.693087 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="696c4507-7b7c-4b17-8428-0ade9ffb17ed" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.177:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Nov 22 03:12:06 crc kubenswrapper[4922]: I1122 03:12:06.722244 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 22 03:12:06 crc kubenswrapper[4922]: I1122 03:12:06.922497 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6tlc\" (UniqueName: \"kubernetes.io/projected/07379147-ac76-4a20-bcdc-dc13fceaaabc-kube-api-access-w6tlc\") pod \"07379147-ac76-4a20-bcdc-dc13fceaaabc\" (UID: \"07379147-ac76-4a20-bcdc-dc13fceaaabc\") "
Nov 22 03:12:06 crc kubenswrapper[4922]: I1122 03:12:06.922618 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07379147-ac76-4a20-bcdc-dc13fceaaabc-combined-ca-bundle\") pod \"07379147-ac76-4a20-bcdc-dc13fceaaabc\" (UID: \"07379147-ac76-4a20-bcdc-dc13fceaaabc\") "
Nov 22 03:12:06 crc kubenswrapper[4922]: I1122 03:12:06.922664 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07379147-ac76-4a20-bcdc-dc13fceaaabc-log-httpd\") pod \"07379147-ac76-4a20-bcdc-dc13fceaaabc\" (UID: \"07379147-ac76-4a20-bcdc-dc13fceaaabc\") "
Nov 22 03:12:06 crc kubenswrapper[4922]: I1122 03:12:06.922742 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07379147-ac76-4a20-bcdc-dc13fceaaabc-config-data\") pod \"07379147-ac76-4a20-bcdc-dc13fceaaabc\" (UID: \"07379147-ac76-4a20-bcdc-dc13fceaaabc\") "
Nov 22 03:12:06 crc kubenswrapper[4922]: I1122 03:12:06.922795 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07379147-ac76-4a20-bcdc-dc13fceaaabc-scripts\") pod \"07379147-ac76-4a20-bcdc-dc13fceaaabc\" (UID: \"07379147-ac76-4a20-bcdc-dc13fceaaabc\") "
Nov 22 03:12:06 crc kubenswrapper[4922]: I1122 03:12:06.922831 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07379147-ac76-4a20-bcdc-dc13fceaaabc-run-httpd\") pod \"07379147-ac76-4a20-bcdc-dc13fceaaabc\" (UID: \"07379147-ac76-4a20-bcdc-dc13fceaaabc\") "
Nov 22 03:12:06 crc kubenswrapper[4922]: I1122 03:12:06.922874 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/07379147-ac76-4a20-bcdc-dc13fceaaabc-sg-core-conf-yaml\") pod \"07379147-ac76-4a20-bcdc-dc13fceaaabc\" (UID: \"07379147-ac76-4a20-bcdc-dc13fceaaabc\") "
Nov 22 03:12:06 crc kubenswrapper[4922]: I1122 03:12:06.925108 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07379147-ac76-4a20-bcdc-dc13fceaaabc-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "07379147-ac76-4a20-bcdc-dc13fceaaabc" (UID: "07379147-ac76-4a20-bcdc-dc13fceaaabc"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 03:12:06 crc kubenswrapper[4922]: I1122 03:12:06.925382 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07379147-ac76-4a20-bcdc-dc13fceaaabc-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "07379147-ac76-4a20-bcdc-dc13fceaaabc" (UID: "07379147-ac76-4a20-bcdc-dc13fceaaabc"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 03:12:06 crc kubenswrapper[4922]: I1122 03:12:06.929076 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07379147-ac76-4a20-bcdc-dc13fceaaabc-kube-api-access-w6tlc" (OuterVolumeSpecName: "kube-api-access-w6tlc") pod "07379147-ac76-4a20-bcdc-dc13fceaaabc" (UID: "07379147-ac76-4a20-bcdc-dc13fceaaabc"). InnerVolumeSpecName "kube-api-access-w6tlc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 03:12:06 crc kubenswrapper[4922]: I1122 03:12:06.929098 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07379147-ac76-4a20-bcdc-dc13fceaaabc-scripts" (OuterVolumeSpecName: "scripts") pod "07379147-ac76-4a20-bcdc-dc13fceaaabc" (UID: "07379147-ac76-4a20-bcdc-dc13fceaaabc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:12:06 crc kubenswrapper[4922]: I1122 03:12:06.962653 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07379147-ac76-4a20-bcdc-dc13fceaaabc-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "07379147-ac76-4a20-bcdc-dc13fceaaabc" (UID: "07379147-ac76-4a20-bcdc-dc13fceaaabc"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:12:07 crc kubenswrapper[4922]: I1122 03:12:07.035284 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07379147-ac76-4a20-bcdc-dc13fceaaabc-scripts\") on node \"crc\" DevicePath \"\""
Nov 22 03:12:07 crc kubenswrapper[4922]: I1122 03:12:07.035500 4922 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07379147-ac76-4a20-bcdc-dc13fceaaabc-run-httpd\") on node \"crc\" DevicePath \"\""
Nov 22 03:12:07 crc kubenswrapper[4922]: I1122 03:12:07.035571 4922 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/07379147-ac76-4a20-bcdc-dc13fceaaabc-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Nov 22 03:12:07 crc kubenswrapper[4922]: I1122 03:12:07.035634 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6tlc\" (UniqueName: \"kubernetes.io/projected/07379147-ac76-4a20-bcdc-dc13fceaaabc-kube-api-access-w6tlc\") on node \"crc\" DevicePath \"\""
Nov 22 03:12:07 crc kubenswrapper[4922]: I1122 03:12:07.035714 4922 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07379147-ac76-4a20-bcdc-dc13fceaaabc-log-httpd\") on node \"crc\" DevicePath \"\""
Nov 22 03:12:07 crc kubenswrapper[4922]: I1122 03:12:07.037011 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07379147-ac76-4a20-bcdc-dc13fceaaabc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "07379147-ac76-4a20-bcdc-dc13fceaaabc" (UID: "07379147-ac76-4a20-bcdc-dc13fceaaabc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:12:07 crc kubenswrapper[4922]: I1122 03:12:07.093595 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07379147-ac76-4a20-bcdc-dc13fceaaabc-config-data" (OuterVolumeSpecName: "config-data") pod "07379147-ac76-4a20-bcdc-dc13fceaaabc" (UID: "07379147-ac76-4a20-bcdc-dc13fceaaabc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:12:07 crc kubenswrapper[4922]: I1122 03:12:07.137216 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07379147-ac76-4a20-bcdc-dc13fceaaabc-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 03:12:07 crc kubenswrapper[4922]: I1122 03:12:07.137260 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07379147-ac76-4a20-bcdc-dc13fceaaabc-config-data\") on node \"crc\" DevicePath \"\""
Nov 22 03:12:07 crc kubenswrapper[4922]: I1122 03:12:07.477288 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"07379147-ac76-4a20-bcdc-dc13fceaaabc","Type":"ContainerDied","Data":"8a119571f5a89bd335b8d7b6704849ff683df123c4584a99e927a0208f612562"}
Nov 22 03:12:07 crc kubenswrapper[4922]: I1122 03:12:07.477367 4922 scope.go:117] "RemoveContainer" containerID="c01af40e744cf71456441bd5823419e79045d9e7d1127d39435a58abece52700"
Nov 22 03:12:07 crc kubenswrapper[4922]: I1122 03:12:07.477367 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 22 03:12:07 crc kubenswrapper[4922]: I1122 03:12:07.511910 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Nov 22 03:12:07 crc kubenswrapper[4922]: I1122 03:12:07.515762 4922 scope.go:117] "RemoveContainer" containerID="2e5495a2cec9e3011bb6865e8f88624777bd0b56e6edf7452eab3590d846420b"
Nov 22 03:12:07 crc kubenswrapper[4922]: I1122 03:12:07.524329 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Nov 22 03:12:07 crc kubenswrapper[4922]: I1122 03:12:07.541615 4922 scope.go:117] "RemoveContainer" containerID="788589c18f9d24d3d0050d80f0262085550508caa9bdfde6126b1dea8af83c4c"
Nov 22 03:12:07 crc kubenswrapper[4922]: I1122 03:12:07.547966 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Nov 22 03:12:07 crc kubenswrapper[4922]: E1122 03:12:07.548401 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07379147-ac76-4a20-bcdc-dc13fceaaabc" containerName="ceilometer-central-agent"
Nov 22 03:12:07 crc kubenswrapper[4922]: I1122 03:12:07.548425 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="07379147-ac76-4a20-bcdc-dc13fceaaabc" containerName="ceilometer-central-agent"
Nov 22 03:12:07 crc kubenswrapper[4922]: E1122 03:12:07.548440 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07379147-ac76-4a20-bcdc-dc13fceaaabc" containerName="sg-core"
Nov 22 03:12:07 crc kubenswrapper[4922]: I1122 03:12:07.548449 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="07379147-ac76-4a20-bcdc-dc13fceaaabc" containerName="sg-core"
Nov 22 03:12:07 crc kubenswrapper[4922]: E1122 03:12:07.548475 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07379147-ac76-4a20-bcdc-dc13fceaaabc" containerName="proxy-httpd"
Nov 22 03:12:07 crc kubenswrapper[4922]: I1122 03:12:07.548484 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="07379147-ac76-4a20-bcdc-dc13fceaaabc" containerName="proxy-httpd"
Nov 22 03:12:07 crc kubenswrapper[4922]: E1122 03:12:07.548501 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07379147-ac76-4a20-bcdc-dc13fceaaabc" containerName="ceilometer-notification-agent"
Nov 22 03:12:07 crc kubenswrapper[4922]: I1122 03:12:07.548511 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="07379147-ac76-4a20-bcdc-dc13fceaaabc" containerName="ceilometer-notification-agent"
Nov 22 03:12:07 crc kubenswrapper[4922]: I1122 03:12:07.548747 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="07379147-ac76-4a20-bcdc-dc13fceaaabc" containerName="sg-core"
Nov 22 03:12:07 crc kubenswrapper[4922]: I1122 03:12:07.548769 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="07379147-ac76-4a20-bcdc-dc13fceaaabc" containerName="ceilometer-notification-agent"
Nov 22 03:12:07 crc kubenswrapper[4922]: I1122 03:12:07.548787 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="07379147-ac76-4a20-bcdc-dc13fceaaabc" containerName="proxy-httpd"
Nov 22 03:12:07 crc kubenswrapper[4922]: I1122 03:12:07.548798 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="07379147-ac76-4a20-bcdc-dc13fceaaabc" containerName="ceilometer-central-agent"
Nov 22 03:12:07 crc kubenswrapper[4922]: I1122 03:12:07.554128 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 22 03:12:07 crc kubenswrapper[4922]: I1122 03:12:07.557179 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Nov 22 03:12:07 crc kubenswrapper[4922]: I1122 03:12:07.557361 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Nov 22 03:12:07 crc kubenswrapper[4922]: I1122 03:12:07.558275 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Nov 22 03:12:07 crc kubenswrapper[4922]: I1122 03:12:07.570622 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Nov 22 03:12:07 crc kubenswrapper[4922]: I1122 03:12:07.595638 4922 scope.go:117] "RemoveContainer" containerID="35bc425ba87ca523eb68fd449264576ca529b7063a2662d0cceea92c76b0a9d9"
Nov 22 03:12:07 crc kubenswrapper[4922]: I1122 03:12:07.750503 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30ee4416-e52c-453f-bd19-5c28949860e2-config-data\") pod \"ceilometer-0\" (UID: \"30ee4416-e52c-453f-bd19-5c28949860e2\") " pod="openstack/ceilometer-0"
Nov 22 03:12:07 crc kubenswrapper[4922]: I1122 03:12:07.750973 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30ee4416-e52c-453f-bd19-5c28949860e2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"30ee4416-e52c-453f-bd19-5c28949860e2\") " pod="openstack/ceilometer-0"
Nov 22 03:12:07 crc kubenswrapper[4922]: I1122 03:12:07.751018 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/30ee4416-e52c-453f-bd19-5c28949860e2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"30ee4416-e52c-453f-bd19-5c28949860e2\") " pod="openstack/ceilometer-0"
Nov 22 03:12:07 crc kubenswrapper[4922]: I1122 03:12:07.752247 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/30ee4416-e52c-453f-bd19-5c28949860e2-run-httpd\") pod \"ceilometer-0\" (UID: \"30ee4416-e52c-453f-bd19-5c28949860e2\") " pod="openstack/ceilometer-0"
Nov 22 03:12:07 crc kubenswrapper[4922]: I1122 03:12:07.752495 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30ee4416-e52c-453f-bd19-5c28949860e2-scripts\") pod \"ceilometer-0\" (UID: \"30ee4416-e52c-453f-bd19-5c28949860e2\") " pod="openstack/ceilometer-0"
Nov 22 03:12:07 crc kubenswrapper[4922]: I1122 03:12:07.756006 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4579l\" (UniqueName: \"kubernetes.io/projected/30ee4416-e52c-453f-bd19-5c28949860e2-kube-api-access-4579l\") pod \"ceilometer-0\" (UID: \"30ee4416-e52c-453f-bd19-5c28949860e2\") " pod="openstack/ceilometer-0"
Nov 22 03:12:07 crc kubenswrapper[4922]: I1122 03:12:07.756065 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/30ee4416-e52c-453f-bd19-5c28949860e2-log-httpd\") pod \"ceilometer-0\" (UID: \"30ee4416-e52c-453f-bd19-5c28949860e2\") " pod="openstack/ceilometer-0"
Nov 22 03:12:07 crc kubenswrapper[4922]: I1122 03:12:07.756094 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/30ee4416-e52c-453f-bd19-5c28949860e2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"30ee4416-e52c-453f-bd19-5c28949860e2\") " pod="openstack/ceilometer-0"
Nov 22 03:12:07 crc kubenswrapper[4922]: I1122 03:12:07.857794 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30ee4416-e52c-453f-bd19-5c28949860e2-config-data\") pod \"ceilometer-0\" (UID: \"30ee4416-e52c-453f-bd19-5c28949860e2\") " pod="openstack/ceilometer-0"
Nov 22 03:12:07 crc kubenswrapper[4922]: I1122 03:12:07.857885 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30ee4416-e52c-453f-bd19-5c28949860e2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"30ee4416-e52c-453f-bd19-5c28949860e2\") " pod="openstack/ceilometer-0"
Nov 22 03:12:07 crc kubenswrapper[4922]: I1122 03:12:07.857925 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/30ee4416-e52c-453f-bd19-5c28949860e2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"30ee4416-e52c-453f-bd19-5c28949860e2\") " pod="openstack/ceilometer-0"
Nov 22 03:12:07 crc kubenswrapper[4922]: I1122 03:12:07.858059 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/30ee4416-e52c-453f-bd19-5c28949860e2-run-httpd\") pod \"ceilometer-0\" (UID: \"30ee4416-e52c-453f-bd19-5c28949860e2\") " pod="openstack/ceilometer-0"
Nov 22 03:12:07 crc kubenswrapper[4922]: I1122 03:12:07.858094 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30ee4416-e52c-453f-bd19-5c28949860e2-scripts\") pod \"ceilometer-0\" (UID: \"30ee4416-e52c-453f-bd19-5c28949860e2\") " pod="openstack/ceilometer-0"
Nov 22 03:12:07 crc kubenswrapper[4922]: I1122 03:12:07.858150 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4579l\" (UniqueName: \"kubernetes.io/projected/30ee4416-e52c-453f-bd19-5c28949860e2-kube-api-access-4579l\") pod \"ceilometer-0\" (UID: \"30ee4416-e52c-453f-bd19-5c28949860e2\") " pod="openstack/ceilometer-0"
Nov 22 03:12:07 crc kubenswrapper[4922]: I1122 03:12:07.858190 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/30ee4416-e52c-453f-bd19-5c28949860e2-log-httpd\") pod \"ceilometer-0\" (UID: \"30ee4416-e52c-453f-bd19-5c28949860e2\") " pod="openstack/ceilometer-0"
Nov 22 03:12:07 crc kubenswrapper[4922]: I1122 03:12:07.858222 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/30ee4416-e52c-453f-bd19-5c28949860e2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"30ee4416-e52c-453f-bd19-5c28949860e2\") " pod="openstack/ceilometer-0"
Nov 22 03:12:07 crc kubenswrapper[4922]: I1122 03:12:07.858677 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/30ee4416-e52c-453f-bd19-5c28949860e2-run-httpd\") pod \"ceilometer-0\" (UID: \"30ee4416-e52c-453f-bd19-5c28949860e2\") " pod="openstack/ceilometer-0"
Nov 22 03:12:07 crc kubenswrapper[4922]: I1122 03:12:07.860522 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/30ee4416-e52c-453f-bd19-5c28949860e2-log-httpd\") pod \"ceilometer-0\" (UID: \"30ee4416-e52c-453f-bd19-5c28949860e2\") " pod="openstack/ceilometer-0"
Nov 22 03:12:07 crc kubenswrapper[4922]: I1122 03:12:07.876483 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/30ee4416-e52c-453f-bd19-5c28949860e2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"30ee4416-e52c-453f-bd19-5c28949860e2\") " pod="openstack/ceilometer-0"
Nov 22 03:12:07 crc kubenswrapper[4922]: I1122 03:12:07.876970 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30ee4416-e52c-453f-bd19-5c28949860e2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"30ee4416-e52c-453f-bd19-5c28949860e2\") " pod="openstack/ceilometer-0"
Nov 22 03:12:07 crc kubenswrapper[4922]: I1122 03:12:07.877036 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30ee4416-e52c-453f-bd19-5c28949860e2-scripts\") pod \"ceilometer-0\" (UID: \"30ee4416-e52c-453f-bd19-5c28949860e2\") " pod="openstack/ceilometer-0"
Nov 22 03:12:07 crc kubenswrapper[4922]: I1122 03:12:07.877392 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30ee4416-e52c-453f-bd19-5c28949860e2-config-data\") pod \"ceilometer-0\" (UID: \"30ee4416-e52c-453f-bd19-5c28949860e2\") " pod="openstack/ceilometer-0"
Nov 22 03:12:07 crc kubenswrapper[4922]: I1122 03:12:07.882348 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/30ee4416-e52c-453f-bd19-5c28949860e2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"30ee4416-e52c-453f-bd19-5c28949860e2\") " pod="openstack/ceilometer-0"
Nov 22 03:12:07 crc kubenswrapper[4922]: I1122 03:12:07.887238 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4579l\" (UniqueName: \"kubernetes.io/projected/30ee4416-e52c-453f-bd19-5c28949860e2-kube-api-access-4579l\") pod \"ceilometer-0\" (UID: \"30ee4416-e52c-453f-bd19-5c28949860e2\") " pod="openstack/ceilometer-0"
Nov 22 03:12:08 crc kubenswrapper[4922]: I1122 03:12:08.178600 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 22 03:12:08 crc kubenswrapper[4922]: I1122 03:12:08.617040 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Nov 22 03:12:08 crc kubenswrapper[4922]: I1122 03:12:08.763538 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Nov 22 03:12:08 crc kubenswrapper[4922]: I1122 03:12:08.797285 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Nov 22 03:12:08 crc kubenswrapper[4922]: I1122 03:12:08.798250 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Nov 22 03:12:09 crc kubenswrapper[4922]: I1122 03:12:09.331531 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07379147-ac76-4a20-bcdc-dc13fceaaabc" path="/var/lib/kubelet/pods/07379147-ac76-4a20-bcdc-dc13fceaaabc/volumes"
Nov 22 03:12:09 crc kubenswrapper[4922]: I1122 03:12:09.498709 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"30ee4416-e52c-453f-bd19-5c28949860e2","Type":"ContainerStarted","Data":"3661b71adf299f1ddff6d54ef161a502f62c0e0ca96cf4b6684a0c81781d477b"}
Nov 22 03:12:09 crc kubenswrapper[4922]: I1122 03:12:09.498795 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"30ee4416-e52c-453f-bd19-5c28949860e2","Type":"ContainerStarted","Data":"4736b91a9b973cade0e2545a2026b486473b2adf4afb849ca00e5e2f841f105f"}
Nov 22 03:12:09 crc kubenswrapper[4922]: I1122 03:12:09.879079 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="490ccaff-a6f3-4e90-853f-74530b743392" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.179:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Nov 22 03:12:09 crc kubenswrapper[4922]: I1122 03:12:09.879111 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="490ccaff-a6f3-4e90-853f-74530b743392" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.179:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Nov 22 03:12:10 crc kubenswrapper[4922]: I1122 03:12:10.514305 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"30ee4416-e52c-453f-bd19-5c28949860e2","Type":"ContainerStarted","Data":"1aefe3c7294ae5a708fd99da6256375e039d9c9fabff543a5cb85e3dc1fc5d8f"}
Nov 22 03:12:11 crc kubenswrapper[4922]: I1122 03:12:11.528048 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"30ee4416-e52c-453f-bd19-5c28949860e2","Type":"ContainerStarted","Data":"a313769a81ad99932bf1f65d700359fdb73f630b2495c27d56a7c67fe86017ed"}
Nov 22 03:12:12 crc kubenswrapper[4922]: I1122 03:12:12.086537 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Nov 22 03:12:13 crc kubenswrapper[4922]: I1122 03:12:13.562024 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"30ee4416-e52c-453f-bd19-5c28949860e2","Type":"ContainerStarted","Data":"bb6b139c466e3b0c01d6cd6968cf566372a2a76578cb702b4a1ed17fab3dc784"}
Nov 22 03:12:13 crc kubenswrapper[4922]: I1122 03:12:13.562912 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Nov 22 03:12:13 crc kubenswrapper[4922]: I1122 03:12:13.603837 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.792250977 podStartE2EDuration="6.603808292s" podCreationTimestamp="2025-11-22 03:12:07 +0000 UTC" firstStartedPulling="2025-11-22 03:12:08.628158394 +0000 UTC m=+1164.666680316" lastFinishedPulling="2025-11-22 03:12:12.439715709 +0000 UTC m=+1168.478237631" observedRunningTime="2025-11-22 03:12:13.586935368 +0000 UTC m=+1169.625457280" watchObservedRunningTime="2025-11-22 03:12:13.603808292 +0000 UTC m=+1169.642330214"
Nov 22 03:12:15 crc kubenswrapper[4922]: I1122 03:12:15.686466 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Nov 22 03:12:15 crc kubenswrapper[4922]: I1122 03:12:15.705225 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Nov 22 03:12:15 crc kubenswrapper[4922]: I1122 03:12:15.730511 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Nov 22 03:12:16 crc kubenswrapper[4922]: I1122 03:12:16.601022 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Nov 22 03:12:18 crc kubenswrapper[4922]: I1122 03:12:18.612064 4922 generic.go:334] "Generic (PLEG): container finished" podID="64048346-bcee-40da-af93-b7fbb844c1f9" containerID="2407c19e18d8987997a9fb965adc8f6148e8e8b87da71df2007cc8bea77bdef9" exitCode=137
Nov 22 03:12:18 crc kubenswrapper[4922]: I1122 03:12:18.612183 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"64048346-bcee-40da-af93-b7fbb844c1f9","Type":"ContainerDied","Data":"2407c19e18d8987997a9fb965adc8f6148e8e8b87da71df2007cc8bea77bdef9"}
Nov 22 03:12:18 crc kubenswrapper[4922]: I1122 03:12:18.613243 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"64048346-bcee-40da-af93-b7fbb844c1f9","Type":"ContainerDied","Data":"5d36520fab4067c9e11201540da4557b5f32b4d8534e8ca458c73353883f40dd"}
Nov 22 03:12:18 crc kubenswrapper[4922]: I1122 03:12:18.613297 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d36520fab4067c9e11201540da4557b5f32b4d8534e8ca458c73353883f40dd"
Nov 22 03:12:18 crc kubenswrapper[4922]: I1122 03:12:18.657247 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Nov 22 03:12:18 crc kubenswrapper[4922]: I1122 03:12:18.796271 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64048346-bcee-40da-af93-b7fbb844c1f9-config-data\") pod \"64048346-bcee-40da-af93-b7fbb844c1f9\" (UID: \"64048346-bcee-40da-af93-b7fbb844c1f9\") "
Nov 22 03:12:18 crc kubenswrapper[4922]: I1122 03:12:18.796320 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64048346-bcee-40da-af93-b7fbb844c1f9-combined-ca-bundle\") pod \"64048346-bcee-40da-af93-b7fbb844c1f9\" (UID: \"64048346-bcee-40da-af93-b7fbb844c1f9\") "
Nov 22 03:12:18 crc kubenswrapper[4922]: I1122 03:12:18.796430 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nf5lq\" (UniqueName: \"kubernetes.io/projected/64048346-bcee-40da-af93-b7fbb844c1f9-kube-api-access-nf5lq\") pod \"64048346-bcee-40da-af93-b7fbb844c1f9\" (UID: \"64048346-bcee-40da-af93-b7fbb844c1f9\") "
Nov 22 03:12:18 crc kubenswrapper[4922]: I1122 03:12:18.800299 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Nov 22 03:12:18 crc kubenswrapper[4922]: I1122 03:12:18.801313 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Nov 22 03:12:18 crc kubenswrapper[4922]: I1122 03:12:18.802103 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Nov 22 03:12:18 crc kubenswrapper[4922]: I1122 03:12:18.803434 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Nov 22 03:12:18 crc kubenswrapper[4922]: I1122 03:12:18.810091 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64048346-bcee-40da-af93-b7fbb844c1f9-kube-api-access-nf5lq" (OuterVolumeSpecName: "kube-api-access-nf5lq") pod "64048346-bcee-40da-af93-b7fbb844c1f9" (UID: "64048346-bcee-40da-af93-b7fbb844c1f9"). InnerVolumeSpecName "kube-api-access-nf5lq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 03:12:18 crc kubenswrapper[4922]: I1122 03:12:18.837199 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64048346-bcee-40da-af93-b7fbb844c1f9-config-data" (OuterVolumeSpecName: "config-data") pod "64048346-bcee-40da-af93-b7fbb844c1f9" (UID: "64048346-bcee-40da-af93-b7fbb844c1f9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:12:18 crc kubenswrapper[4922]: I1122 03:12:18.843109 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64048346-bcee-40da-af93-b7fbb844c1f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "64048346-bcee-40da-af93-b7fbb844c1f9" (UID: "64048346-bcee-40da-af93-b7fbb844c1f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:12:18 crc kubenswrapper[4922]: I1122 03:12:18.899107 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64048346-bcee-40da-af93-b7fbb844c1f9-config-data\") on node \"crc\" DevicePath \"\""
Nov 22 03:12:18 crc kubenswrapper[4922]: I1122 03:12:18.899149 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64048346-bcee-40da-af93-b7fbb844c1f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 03:12:18 crc kubenswrapper[4922]: I1122 03:12:18.899165 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nf5lq\" (UniqueName: \"kubernetes.io/projected/64048346-bcee-40da-af93-b7fbb844c1f9-kube-api-access-nf5lq\") on node \"crc\" DevicePath \"\""
Nov 22 03:12:19 crc kubenswrapper[4922]: I1122 03:12:19.622786 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Nov 22 03:12:19 crc kubenswrapper[4922]: I1122 03:12:19.623172 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Nov 22 03:12:19 crc kubenswrapper[4922]: I1122 03:12:19.635915 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Nov 22 03:12:19 crc kubenswrapper[4922]: I1122 03:12:19.662081 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Nov 22 03:12:19 crc kubenswrapper[4922]: I1122 03:12:19.674919 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Nov 22 03:12:19 crc kubenswrapper[4922]: I1122 03:12:19.711734 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Nov 22 03:12:19 crc kubenswrapper[4922]: E1122 03:12:19.712313 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64048346-bcee-40da-af93-b7fbb844c1f9" containerName="nova-cell1-novncproxy-novncproxy"
Nov 22 03:12:19 crc kubenswrapper[4922]: I1122 03:12:19.712341 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="64048346-bcee-40da-af93-b7fbb844c1f9" containerName="nova-cell1-novncproxy-novncproxy"
Nov 22 03:12:19 crc kubenswrapper[4922]: I1122 03:12:19.712625 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="64048346-bcee-40da-af93-b7fbb844c1f9" containerName="nova-cell1-novncproxy-novncproxy"
Nov 22 03:12:19 crc kubenswrapper[4922]: I1122 03:12:19.713397 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Nov 22 03:12:19 crc kubenswrapper[4922]: I1122 03:12:19.718386 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt"
Nov 22 03:12:19 crc kubenswrapper[4922]: I1122 03:12:19.718447 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc"
Nov 22 03:12:19 crc kubenswrapper[4922]: I1122 03:12:19.718788 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Nov 22 03:12:19 crc kubenswrapper[4922]: I1122 03:12:19.725412 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Nov 22 03:12:19 crc kubenswrapper[4922]: I1122 03:12:19.815694 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5aef40f-433d-4415-9c69-50ab337097f0-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5aef40f-433d-4415-9c69-50ab337097f0\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 22 03:12:19 crc kubenswrapper[4922]: I1122 03:12:19.815785 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5aef40f-433d-4415-9c69-50ab337097f0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5aef40f-433d-4415-9c69-50ab337097f0\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 22 03:12:19 crc kubenswrapper[4922]: I1122 03:12:19.815835 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5aef40f-433d-4415-9c69-50ab337097f0-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5aef40f-433d-4415-9c69-50ab337097f0\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 22 03:12:19 crc kubenswrapper[4922]: I1122 03:12:19.815869 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mv4fn\" (UniqueName: \"kubernetes.io/projected/d5aef40f-433d-4415-9c69-50ab337097f0-kube-api-access-mv4fn\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5aef40f-433d-4415-9c69-50ab337097f0\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 22 03:12:19 crc kubenswrapper[4922]: I1122 03:12:19.815912 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5aef40f-433d-4415-9c69-50ab337097f0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5aef40f-433d-4415-9c69-50ab337097f0\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 22 03:12:19 crc kubenswrapper[4922]: I1122 03:12:19.851698 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-zfbgm"]
Nov 22 03:12:19 crc kubenswrapper[4922]: I1122 03:12:19.863545 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-zfbgm"]
Nov 22 03:12:19 crc kubenswrapper[4922]: I1122 03:12:19.863681 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-zfbgm"
Nov 22 03:12:19 crc kubenswrapper[4922]: I1122 03:12:19.917770 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5aef40f-433d-4415-9c69-50ab337097f0-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5aef40f-433d-4415-9c69-50ab337097f0\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 22 03:12:19 crc kubenswrapper[4922]: I1122 03:12:19.917885 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s98zl\" (UniqueName: \"kubernetes.io/projected/2a8adb5c-9b6c-4398-96b9-e67328810310-kube-api-access-s98zl\") pod \"dnsmasq-dns-5b856c5697-zfbgm\" (UID: \"2a8adb5c-9b6c-4398-96b9-e67328810310\") " pod="openstack/dnsmasq-dns-5b856c5697-zfbgm"
Nov 22 03:12:19 crc kubenswrapper[4922]: I1122 03:12:19.917919 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5aef40f-433d-4415-9c69-50ab337097f0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5aef40f-433d-4415-9c69-50ab337097f0\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 22 03:12:19 crc kubenswrapper[4922]: I1122 03:12:19.917956 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2a8adb5c-9b6c-4398-96b9-e67328810310-ovsdbserver-nb\") pod \"dnsmasq-dns-5b856c5697-zfbgm\" (UID: \"2a8adb5c-9b6c-4398-96b9-e67328810310\") " pod="openstack/dnsmasq-dns-5b856c5697-zfbgm"
Nov 22 03:12:19 crc kubenswrapper[4922]: I1122 03:12:19.917987 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mv4fn\" (UniqueName: \"kubernetes.io/projected/d5aef40f-433d-4415-9c69-50ab337097f0-kube-api-access-mv4fn\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5aef40f-433d-4415-9c69-50ab337097f0\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 22 03:12:19 crc kubenswrapper[4922]: I1122 03:12:19.918004 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5aef40f-433d-4415-9c69-50ab337097f0-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5aef40f-433d-4415-9c69-50ab337097f0\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 22 03:12:19 crc kubenswrapper[4922]: I1122 03:12:19.918038 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2a8adb5c-9b6c-4398-96b9-e67328810310-ovsdbserver-sb\") pod \"dnsmasq-dns-5b856c5697-zfbgm\" (UID: \"2a8adb5c-9b6c-4398-96b9-e67328810310\") " pod="openstack/dnsmasq-dns-5b856c5697-zfbgm"
Nov 22 03:12:19 crc kubenswrapper[4922]: I1122 03:12:19.918059 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a8adb5c-9b6c-4398-96b9-e67328810310-config\") pod \"dnsmasq-dns-5b856c5697-zfbgm\" (UID: \"2a8adb5c-9b6c-4398-96b9-e67328810310\") " pod="openstack/dnsmasq-dns-5b856c5697-zfbgm"
Nov 22 03:12:19 crc kubenswrapper[4922]: I1122 03:12:19.918077 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a8adb5c-9b6c-4398-96b9-e67328810310-dns-svc\") pod \"dnsmasq-dns-5b856c5697-zfbgm\" (UID: \"2a8adb5c-9b6c-4398-96b9-e67328810310\") " pod="openstack/dnsmasq-dns-5b856c5697-zfbgm"
Nov 22 03:12:19 crc kubenswrapper[4922]: I1122 03:12:19.918093 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5aef40f-433d-4415-9c69-50ab337097f0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5aef40f-433d-4415-9c69-50ab337097f0\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 22 03:12:19 crc kubenswrapper[4922]: I1122 03:12:19.922257 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5aef40f-433d-4415-9c69-50ab337097f0-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5aef40f-433d-4415-9c69-50ab337097f0\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 22 03:12:19 crc kubenswrapper[4922]: I1122 03:12:19.923315 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5aef40f-433d-4415-9c69-50ab337097f0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5aef40f-433d-4415-9c69-50ab337097f0\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 22 03:12:19 crc kubenswrapper[4922]: I1122 03:12:19.927710 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5aef40f-433d-4415-9c69-50ab337097f0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5aef40f-433d-4415-9c69-50ab337097f0\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 22 03:12:19 crc kubenswrapper[4922]: I1122 03:12:19.941575 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5aef40f-433d-4415-9c69-50ab337097f0-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5aef40f-433d-4415-9c69-50ab337097f0\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 22 03:12:19 crc kubenswrapper[4922]: I1122 03:12:19.946435 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mv4fn\" (UniqueName: \"kubernetes.io/projected/d5aef40f-433d-4415-9c69-50ab337097f0-kube-api-access-mv4fn\") pod \"nova-cell1-novncproxy-0\" (UID: \"d5aef40f-433d-4415-9c69-50ab337097f0\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 22 03:12:20 crc kubenswrapper[4922]: I1122 03:12:20.019777 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2a8adb5c-9b6c-4398-96b9-e67328810310-ovsdbserver-sb\") pod \"dnsmasq-dns-5b856c5697-zfbgm\" (UID: \"2a8adb5c-9b6c-4398-96b9-e67328810310\") " pod="openstack/dnsmasq-dns-5b856c5697-zfbgm"
Nov 22 03:12:20 crc kubenswrapper[4922]: I1122 03:12:20.019843 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a8adb5c-9b6c-4398-96b9-e67328810310-config\") pod \"dnsmasq-dns-5b856c5697-zfbgm\" (UID: \"2a8adb5c-9b6c-4398-96b9-e67328810310\") " pod="openstack/dnsmasq-dns-5b856c5697-zfbgm"
Nov 22 03:12:20 crc kubenswrapper[4922]: I1122 03:12:20.019920 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a8adb5c-9b6c-4398-96b9-e67328810310-dns-svc\") pod \"dnsmasq-dns-5b856c5697-zfbgm\" (UID: \"2a8adb5c-9b6c-4398-96b9-e67328810310\") " pod="openstack/dnsmasq-dns-5b856c5697-zfbgm"
Nov 22 03:12:20 crc kubenswrapper[4922]: I1122 03:12:20.020783 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a8adb5c-9b6c-4398-96b9-e67328810310-config\") pod \"dnsmasq-dns-5b856c5697-zfbgm\" (UID: \"2a8adb5c-9b6c-4398-96b9-e67328810310\") " pod="openstack/dnsmasq-dns-5b856c5697-zfbgm"
Nov 22 03:12:20 crc kubenswrapper[4922]: I1122 03:12:20.020988 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a8adb5c-9b6c-4398-96b9-e67328810310-dns-svc\") pod \"dnsmasq-dns-5b856c5697-zfbgm\" (UID: \"2a8adb5c-9b6c-4398-96b9-e67328810310\") " pod="openstack/dnsmasq-dns-5b856c5697-zfbgm"
Nov 22 03:12:20 crc kubenswrapper[4922]: I1122 03:12:20.021084 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2a8adb5c-9b6c-4398-96b9-e67328810310-ovsdbserver-sb\") pod \"dnsmasq-dns-5b856c5697-zfbgm\" (UID: \"2a8adb5c-9b6c-4398-96b9-e67328810310\") " pod="openstack/dnsmasq-dns-5b856c5697-zfbgm"
Nov 22 03:12:20 crc kubenswrapper[4922]: I1122 03:12:20.021926 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s98zl\" (UniqueName: \"kubernetes.io/projected/2a8adb5c-9b6c-4398-96b9-e67328810310-kube-api-access-s98zl\") pod \"dnsmasq-dns-5b856c5697-zfbgm\" (UID: \"2a8adb5c-9b6c-4398-96b9-e67328810310\") " pod="openstack/dnsmasq-dns-5b856c5697-zfbgm"
Nov 22 03:12:20 crc kubenswrapper[4922]: I1122 03:12:20.021993 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2a8adb5c-9b6c-4398-96b9-e67328810310-ovsdbserver-nb\") pod \"dnsmasq-dns-5b856c5697-zfbgm\" (UID: \"2a8adb5c-9b6c-4398-96b9-e67328810310\") " pod="openstack/dnsmasq-dns-5b856c5697-zfbgm"
Nov 22 03:12:20 crc kubenswrapper[4922]: I1122 03:12:20.022645 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2a8adb5c-9b6c-4398-96b9-e67328810310-ovsdbserver-nb\") pod \"dnsmasq-dns-5b856c5697-zfbgm\" (UID: \"2a8adb5c-9b6c-4398-96b9-e67328810310\") " pod="openstack/dnsmasq-dns-5b856c5697-zfbgm"
Nov 22 03:12:20 crc kubenswrapper[4922]: I1122 03:12:20.047808 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s98zl\" (UniqueName: \"kubernetes.io/projected/2a8adb5c-9b6c-4398-96b9-e67328810310-kube-api-access-s98zl\") pod \"dnsmasq-dns-5b856c5697-zfbgm\" (UID: \"2a8adb5c-9b6c-4398-96b9-e67328810310\") " pod="openstack/dnsmasq-dns-5b856c5697-zfbgm"
Nov 22 03:12:20 crc kubenswrapper[4922]: I1122 03:12:20.087959 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Nov 22 03:12:20 crc kubenswrapper[4922]: I1122 03:12:20.191345 4922 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-zfbgm" Nov 22 03:12:20 crc kubenswrapper[4922]: I1122 03:12:20.529914 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 22 03:12:20 crc kubenswrapper[4922]: I1122 03:12:20.631970 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d5aef40f-433d-4415-9c69-50ab337097f0","Type":"ContainerStarted","Data":"7beb07efeeb2309082daac25856dc7ba5276b50a668feac21bcadaa1b29f992d"} Nov 22 03:12:20 crc kubenswrapper[4922]: I1122 03:12:20.692925 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-zfbgm"] Nov 22 03:12:20 crc kubenswrapper[4922]: W1122 03:12:20.702282 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a8adb5c_9b6c_4398_96b9_e67328810310.slice/crio-c0f863684c44696882b2eb5cd9bc7e1d7ccb192dcd8e5f741ff9c32d82b3e5b5 WatchSource:0}: Error finding container c0f863684c44696882b2eb5cd9bc7e1d7ccb192dcd8e5f741ff9c32d82b3e5b5: Status 404 returned error can't find the container with id c0f863684c44696882b2eb5cd9bc7e1d7ccb192dcd8e5f741ff9c32d82b3e5b5 Nov 22 03:12:21 crc kubenswrapper[4922]: I1122 03:12:21.044117 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:12:21 crc kubenswrapper[4922]: I1122 03:12:21.044703 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="30ee4416-e52c-453f-bd19-5c28949860e2" containerName="ceilometer-central-agent" containerID="cri-o://3661b71adf299f1ddff6d54ef161a502f62c0e0ca96cf4b6684a0c81781d477b" gracePeriod=30 Nov 22 03:12:21 crc kubenswrapper[4922]: I1122 03:12:21.044746 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="30ee4416-e52c-453f-bd19-5c28949860e2" containerName="sg-core" containerID="cri-o://a313769a81ad99932bf1f65d700359fdb73f630b2495c27d56a7c67fe86017ed" gracePeriod=30 Nov 22 03:12:21 crc kubenswrapper[4922]: I1122 03:12:21.044793 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="30ee4416-e52c-453f-bd19-5c28949860e2" containerName="ceilometer-notification-agent" containerID="cri-o://1aefe3c7294ae5a708fd99da6256375e039d9c9fabff543a5cb85e3dc1fc5d8f" gracePeriod=30 Nov 22 03:12:21 crc kubenswrapper[4922]: I1122 03:12:21.044837 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="30ee4416-e52c-453f-bd19-5c28949860e2" containerName="proxy-httpd" containerID="cri-o://bb6b139c466e3b0c01d6cd6968cf566372a2a76578cb702b4a1ed17fab3dc784" gracePeriod=30 Nov 22 03:12:21 crc kubenswrapper[4922]: I1122 03:12:21.309906 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64048346-bcee-40da-af93-b7fbb844c1f9" path="/var/lib/kubelet/pods/64048346-bcee-40da-af93-b7fbb844c1f9/volumes" Nov 22 03:12:21 crc kubenswrapper[4922]: I1122 03:12:21.651470 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d5aef40f-433d-4415-9c69-50ab337097f0","Type":"ContainerStarted","Data":"f067d8b536b546f5b9d0aab2259e402e00aa9783806a4206c62e62a16b3f00cd"} Nov 22 03:12:21 crc kubenswrapper[4922]: I1122 03:12:21.661797 4922 generic.go:334] "Generic (PLEG): container finished" podID="2a8adb5c-9b6c-4398-96b9-e67328810310" 
containerID="7ec0b901e0a1278c07ec70b9e029d806c1ba89f2338031f47c0752b2e4ee62e6" exitCode=0 Nov 22 03:12:21 crc kubenswrapper[4922]: I1122 03:12:21.661908 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-zfbgm" event={"ID":"2a8adb5c-9b6c-4398-96b9-e67328810310","Type":"ContainerDied","Data":"7ec0b901e0a1278c07ec70b9e029d806c1ba89f2338031f47c0752b2e4ee62e6"} Nov 22 03:12:21 crc kubenswrapper[4922]: I1122 03:12:21.661940 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-zfbgm" event={"ID":"2a8adb5c-9b6c-4398-96b9-e67328810310","Type":"ContainerStarted","Data":"c0f863684c44696882b2eb5cd9bc7e1d7ccb192dcd8e5f741ff9c32d82b3e5b5"} Nov 22 03:12:21 crc kubenswrapper[4922]: I1122 03:12:21.674497 4922 generic.go:334] "Generic (PLEG): container finished" podID="30ee4416-e52c-453f-bd19-5c28949860e2" containerID="bb6b139c466e3b0c01d6cd6968cf566372a2a76578cb702b4a1ed17fab3dc784" exitCode=0 Nov 22 03:12:21 crc kubenswrapper[4922]: I1122 03:12:21.674542 4922 generic.go:334] "Generic (PLEG): container finished" podID="30ee4416-e52c-453f-bd19-5c28949860e2" containerID="a313769a81ad99932bf1f65d700359fdb73f630b2495c27d56a7c67fe86017ed" exitCode=2 Nov 22 03:12:21 crc kubenswrapper[4922]: I1122 03:12:21.674550 4922 generic.go:334] "Generic (PLEG): container finished" podID="30ee4416-e52c-453f-bd19-5c28949860e2" containerID="3661b71adf299f1ddff6d54ef161a502f62c0e0ca96cf4b6684a0c81781d477b" exitCode=0 Nov 22 03:12:21 crc kubenswrapper[4922]: I1122 03:12:21.675795 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"30ee4416-e52c-453f-bd19-5c28949860e2","Type":"ContainerDied","Data":"bb6b139c466e3b0c01d6cd6968cf566372a2a76578cb702b4a1ed17fab3dc784"} Nov 22 03:12:21 crc kubenswrapper[4922]: I1122 03:12:21.675830 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"30ee4416-e52c-453f-bd19-5c28949860e2","Type":"ContainerDied","Data":"a313769a81ad99932bf1f65d700359fdb73f630b2495c27d56a7c67fe86017ed"} Nov 22 03:12:21 crc kubenswrapper[4922]: I1122 03:12:21.675845 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"30ee4416-e52c-453f-bd19-5c28949860e2","Type":"ContainerDied","Data":"3661b71adf299f1ddff6d54ef161a502f62c0e0ca96cf4b6684a0c81781d477b"} Nov 22 03:12:21 crc kubenswrapper[4922]: I1122 03:12:21.697120 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.697095573 podStartE2EDuration="2.697095573s" podCreationTimestamp="2025-11-22 03:12:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:12:21.677638337 +0000 UTC m=+1177.716160269" watchObservedRunningTime="2025-11-22 03:12:21.697095573 +0000 UTC m=+1177.735617485" Nov 22 03:12:22 crc kubenswrapper[4922]: I1122 03:12:22.572925 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 22 03:12:22 crc kubenswrapper[4922]: I1122 03:12:22.684680 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="490ccaff-a6f3-4e90-853f-74530b743392" containerName="nova-api-log" containerID="cri-o://be2921767a053915b15ff3badb022730be521c19eb1a3db9a09c59b82a9afd60" gracePeriod=30 Nov 22 03:12:22 crc kubenswrapper[4922]: I1122 03:12:22.685026 4922 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openstack/nova-api-0" podUID="490ccaff-a6f3-4e90-853f-74530b743392" containerName="nova-api-api" containerID="cri-o://6cbf2331a59d6d5c3ab41fa885d2509280ebdbe670b1d6ed48bef969d28b1503" gracePeriod=30 Nov 22 03:12:22 crc kubenswrapper[4922]: I1122 03:12:22.685025 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-zfbgm" event={"ID":"2a8adb5c-9b6c-4398-96b9-e67328810310","Type":"ContainerStarted","Data":"a52054879f2c54059a37a7366f13a0f7d4f62a1deeff77c1aa95b53477311def"} Nov 22 03:12:22 crc kubenswrapper[4922]: I1122 03:12:22.685277 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b856c5697-zfbgm" Nov 22 03:12:22 crc kubenswrapper[4922]: I1122 03:12:22.708202 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b856c5697-zfbgm" podStartSLOduration=3.708187541 podStartE2EDuration="3.708187541s" podCreationTimestamp="2025-11-22 03:12:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:12:22.704635566 +0000 UTC m=+1178.743157458" watchObservedRunningTime="2025-11-22 03:12:22.708187541 +0000 UTC m=+1178.746709433" Nov 22 03:12:23 crc kubenswrapper[4922]: I1122 03:12:23.694920 4922 generic.go:334] "Generic (PLEG): container finished" podID="490ccaff-a6f3-4e90-853f-74530b743392" containerID="be2921767a053915b15ff3badb022730be521c19eb1a3db9a09c59b82a9afd60" exitCode=143 Nov 22 03:12:23 crc kubenswrapper[4922]: I1122 03:12:23.697215 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"490ccaff-a6f3-4e90-853f-74530b743392","Type":"ContainerDied","Data":"be2921767a053915b15ff3badb022730be521c19eb1a3db9a09c59b82a9afd60"} Nov 22 03:12:25 crc kubenswrapper[4922]: I1122 03:12:25.088675 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Nov 22 03:12:25 crc kubenswrapper[4922]: I1122 03:12:25.729199 4922 generic.go:334] "Generic (PLEG): container finished" podID="30ee4416-e52c-453f-bd19-5c28949860e2" containerID="1aefe3c7294ae5a708fd99da6256375e039d9c9fabff543a5cb85e3dc1fc5d8f" exitCode=0 Nov 22 03:12:25 crc kubenswrapper[4922]: I1122 03:12:25.729293 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"30ee4416-e52c-453f-bd19-5c28949860e2","Type":"ContainerDied","Data":"1aefe3c7294ae5a708fd99da6256375e039d9c9fabff543a5cb85e3dc1fc5d8f"} Nov 22 03:12:25 crc kubenswrapper[4922]: I1122 03:12:25.983952 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.080186 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/30ee4416-e52c-453f-bd19-5c28949860e2-log-httpd\") pod \"30ee4416-e52c-453f-bd19-5c28949860e2\" (UID: \"30ee4416-e52c-453f-bd19-5c28949860e2\") " Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.080280 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30ee4416-e52c-453f-bd19-5c28949860e2-combined-ca-bundle\") pod \"30ee4416-e52c-453f-bd19-5c28949860e2\" (UID: \"30ee4416-e52c-453f-bd19-5c28949860e2\") " Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.080321 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30ee4416-e52c-453f-bd19-5c28949860e2-config-data\") pod \"30ee4416-e52c-453f-bd19-5c28949860e2\" (UID: \"30ee4416-e52c-453f-bd19-5c28949860e2\") " Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.080397 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/30ee4416-e52c-453f-bd19-5c28949860e2-run-httpd\") pod \"30ee4416-e52c-453f-bd19-5c28949860e2\" (UID: \"30ee4416-e52c-453f-bd19-5c28949860e2\") " Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.080465 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/30ee4416-e52c-453f-bd19-5c28949860e2-ceilometer-tls-certs\") pod \"30ee4416-e52c-453f-bd19-5c28949860e2\" (UID: \"30ee4416-e52c-453f-bd19-5c28949860e2\") " Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.080516 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/30ee4416-e52c-453f-bd19-5c28949860e2-sg-core-conf-yaml\") pod \"30ee4416-e52c-453f-bd19-5c28949860e2\" (UID: \"30ee4416-e52c-453f-bd19-5c28949860e2\") " Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.080632 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4579l\" (UniqueName: \"kubernetes.io/projected/30ee4416-e52c-453f-bd19-5c28949860e2-kube-api-access-4579l\") pod \"30ee4416-e52c-453f-bd19-5c28949860e2\" (UID: \"30ee4416-e52c-453f-bd19-5c28949860e2\") " Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.080680 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30ee4416-e52c-453f-bd19-5c28949860e2-scripts\") pod \"30ee4416-e52c-453f-bd19-5c28949860e2\" (UID: \"30ee4416-e52c-453f-bd19-5c28949860e2\") " Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.081078 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30ee4416-e52c-453f-bd19-5c28949860e2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "30ee4416-e52c-453f-bd19-5c28949860e2" (UID: "30ee4416-e52c-453f-bd19-5c28949860e2"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.081355 4922 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/30ee4416-e52c-453f-bd19-5c28949860e2-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.081569 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30ee4416-e52c-453f-bd19-5c28949860e2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "30ee4416-e52c-453f-bd19-5c28949860e2" (UID: "30ee4416-e52c-453f-bd19-5c28949860e2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.096089 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30ee4416-e52c-453f-bd19-5c28949860e2-scripts" (OuterVolumeSpecName: "scripts") pod "30ee4416-e52c-453f-bd19-5c28949860e2" (UID: "30ee4416-e52c-453f-bd19-5c28949860e2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.096137 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30ee4416-e52c-453f-bd19-5c28949860e2-kube-api-access-4579l" (OuterVolumeSpecName: "kube-api-access-4579l") pod "30ee4416-e52c-453f-bd19-5c28949860e2" (UID: "30ee4416-e52c-453f-bd19-5c28949860e2"). InnerVolumeSpecName "kube-api-access-4579l". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.142045 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30ee4416-e52c-453f-bd19-5c28949860e2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "30ee4416-e52c-453f-bd19-5c28949860e2" (UID: "30ee4416-e52c-453f-bd19-5c28949860e2"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.153502 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30ee4416-e52c-453f-bd19-5c28949860e2-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "30ee4416-e52c-453f-bd19-5c28949860e2" (UID: "30ee4416-e52c-453f-bd19-5c28949860e2"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.182522 4922 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/30ee4416-e52c-453f-bd19-5c28949860e2-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.182553 4922 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/30ee4416-e52c-453f-bd19-5c28949860e2-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.182567 4922 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/30ee4416-e52c-453f-bd19-5c28949860e2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.182577 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4579l\" (UniqueName: \"kubernetes.io/projected/30ee4416-e52c-453f-bd19-5c28949860e2-kube-api-access-4579l\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.182587 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30ee4416-e52c-453f-bd19-5c28949860e2-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.202410 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30ee4416-e52c-453f-bd19-5c28949860e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30ee4416-e52c-453f-bd19-5c28949860e2" (UID: "30ee4416-e52c-453f-bd19-5c28949860e2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.208018 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30ee4416-e52c-453f-bd19-5c28949860e2-config-data" (OuterVolumeSpecName: "config-data") pod "30ee4416-e52c-453f-bd19-5c28949860e2" (UID: "30ee4416-e52c-453f-bd19-5c28949860e2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.284316 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30ee4416-e52c-453f-bd19-5c28949860e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.284350 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30ee4416-e52c-453f-bd19-5c28949860e2-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.315964 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.486985 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/490ccaff-a6f3-4e90-853f-74530b743392-combined-ca-bundle\") pod \"490ccaff-a6f3-4e90-853f-74530b743392\" (UID: \"490ccaff-a6f3-4e90-853f-74530b743392\") " Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.487088 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnghc\" (UniqueName: \"kubernetes.io/projected/490ccaff-a6f3-4e90-853f-74530b743392-kube-api-access-pnghc\") pod \"490ccaff-a6f3-4e90-853f-74530b743392\" (UID: \"490ccaff-a6f3-4e90-853f-74530b743392\") " Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.487119 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/490ccaff-a6f3-4e90-853f-74530b743392-config-data\") pod \"490ccaff-a6f3-4e90-853f-74530b743392\" (UID: \"490ccaff-a6f3-4e90-853f-74530b743392\") " Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.487238 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/490ccaff-a6f3-4e90-853f-74530b743392-logs\") pod \"490ccaff-a6f3-4e90-853f-74530b743392\" (UID: \"490ccaff-a6f3-4e90-853f-74530b743392\") " Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.487783 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/490ccaff-a6f3-4e90-853f-74530b743392-logs" (OuterVolumeSpecName: "logs") pod "490ccaff-a6f3-4e90-853f-74530b743392" (UID: "490ccaff-a6f3-4e90-853f-74530b743392"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.501253 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/490ccaff-a6f3-4e90-853f-74530b743392-kube-api-access-pnghc" (OuterVolumeSpecName: "kube-api-access-pnghc") pod "490ccaff-a6f3-4e90-853f-74530b743392" (UID: "490ccaff-a6f3-4e90-853f-74530b743392"). InnerVolumeSpecName "kube-api-access-pnghc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.527529 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/490ccaff-a6f3-4e90-853f-74530b743392-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "490ccaff-a6f3-4e90-853f-74530b743392" (UID: "490ccaff-a6f3-4e90-853f-74530b743392"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.529518 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/490ccaff-a6f3-4e90-853f-74530b743392-config-data" (OuterVolumeSpecName: "config-data") pod "490ccaff-a6f3-4e90-853f-74530b743392" (UID: "490ccaff-a6f3-4e90-853f-74530b743392"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.589338 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnghc\" (UniqueName: \"kubernetes.io/projected/490ccaff-a6f3-4e90-853f-74530b743392-kube-api-access-pnghc\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.589675 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/490ccaff-a6f3-4e90-853f-74530b743392-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.589686 4922 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/490ccaff-a6f3-4e90-853f-74530b743392-logs\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.589694 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/490ccaff-a6f3-4e90-853f-74530b743392-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.739878 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"30ee4416-e52c-453f-bd19-5c28949860e2","Type":"ContainerDied","Data":"4736b91a9b973cade0e2545a2026b486473b2adf4afb849ca00e5e2f841f105f"} Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.739924 4922 scope.go:117] "RemoveContainer" containerID="bb6b139c466e3b0c01d6cd6968cf566372a2a76578cb702b4a1ed17fab3dc784" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.740038 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.744810 4922 generic.go:334] "Generic (PLEG): container finished" podID="490ccaff-a6f3-4e90-853f-74530b743392" containerID="6cbf2331a59d6d5c3ab41fa885d2509280ebdbe670b1d6ed48bef969d28b1503" exitCode=0 Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.744865 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"490ccaff-a6f3-4e90-853f-74530b743392","Type":"ContainerDied","Data":"6cbf2331a59d6d5c3ab41fa885d2509280ebdbe670b1d6ed48bef969d28b1503"} Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.744916 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"490ccaff-a6f3-4e90-853f-74530b743392","Type":"ContainerDied","Data":"51bdc21846a8915677eba898142c7324f0492e87a54eca721bc964748d1b45b7"} Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.744972 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.789184 4922 scope.go:117] "RemoveContainer" containerID="a313769a81ad99932bf1f65d700359fdb73f630b2495c27d56a7c67fe86017ed" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.813520 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.840530 4922 scope.go:117] "RemoveContainer" containerID="1aefe3c7294ae5a708fd99da6256375e039d9c9fabff543a5cb85e3dc1fc5d8f" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.845327 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.859774 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.863534 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.870112 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:12:26 crc kubenswrapper[4922]: E1122 03:12:26.870577 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30ee4416-e52c-453f-bd19-5c28949860e2" containerName="ceilometer-central-agent" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.870599 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="30ee4416-e52c-453f-bd19-5c28949860e2" containerName="ceilometer-central-agent" Nov 22 03:12:26 crc kubenswrapper[4922]: E1122 03:12:26.870617 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30ee4416-e52c-453f-bd19-5c28949860e2" containerName="proxy-httpd" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.870627 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="30ee4416-e52c-453f-bd19-5c28949860e2" containerName="proxy-httpd" Nov 22 03:12:26 crc kubenswrapper[4922]: E1122 03:12:26.870646 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30ee4416-e52c-453f-bd19-5c28949860e2" containerName="sg-core" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.870654 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="30ee4416-e52c-453f-bd19-5c28949860e2" containerName="sg-core" Nov 22 03:12:26 crc kubenswrapper[4922]: E1122 03:12:26.870676 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="490ccaff-a6f3-4e90-853f-74530b743392" containerName="nova-api-log" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.870682 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="490ccaff-a6f3-4e90-853f-74530b743392" containerName="nova-api-log" Nov 22 03:12:26 crc kubenswrapper[4922]: E1122 03:12:26.870693 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="490ccaff-a6f3-4e90-853f-74530b743392" containerName="nova-api-api" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.870699 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="490ccaff-a6f3-4e90-853f-74530b743392" containerName="nova-api-api" Nov 22 03:12:26 crc kubenswrapper[4922]: E1122 03:12:26.870714 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30ee4416-e52c-453f-bd19-5c28949860e2" containerName="ceilometer-notification-agent" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.870720 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="30ee4416-e52c-453f-bd19-5c28949860e2" containerName="ceilometer-notification-agent" Nov 22 03:12:26 crc 
kubenswrapper[4922]: I1122 03:12:26.870928 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="30ee4416-e52c-453f-bd19-5c28949860e2" containerName="proxy-httpd" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.870947 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="490ccaff-a6f3-4e90-853f-74530b743392" containerName="nova-api-log" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.870970 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="30ee4416-e52c-453f-bd19-5c28949860e2" containerName="ceilometer-notification-agent" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.870982 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="30ee4416-e52c-453f-bd19-5c28949860e2" containerName="ceilometer-central-agent" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.870994 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="30ee4416-e52c-453f-bd19-5c28949860e2" containerName="sg-core" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.871009 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="490ccaff-a6f3-4e90-853f-74530b743392" containerName="nova-api-api" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.873692 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.875584 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.876215 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.876264 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.883919 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.890437 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.895067 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.895835 4922 scope.go:117] "RemoveContainer" containerID="3661b71adf299f1ddff6d54ef161a502f62c0e0ca96cf4b6684a0c81781d477b" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.899665 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.899696 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.899735 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.900111 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.925129 4922 scope.go:117] "RemoveContainer" containerID="6cbf2331a59d6d5c3ab41fa885d2509280ebdbe670b1d6ed48bef969d28b1503" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.940583 4922 scope.go:117] "RemoveContainer" containerID="be2921767a053915b15ff3badb022730be521c19eb1a3db9a09c59b82a9afd60" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.955888 4922 scope.go:117] "RemoveContainer" containerID="6cbf2331a59d6d5c3ab41fa885d2509280ebdbe670b1d6ed48bef969d28b1503" Nov 22 03:12:26 crc kubenswrapper[4922]: E1122 03:12:26.956324 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cbf2331a59d6d5c3ab41fa885d2509280ebdbe670b1d6ed48bef969d28b1503\": container with ID starting with 6cbf2331a59d6d5c3ab41fa885d2509280ebdbe670b1d6ed48bef969d28b1503 not found: ID does not exist" containerID="6cbf2331a59d6d5c3ab41fa885d2509280ebdbe670b1d6ed48bef969d28b1503" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.956357 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cbf2331a59d6d5c3ab41fa885d2509280ebdbe670b1d6ed48bef969d28b1503"} err="failed to get container status \"6cbf2331a59d6d5c3ab41fa885d2509280ebdbe670b1d6ed48bef969d28b1503\": rpc error: code = NotFound desc = could not find container \"6cbf2331a59d6d5c3ab41fa885d2509280ebdbe670b1d6ed48bef969d28b1503\": container with ID starting with 6cbf2331a59d6d5c3ab41fa885d2509280ebdbe670b1d6ed48bef969d28b1503 not found: ID does not exist" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.956380 4922 scope.go:117] "RemoveContainer" containerID="be2921767a053915b15ff3badb022730be521c19eb1a3db9a09c59b82a9afd60" Nov 22 03:12:26 crc kubenswrapper[4922]: E1122 03:12:26.957069 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be2921767a053915b15ff3badb022730be521c19eb1a3db9a09c59b82a9afd60\": container with ID starting with be2921767a053915b15ff3badb022730be521c19eb1a3db9a09c59b82a9afd60 not found: ID does not exist" containerID="be2921767a053915b15ff3badb022730be521c19eb1a3db9a09c59b82a9afd60" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.957091 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be2921767a053915b15ff3badb022730be521c19eb1a3db9a09c59b82a9afd60"} err="failed to get container status \"be2921767a053915b15ff3badb022730be521c19eb1a3db9a09c59b82a9afd60\": rpc error: code = NotFound desc = could not find container 
\"be2921767a053915b15ff3badb022730be521c19eb1a3db9a09c59b82a9afd60\": container with ID starting with be2921767a053915b15ff3badb022730be521c19eb1a3db9a09c59b82a9afd60 not found: ID does not exist" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.996485 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f124e2e-3c52-46dc-8bb4-e931372a83eb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6f124e2e-3c52-46dc-8bb4-e931372a83eb\") " pod="openstack/ceilometer-0" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.996608 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkf5h\" (UniqueName: \"kubernetes.io/projected/d2a9696d-15a2-484f-b917-ecc9b30644ac-kube-api-access-dkf5h\") pod \"nova-api-0\" (UID: \"d2a9696d-15a2-484f-b917-ecc9b30644ac\") " pod="openstack/nova-api-0" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.996661 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f124e2e-3c52-46dc-8bb4-e931372a83eb-log-httpd\") pod \"ceilometer-0\" (UID: \"6f124e2e-3c52-46dc-8bb4-e931372a83eb\") " pod="openstack/ceilometer-0" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.996704 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f124e2e-3c52-46dc-8bb4-e931372a83eb-run-httpd\") pod \"ceilometer-0\" (UID: \"6f124e2e-3c52-46dc-8bb4-e931372a83eb\") " pod="openstack/ceilometer-0" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.996789 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f124e2e-3c52-46dc-8bb4-e931372a83eb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6f124e2e-3c52-46dc-8bb4-e931372a83eb\") " pod="openstack/ceilometer-0" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.996838 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6f124e2e-3c52-46dc-8bb4-e931372a83eb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6f124e2e-3c52-46dc-8bb4-e931372a83eb\") " pod="openstack/ceilometer-0" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.996945 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f124e2e-3c52-46dc-8bb4-e931372a83eb-config-data\") pod \"ceilometer-0\" (UID: \"6f124e2e-3c52-46dc-8bb4-e931372a83eb\") " pod="openstack/ceilometer-0" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.996997 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2a9696d-15a2-484f-b917-ecc9b30644ac-logs\") pod \"nova-api-0\" (UID: \"d2a9696d-15a2-484f-b917-ecc9b30644ac\") " pod="openstack/nova-api-0" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.997048 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2a9696d-15a2-484f-b917-ecc9b30644ac-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d2a9696d-15a2-484f-b917-ecc9b30644ac\") " 
pod="openstack/nova-api-0" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.997086 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f124e2e-3c52-46dc-8bb4-e931372a83eb-scripts\") pod \"ceilometer-0\" (UID: \"6f124e2e-3c52-46dc-8bb4-e931372a83eb\") " pod="openstack/ceilometer-0" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.997129 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2a9696d-15a2-484f-b917-ecc9b30644ac-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d2a9696d-15a2-484f-b917-ecc9b30644ac\") " pod="openstack/nova-api-0" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.997176 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85tz5\" (UniqueName: \"kubernetes.io/projected/6f124e2e-3c52-46dc-8bb4-e931372a83eb-kube-api-access-85tz5\") pod \"ceilometer-0\" (UID: \"6f124e2e-3c52-46dc-8bb4-e931372a83eb\") " pod="openstack/ceilometer-0" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.997217 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2a9696d-15a2-484f-b917-ecc9b30644ac-public-tls-certs\") pod \"nova-api-0\" (UID: \"d2a9696d-15a2-484f-b917-ecc9b30644ac\") " pod="openstack/nova-api-0" Nov 22 03:12:26 crc kubenswrapper[4922]: I1122 03:12:26.997268 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2a9696d-15a2-484f-b917-ecc9b30644ac-config-data\") pod \"nova-api-0\" (UID: \"d2a9696d-15a2-484f-b917-ecc9b30644ac\") " pod="openstack/nova-api-0" Nov 22 03:12:27 crc kubenswrapper[4922]: I1122 03:12:27.098588 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f124e2e-3c52-46dc-8bb4-e931372a83eb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6f124e2e-3c52-46dc-8bb4-e931372a83eb\") " pod="openstack/ceilometer-0" Nov 22 03:12:27 crc kubenswrapper[4922]: I1122 03:12:27.098634 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6f124e2e-3c52-46dc-8bb4-e931372a83eb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6f124e2e-3c52-46dc-8bb4-e931372a83eb\") " pod="openstack/ceilometer-0" Nov 22 03:12:27 crc kubenswrapper[4922]: I1122 03:12:27.098650 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f124e2e-3c52-46dc-8bb4-e931372a83eb-config-data\") pod \"ceilometer-0\" (UID: \"6f124e2e-3c52-46dc-8bb4-e931372a83eb\") " pod="openstack/ceilometer-0" Nov 22 03:12:27 crc kubenswrapper[4922]: I1122 03:12:27.098679 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2a9696d-15a2-484f-b917-ecc9b30644ac-logs\") pod \"nova-api-0\" (UID: \"d2a9696d-15a2-484f-b917-ecc9b30644ac\") " pod="openstack/nova-api-0" Nov 22 03:12:27 crc kubenswrapper[4922]: I1122 03:12:27.098698 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2a9696d-15a2-484f-b917-ecc9b30644ac-combined-ca-bundle\") 
pod \"nova-api-0\" (UID: \"d2a9696d-15a2-484f-b917-ecc9b30644ac\") " pod="openstack/nova-api-0" Nov 22 03:12:27 crc kubenswrapper[4922]: I1122 03:12:27.098720 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f124e2e-3c52-46dc-8bb4-e931372a83eb-scripts\") pod \"ceilometer-0\" (UID: \"6f124e2e-3c52-46dc-8bb4-e931372a83eb\") " pod="openstack/ceilometer-0" Nov 22 03:12:27 crc kubenswrapper[4922]: I1122 03:12:27.098737 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2a9696d-15a2-484f-b917-ecc9b30644ac-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d2a9696d-15a2-484f-b917-ecc9b30644ac\") " pod="openstack/nova-api-0" Nov 22 03:12:27 crc kubenswrapper[4922]: I1122 03:12:27.098757 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85tz5\" (UniqueName: \"kubernetes.io/projected/6f124e2e-3c52-46dc-8bb4-e931372a83eb-kube-api-access-85tz5\") pod \"ceilometer-0\" (UID: \"6f124e2e-3c52-46dc-8bb4-e931372a83eb\") " pod="openstack/ceilometer-0" Nov 22 03:12:27 crc kubenswrapper[4922]: I1122 03:12:27.098780 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2a9696d-15a2-484f-b917-ecc9b30644ac-public-tls-certs\") pod \"nova-api-0\" (UID: \"d2a9696d-15a2-484f-b917-ecc9b30644ac\") " pod="openstack/nova-api-0" Nov 22 03:12:27 crc kubenswrapper[4922]: I1122 03:12:27.098808 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2a9696d-15a2-484f-b917-ecc9b30644ac-config-data\") pod \"nova-api-0\" (UID: \"d2a9696d-15a2-484f-b917-ecc9b30644ac\") " pod="openstack/nova-api-0" Nov 22 03:12:27 crc kubenswrapper[4922]: I1122 03:12:27.098835 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f124e2e-3c52-46dc-8bb4-e931372a83eb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6f124e2e-3c52-46dc-8bb4-e931372a83eb\") " pod="openstack/ceilometer-0" Nov 22 03:12:27 crc kubenswrapper[4922]: I1122 03:12:27.098896 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkf5h\" (UniqueName: \"kubernetes.io/projected/d2a9696d-15a2-484f-b917-ecc9b30644ac-kube-api-access-dkf5h\") pod \"nova-api-0\" (UID: \"d2a9696d-15a2-484f-b917-ecc9b30644ac\") " pod="openstack/nova-api-0" Nov 22 03:12:27 crc kubenswrapper[4922]: I1122 03:12:27.098915 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f124e2e-3c52-46dc-8bb4-e931372a83eb-log-httpd\") pod \"ceilometer-0\" (UID: \"6f124e2e-3c52-46dc-8bb4-e931372a83eb\") " pod="openstack/ceilometer-0" Nov 22 03:12:27 crc kubenswrapper[4922]: I1122 03:12:27.098944 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f124e2e-3c52-46dc-8bb4-e931372a83eb-run-httpd\") pod \"ceilometer-0\" (UID: \"6f124e2e-3c52-46dc-8bb4-e931372a83eb\") " pod="openstack/ceilometer-0" Nov 22 03:12:27 crc kubenswrapper[4922]: I1122 03:12:27.099330 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f124e2e-3c52-46dc-8bb4-e931372a83eb-run-httpd\") pod \"ceilometer-0\" 
(UID: \"6f124e2e-3c52-46dc-8bb4-e931372a83eb\") " pod="openstack/ceilometer-0" Nov 22 03:12:27 crc kubenswrapper[4922]: I1122 03:12:27.100114 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2a9696d-15a2-484f-b917-ecc9b30644ac-logs\") pod \"nova-api-0\" (UID: \"d2a9696d-15a2-484f-b917-ecc9b30644ac\") " pod="openstack/nova-api-0" Nov 22 03:12:27 crc kubenswrapper[4922]: I1122 03:12:27.100223 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f124e2e-3c52-46dc-8bb4-e931372a83eb-log-httpd\") pod \"ceilometer-0\" (UID: \"6f124e2e-3c52-46dc-8bb4-e931372a83eb\") " pod="openstack/ceilometer-0" Nov 22 03:12:27 crc kubenswrapper[4922]: I1122 03:12:27.104192 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f124e2e-3c52-46dc-8bb4-e931372a83eb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6f124e2e-3c52-46dc-8bb4-e931372a83eb\") " pod="openstack/ceilometer-0" Nov 22 03:12:27 crc kubenswrapper[4922]: I1122 03:12:27.104568 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2a9696d-15a2-484f-b917-ecc9b30644ac-public-tls-certs\") pod \"nova-api-0\" (UID: \"d2a9696d-15a2-484f-b917-ecc9b30644ac\") " pod="openstack/nova-api-0" Nov 22 03:12:27 crc kubenswrapper[4922]: I1122 03:12:27.104909 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6f124e2e-3c52-46dc-8bb4-e931372a83eb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6f124e2e-3c52-46dc-8bb4-e931372a83eb\") " pod="openstack/ceilometer-0" Nov 22 03:12:27 crc kubenswrapper[4922]: I1122 03:12:27.105054 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2a9696d-15a2-484f-b917-ecc9b30644ac-config-data\") pod \"nova-api-0\" (UID: \"d2a9696d-15a2-484f-b917-ecc9b30644ac\") " pod="openstack/nova-api-0" Nov 22 03:12:27 crc kubenswrapper[4922]: I1122 03:12:27.105440 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f124e2e-3c52-46dc-8bb4-e931372a83eb-scripts\") pod \"ceilometer-0\" (UID: \"6f124e2e-3c52-46dc-8bb4-e931372a83eb\") " pod="openstack/ceilometer-0" Nov 22 03:12:27 crc kubenswrapper[4922]: I1122 03:12:27.105487 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f124e2e-3c52-46dc-8bb4-e931372a83eb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6f124e2e-3c52-46dc-8bb4-e931372a83eb\") " pod="openstack/ceilometer-0" Nov 22 03:12:27 crc kubenswrapper[4922]: I1122 03:12:27.106259 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2a9696d-15a2-484f-b917-ecc9b30644ac-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d2a9696d-15a2-484f-b917-ecc9b30644ac\") " pod="openstack/nova-api-0" Nov 22 03:12:27 crc kubenswrapper[4922]: I1122 03:12:27.114104 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2a9696d-15a2-484f-b917-ecc9b30644ac-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d2a9696d-15a2-484f-b917-ecc9b30644ac\") " pod="openstack/nova-api-0" Nov 22 03:12:27 crc 
kubenswrapper[4922]: I1122 03:12:27.129824 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkf5h\" (UniqueName: \"kubernetes.io/projected/d2a9696d-15a2-484f-b917-ecc9b30644ac-kube-api-access-dkf5h\") pod \"nova-api-0\" (UID: \"d2a9696d-15a2-484f-b917-ecc9b30644ac\") " pod="openstack/nova-api-0" Nov 22 03:12:27 crc kubenswrapper[4922]: I1122 03:12:27.141505 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f124e2e-3c52-46dc-8bb4-e931372a83eb-config-data\") pod \"ceilometer-0\" (UID: \"6f124e2e-3c52-46dc-8bb4-e931372a83eb\") " pod="openstack/ceilometer-0" Nov 22 03:12:27 crc kubenswrapper[4922]: I1122 03:12:27.146179 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85tz5\" (UniqueName: \"kubernetes.io/projected/6f124e2e-3c52-46dc-8bb4-e931372a83eb-kube-api-access-85tz5\") pod \"ceilometer-0\" (UID: \"6f124e2e-3c52-46dc-8bb4-e931372a83eb\") " pod="openstack/ceilometer-0" Nov 22 03:12:27 crc kubenswrapper[4922]: I1122 03:12:27.192763 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 03:12:27 crc kubenswrapper[4922]: I1122 03:12:27.208558 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 22 03:12:27 crc kubenswrapper[4922]: I1122 03:12:27.358327 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30ee4416-e52c-453f-bd19-5c28949860e2" path="/var/lib/kubelet/pods/30ee4416-e52c-453f-bd19-5c28949860e2/volumes" Nov 22 03:12:27 crc kubenswrapper[4922]: I1122 03:12:27.361420 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="490ccaff-a6f3-4e90-853f-74530b743392" path="/var/lib/kubelet/pods/490ccaff-a6f3-4e90-853f-74530b743392/volumes" Nov 22 03:12:27 crc kubenswrapper[4922]: I1122 03:12:27.728934 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 03:12:27 crc kubenswrapper[4922]: W1122 03:12:27.743594 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f124e2e_3c52_46dc_8bb4_e931372a83eb.slice/crio-13a643d27145d35d6a11635063c639af3ae4e6a87ed2396a1306fc643e618a84 WatchSource:0}: Error finding container 13a643d27145d35d6a11635063c639af3ae4e6a87ed2396a1306fc643e618a84: Status 404 returned error can't find the container with id 13a643d27145d35d6a11635063c639af3ae4e6a87ed2396a1306fc643e618a84 Nov 22 03:12:27 crc kubenswrapper[4922]: I1122 03:12:27.770538 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f124e2e-3c52-46dc-8bb4-e931372a83eb","Type":"ContainerStarted","Data":"13a643d27145d35d6a11635063c639af3ae4e6a87ed2396a1306fc643e618a84"} Nov 22 03:12:27 crc kubenswrapper[4922]: I1122 03:12:27.806012 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 22 03:12:27 crc kubenswrapper[4922]: W1122 03:12:27.813403 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2a9696d_15a2_484f_b917_ecc9b30644ac.slice/crio-87d31ca44cbd41352dee3c5109243ecebb40e114358818555b1c0fd162789f04 WatchSource:0}: Error finding container 87d31ca44cbd41352dee3c5109243ecebb40e114358818555b1c0fd162789f04: Status 404 returned error can't find the container with id 87d31ca44cbd41352dee3c5109243ecebb40e114358818555b1c0fd162789f04 Nov 
Nov 22 03:12:28 crc kubenswrapper[4922]: I1122 03:12:28.786891 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f124e2e-3c52-46dc-8bb4-e931372a83eb","Type":"ContainerStarted","Data":"870a4b67e520d473b30a9deab7fb66faf7e9d012a26d9e12b2a023d47f6744fe"}
Nov 22 03:12:28 crc kubenswrapper[4922]: I1122 03:12:28.788974 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d2a9696d-15a2-484f-b917-ecc9b30644ac","Type":"ContainerStarted","Data":"8544d059b2433686930f1d823355cbe1998c54dbdb4c39f3d8c5cc270dac7ec1"}
Nov 22 03:12:28 crc kubenswrapper[4922]: I1122 03:12:28.789044 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d2a9696d-15a2-484f-b917-ecc9b30644ac","Type":"ContainerStarted","Data":"db9ad40a3e0493ed3f36c601d53c811baaf0cd0feb7ad8fa7fb9cad8bb2fb1f6"}
Nov 22 03:12:28 crc kubenswrapper[4922]: I1122 03:12:28.789065 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d2a9696d-15a2-484f-b917-ecc9b30644ac","Type":"ContainerStarted","Data":"87d31ca44cbd41352dee3c5109243ecebb40e114358818555b1c0fd162789f04"}
Nov 22 03:12:28 crc kubenswrapper[4922]: I1122 03:12:28.823410 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.823390702 podStartE2EDuration="2.823390702s" podCreationTimestamp="2025-11-22 03:12:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:12:28.814456428 +0000 UTC m=+1184.852978320" watchObservedRunningTime="2025-11-22 03:12:28.823390702 +0000 UTC m=+1184.861912594"
Nov 22 03:12:29 crc kubenswrapper[4922]: I1122 03:12:29.804183 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f124e2e-3c52-46dc-8bb4-e931372a83eb","Type":"ContainerStarted","Data":"e87fb80552e8c9f71547693bedb02798249bc7d23a2b9626811a3f726d469327"}
Nov 22 03:12:29 crc kubenswrapper[4922]: I1122 03:12:29.946089 4922 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Nov 22 03:12:30 crc kubenswrapper[4922]: I1122 03:12:30.089815 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0"
Nov 22 03:12:30 crc kubenswrapper[4922]: I1122 03:12:30.112009 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0"
Nov 22 03:12:30 crc kubenswrapper[4922]: I1122 03:12:30.194972 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b856c5697-zfbgm"
Nov 22 03:12:30 crc kubenswrapper[4922]: I1122 03:12:30.270365 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-gzqf2"]
Nov 22 03:12:30 crc kubenswrapper[4922]: I1122 03:12:30.270997 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-566b5b7845-gzqf2" podUID="1a37b1ca-22a8-46f0-93a8-d41ac6fa407e" containerName="dnsmasq-dns" containerID="cri-o://5e3fe0173c097deab6955e184404673175c9385dacd5472b97e131f0d2081f0b" gracePeriod=10
Nov 22 03:12:30 crc kubenswrapper[4922]: I1122 03:12:30.817040 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f124e2e-3c52-46dc-8bb4-e931372a83eb","Type":"ContainerStarted","Data":"80f3c6edbcae8cec2067b4a77ed102ed12782e11395cb883e59520eebfb0c11a"}
event={"ID":"6f124e2e-3c52-46dc-8bb4-e931372a83eb","Type":"ContainerStarted","Data":"80f3c6edbcae8cec2067b4a77ed102ed12782e11395cb883e59520eebfb0c11a"} Nov 22 03:12:30 crc kubenswrapper[4922]: I1122 03:12:30.820804 4922 generic.go:334] "Generic (PLEG): container finished" podID="1a37b1ca-22a8-46f0-93a8-d41ac6fa407e" containerID="5e3fe0173c097deab6955e184404673175c9385dacd5472b97e131f0d2081f0b" exitCode=0 Nov 22 03:12:30 crc kubenswrapper[4922]: I1122 03:12:30.822068 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-gzqf2" event={"ID":"1a37b1ca-22a8-46f0-93a8-d41ac6fa407e","Type":"ContainerDied","Data":"5e3fe0173c097deab6955e184404673175c9385dacd5472b97e131f0d2081f0b"} Nov 22 03:12:30 crc kubenswrapper[4922]: I1122 03:12:30.822120 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-gzqf2" event={"ID":"1a37b1ca-22a8-46f0-93a8-d41ac6fa407e","Type":"ContainerDied","Data":"4e821fe3f0f765edc8677799770547e537cccf511d02423653e9e4fb0bacbe55"} Nov 22 03:12:30 crc kubenswrapper[4922]: I1122 03:12:30.822134 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e821fe3f0f765edc8677799770547e537cccf511d02423653e9e4fb0bacbe55" Nov 22 03:12:30 crc kubenswrapper[4922]: I1122 03:12:30.838891 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Nov 22 03:12:30 crc kubenswrapper[4922]: I1122 03:12:30.912827 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-gzqf2" Nov 22 03:12:30 crc kubenswrapper[4922]: I1122 03:12:30.984944 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g74k6\" (UniqueName: \"kubernetes.io/projected/1a37b1ca-22a8-46f0-93a8-d41ac6fa407e-kube-api-access-g74k6\") pod \"1a37b1ca-22a8-46f0-93a8-d41ac6fa407e\" (UID: \"1a37b1ca-22a8-46f0-93a8-d41ac6fa407e\") " Nov 22 03:12:30 crc kubenswrapper[4922]: I1122 03:12:30.985027 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a37b1ca-22a8-46f0-93a8-d41ac6fa407e-config\") pod \"1a37b1ca-22a8-46f0-93a8-d41ac6fa407e\" (UID: \"1a37b1ca-22a8-46f0-93a8-d41ac6fa407e\") " Nov 22 03:12:30 crc kubenswrapper[4922]: I1122 03:12:30.985077 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1a37b1ca-22a8-46f0-93a8-d41ac6fa407e-ovsdbserver-sb\") pod \"1a37b1ca-22a8-46f0-93a8-d41ac6fa407e\" (UID: \"1a37b1ca-22a8-46f0-93a8-d41ac6fa407e\") " Nov 22 03:12:30 crc kubenswrapper[4922]: I1122 03:12:30.985095 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1a37b1ca-22a8-46f0-93a8-d41ac6fa407e-ovsdbserver-nb\") pod \"1a37b1ca-22a8-46f0-93a8-d41ac6fa407e\" (UID: \"1a37b1ca-22a8-46f0-93a8-d41ac6fa407e\") " Nov 22 03:12:30 crc kubenswrapper[4922]: I1122 03:12:30.985169 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a37b1ca-22a8-46f0-93a8-d41ac6fa407e-dns-svc\") pod \"1a37b1ca-22a8-46f0-93a8-d41ac6fa407e\" (UID: \"1a37b1ca-22a8-46f0-93a8-d41ac6fa407e\") " Nov 22 03:12:30 crc kubenswrapper[4922]: I1122 03:12:30.993108 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/1a37b1ca-22a8-46f0-93a8-d41ac6fa407e-kube-api-access-g74k6" (OuterVolumeSpecName: "kube-api-access-g74k6") pod "1a37b1ca-22a8-46f0-93a8-d41ac6fa407e" (UID: "1a37b1ca-22a8-46f0-93a8-d41ac6fa407e"). InnerVolumeSpecName "kube-api-access-g74k6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:12:31 crc kubenswrapper[4922]: I1122 03:12:31.078046 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a37b1ca-22a8-46f0-93a8-d41ac6fa407e-config" (OuterVolumeSpecName: "config") pod "1a37b1ca-22a8-46f0-93a8-d41ac6fa407e" (UID: "1a37b1ca-22a8-46f0-93a8-d41ac6fa407e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:12:31 crc kubenswrapper[4922]: I1122 03:12:31.084013 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a37b1ca-22a8-46f0-93a8-d41ac6fa407e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1a37b1ca-22a8-46f0-93a8-d41ac6fa407e" (UID: "1a37b1ca-22a8-46f0-93a8-d41ac6fa407e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:12:31 crc kubenswrapper[4922]: I1122 03:12:31.084210 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a37b1ca-22a8-46f0-93a8-d41ac6fa407e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1a37b1ca-22a8-46f0-93a8-d41ac6fa407e" (UID: "1a37b1ca-22a8-46f0-93a8-d41ac6fa407e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:12:31 crc kubenswrapper[4922]: I1122 03:12:31.085466 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a37b1ca-22a8-46f0-93a8-d41ac6fa407e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1a37b1ca-22a8-46f0-93a8-d41ac6fa407e" (UID: "1a37b1ca-22a8-46f0-93a8-d41ac6fa407e"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:12:31 crc kubenswrapper[4922]: I1122 03:12:31.087720 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1a37b1ca-22a8-46f0-93a8-d41ac6fa407e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:31 crc kubenswrapper[4922]: I1122 03:12:31.088001 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1a37b1ca-22a8-46f0-93a8-d41ac6fa407e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:31 crc kubenswrapper[4922]: I1122 03:12:31.088011 4922 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a37b1ca-22a8-46f0-93a8-d41ac6fa407e-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:31 crc kubenswrapper[4922]: I1122 03:12:31.088020 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g74k6\" (UniqueName: \"kubernetes.io/projected/1a37b1ca-22a8-46f0-93a8-d41ac6fa407e-kube-api-access-g74k6\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:31 crc kubenswrapper[4922]: I1122 03:12:31.088031 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a37b1ca-22a8-46f0-93a8-d41ac6fa407e-config\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:31 crc kubenswrapper[4922]: I1122 03:12:31.089231 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-vj8vq"] Nov 22 03:12:31 crc kubenswrapper[4922]: E1122 03:12:31.089615 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a37b1ca-22a8-46f0-93a8-d41ac6fa407e" containerName="init" Nov 22 03:12:31 crc kubenswrapper[4922]: I1122 03:12:31.089634 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a37b1ca-22a8-46f0-93a8-d41ac6fa407e" containerName="init" Nov 22 03:12:31 crc kubenswrapper[4922]: E1122 03:12:31.089652 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a37b1ca-22a8-46f0-93a8-d41ac6fa407e" containerName="dnsmasq-dns" Nov 22 03:12:31 crc kubenswrapper[4922]: I1122 03:12:31.089660 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a37b1ca-22a8-46f0-93a8-d41ac6fa407e" containerName="dnsmasq-dns" Nov 22 03:12:31 crc kubenswrapper[4922]: I1122 03:12:31.089886 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a37b1ca-22a8-46f0-93a8-d41ac6fa407e" containerName="dnsmasq-dns" Nov 22 03:12:31 crc kubenswrapper[4922]: I1122 03:12:31.090475 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vj8vq" Nov 22 03:12:31 crc kubenswrapper[4922]: I1122 03:12:31.092478 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Nov 22 03:12:31 crc kubenswrapper[4922]: I1122 03:12:31.092690 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Nov 22 03:12:31 crc kubenswrapper[4922]: I1122 03:12:31.097665 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-vj8vq"] Nov 22 03:12:31 crc kubenswrapper[4922]: I1122 03:12:31.189630 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3c79e72-6196-4e01-b0a5-8baf2bafbb0f-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-vj8vq\" (UID: \"a3c79e72-6196-4e01-b0a5-8baf2bafbb0f\") " pod="openstack/nova-cell1-cell-mapping-vj8vq" Nov 22 03:12:31 crc kubenswrapper[4922]: I1122 03:12:31.189690 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3c79e72-6196-4e01-b0a5-8baf2bafbb0f-scripts\") pod \"nova-cell1-cell-mapping-vj8vq\" (UID: \"a3c79e72-6196-4e01-b0a5-8baf2bafbb0f\") " pod="openstack/nova-cell1-cell-mapping-vj8vq" Nov 22 03:12:31 crc kubenswrapper[4922]: I1122 03:12:31.189725 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwzhv\" (UniqueName: \"kubernetes.io/projected/a3c79e72-6196-4e01-b0a5-8baf2bafbb0f-kube-api-access-vwzhv\") pod \"nova-cell1-cell-mapping-vj8vq\" (UID: \"a3c79e72-6196-4e01-b0a5-8baf2bafbb0f\") " pod="openstack/nova-cell1-cell-mapping-vj8vq" Nov 22 03:12:31 crc kubenswrapper[4922]: I1122 03:12:31.189741 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3c79e72-6196-4e01-b0a5-8baf2bafbb0f-config-data\") pod \"nova-cell1-cell-mapping-vj8vq\" (UID: \"a3c79e72-6196-4e01-b0a5-8baf2bafbb0f\") " pod="openstack/nova-cell1-cell-mapping-vj8vq" Nov 22 03:12:31 crc kubenswrapper[4922]: I1122 03:12:31.290772 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3c79e72-6196-4e01-b0a5-8baf2bafbb0f-scripts\") pod \"nova-cell1-cell-mapping-vj8vq\" (UID: \"a3c79e72-6196-4e01-b0a5-8baf2bafbb0f\") " pod="openstack/nova-cell1-cell-mapping-vj8vq" Nov 22 03:12:31 crc kubenswrapper[4922]: I1122 03:12:31.290995 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwzhv\" (UniqueName: \"kubernetes.io/projected/a3c79e72-6196-4e01-b0a5-8baf2bafbb0f-kube-api-access-vwzhv\") pod \"nova-cell1-cell-mapping-vj8vq\" (UID: \"a3c79e72-6196-4e01-b0a5-8baf2bafbb0f\") " pod="openstack/nova-cell1-cell-mapping-vj8vq" Nov 22 03:12:31 crc kubenswrapper[4922]: I1122 03:12:31.291129 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3c79e72-6196-4e01-b0a5-8baf2bafbb0f-config-data\") pod \"nova-cell1-cell-mapping-vj8vq\" (UID: \"a3c79e72-6196-4e01-b0a5-8baf2bafbb0f\") " pod="openstack/nova-cell1-cell-mapping-vj8vq" Nov 22 03:12:31 crc kubenswrapper[4922]: I1122 03:12:31.291385 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a3c79e72-6196-4e01-b0a5-8baf2bafbb0f-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-vj8vq\" (UID: \"a3c79e72-6196-4e01-b0a5-8baf2bafbb0f\") " pod="openstack/nova-cell1-cell-mapping-vj8vq" Nov 22 03:12:31 crc kubenswrapper[4922]: I1122 03:12:31.303584 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3c79e72-6196-4e01-b0a5-8baf2bafbb0f-scripts\") pod \"nova-cell1-cell-mapping-vj8vq\" (UID: \"a3c79e72-6196-4e01-b0a5-8baf2bafbb0f\") " pod="openstack/nova-cell1-cell-mapping-vj8vq" Nov 22 03:12:31 crc kubenswrapper[4922]: I1122 03:12:31.303628 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3c79e72-6196-4e01-b0a5-8baf2bafbb0f-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-vj8vq\" (UID: \"a3c79e72-6196-4e01-b0a5-8baf2bafbb0f\") " pod="openstack/nova-cell1-cell-mapping-vj8vq" Nov 22 03:12:31 crc kubenswrapper[4922]: I1122 03:12:31.303659 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3c79e72-6196-4e01-b0a5-8baf2bafbb0f-config-data\") pod \"nova-cell1-cell-mapping-vj8vq\" (UID: \"a3c79e72-6196-4e01-b0a5-8baf2bafbb0f\") " pod="openstack/nova-cell1-cell-mapping-vj8vq" Nov 22 03:12:31 crc kubenswrapper[4922]: I1122 03:12:31.310792 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwzhv\" (UniqueName: \"kubernetes.io/projected/a3c79e72-6196-4e01-b0a5-8baf2bafbb0f-kube-api-access-vwzhv\") pod \"nova-cell1-cell-mapping-vj8vq\" (UID: \"a3c79e72-6196-4e01-b0a5-8baf2bafbb0f\") " pod="openstack/nova-cell1-cell-mapping-vj8vq" Nov 22 03:12:31 crc kubenswrapper[4922]: I1122 03:12:31.484084 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vj8vq" Nov 22 03:12:31 crc kubenswrapper[4922]: I1122 03:12:31.839353 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-gzqf2" Nov 22 03:12:31 crc kubenswrapper[4922]: I1122 03:12:31.840924 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f124e2e-3c52-46dc-8bb4-e931372a83eb","Type":"ContainerStarted","Data":"430ff11b99526bfac61a4ae87d2e4c9bfbc3b703140f4279d0b366b457e90a7c"} Nov 22 03:12:31 crc kubenswrapper[4922]: I1122 03:12:31.840962 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 22 03:12:31 crc kubenswrapper[4922]: I1122 03:12:31.880314 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.610743619 podStartE2EDuration="5.880297132s" podCreationTimestamp="2025-11-22 03:12:26 +0000 UTC" firstStartedPulling="2025-11-22 03:12:27.751215401 +0000 UTC m=+1183.789737313" lastFinishedPulling="2025-11-22 03:12:31.020768944 +0000 UTC m=+1187.059290826" observedRunningTime="2025-11-22 03:12:31.866328367 +0000 UTC m=+1187.904850259" watchObservedRunningTime="2025-11-22 03:12:31.880297132 +0000 UTC m=+1187.918819024" Nov 22 03:12:31 crc kubenswrapper[4922]: I1122 03:12:31.884460 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-gzqf2"] Nov 22 03:12:31 crc kubenswrapper[4922]: I1122 03:12:31.890473 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-gzqf2"] Nov 22 03:12:32 crc kubenswrapper[4922]: I1122 03:12:32.036460 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-vj8vq"] Nov 22 03:12:32 crc kubenswrapper[4922]: I1122 03:12:32.850043 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vj8vq" event={"ID":"a3c79e72-6196-4e01-b0a5-8baf2bafbb0f","Type":"ContainerStarted","Data":"3c04f5fbc0ec3d803529fc07fd9b51790a8ab8d7deebcaad32653894f1db4e27"} Nov 22 03:12:32 crc kubenswrapper[4922]: I1122 03:12:32.850358 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vj8vq" event={"ID":"a3c79e72-6196-4e01-b0a5-8baf2bafbb0f","Type":"ContainerStarted","Data":"0dc6b884b50b9b5f0e14b6a63079e7c085413e542ee8385639ff667def5da4d9"} Nov 22 03:12:32 crc kubenswrapper[4922]: I1122 03:12:32.869601 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-vj8vq" podStartSLOduration=1.8695851879999998 podStartE2EDuration="1.869585188s" podCreationTimestamp="2025-11-22 03:12:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:12:32.866828362 +0000 UTC m=+1188.905350254" watchObservedRunningTime="2025-11-22 03:12:32.869585188 +0000 UTC m=+1188.908107100" Nov 22 03:12:33 crc kubenswrapper[4922]: I1122 03:12:33.313270 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a37b1ca-22a8-46f0-93a8-d41ac6fa407e" path="/var/lib/kubelet/pods/1a37b1ca-22a8-46f0-93a8-d41ac6fa407e/volumes" Nov 22 03:12:36 crc kubenswrapper[4922]: I1122 03:12:36.918790 4922 generic.go:334] "Generic (PLEG): container finished" podID="a3c79e72-6196-4e01-b0a5-8baf2bafbb0f" containerID="3c04f5fbc0ec3d803529fc07fd9b51790a8ab8d7deebcaad32653894f1db4e27" exitCode=0 Nov 22 03:12:36 crc kubenswrapper[4922]: I1122 03:12:36.918900 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vj8vq" 
event={"ID":"a3c79e72-6196-4e01-b0a5-8baf2bafbb0f","Type":"ContainerDied","Data":"3c04f5fbc0ec3d803529fc07fd9b51790a8ab8d7deebcaad32653894f1db4e27"} Nov 22 03:12:37 crc kubenswrapper[4922]: I1122 03:12:37.209147 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 22 03:12:37 crc kubenswrapper[4922]: I1122 03:12:37.209443 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 22 03:12:38 crc kubenswrapper[4922]: I1122 03:12:38.229094 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d2a9696d-15a2-484f-b917-ecc9b30644ac" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.185:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 22 03:12:38 crc kubenswrapper[4922]: I1122 03:12:38.229904 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d2a9696d-15a2-484f-b917-ecc9b30644ac" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.185:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 22 03:12:38 crc kubenswrapper[4922]: I1122 03:12:38.373205 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vj8vq" Nov 22 03:12:38 crc kubenswrapper[4922]: I1122 03:12:38.437888 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3c79e72-6196-4e01-b0a5-8baf2bafbb0f-combined-ca-bundle\") pod \"a3c79e72-6196-4e01-b0a5-8baf2bafbb0f\" (UID: \"a3c79e72-6196-4e01-b0a5-8baf2bafbb0f\") " Nov 22 03:12:38 crc kubenswrapper[4922]: I1122 03:12:38.437981 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3c79e72-6196-4e01-b0a5-8baf2bafbb0f-config-data\") pod \"a3c79e72-6196-4e01-b0a5-8baf2bafbb0f\" (UID: \"a3c79e72-6196-4e01-b0a5-8baf2bafbb0f\") " Nov 22 03:12:38 crc kubenswrapper[4922]: I1122 03:12:38.438133 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwzhv\" (UniqueName: \"kubernetes.io/projected/a3c79e72-6196-4e01-b0a5-8baf2bafbb0f-kube-api-access-vwzhv\") pod \"a3c79e72-6196-4e01-b0a5-8baf2bafbb0f\" (UID: \"a3c79e72-6196-4e01-b0a5-8baf2bafbb0f\") " Nov 22 03:12:38 crc kubenswrapper[4922]: I1122 03:12:38.438288 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3c79e72-6196-4e01-b0a5-8baf2bafbb0f-scripts\") pod \"a3c79e72-6196-4e01-b0a5-8baf2bafbb0f\" (UID: \"a3c79e72-6196-4e01-b0a5-8baf2bafbb0f\") " Nov 22 03:12:38 crc kubenswrapper[4922]: I1122 03:12:38.446960 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3c79e72-6196-4e01-b0a5-8baf2bafbb0f-scripts" (OuterVolumeSpecName: "scripts") pod "a3c79e72-6196-4e01-b0a5-8baf2bafbb0f" (UID: "a3c79e72-6196-4e01-b0a5-8baf2bafbb0f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:12:38 crc kubenswrapper[4922]: I1122 03:12:38.462732 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3c79e72-6196-4e01-b0a5-8baf2bafbb0f-kube-api-access-vwzhv" (OuterVolumeSpecName: "kube-api-access-vwzhv") pod "a3c79e72-6196-4e01-b0a5-8baf2bafbb0f" (UID: "a3c79e72-6196-4e01-b0a5-8baf2bafbb0f"). InnerVolumeSpecName "kube-api-access-vwzhv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:12:38 crc kubenswrapper[4922]: I1122 03:12:38.488398 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3c79e72-6196-4e01-b0a5-8baf2bafbb0f-config-data" (OuterVolumeSpecName: "config-data") pod "a3c79e72-6196-4e01-b0a5-8baf2bafbb0f" (UID: "a3c79e72-6196-4e01-b0a5-8baf2bafbb0f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:12:38 crc kubenswrapper[4922]: I1122 03:12:38.499797 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3c79e72-6196-4e01-b0a5-8baf2bafbb0f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a3c79e72-6196-4e01-b0a5-8baf2bafbb0f" (UID: "a3c79e72-6196-4e01-b0a5-8baf2bafbb0f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:12:38 crc kubenswrapper[4922]: I1122 03:12:38.542156 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwzhv\" (UniqueName: \"kubernetes.io/projected/a3c79e72-6196-4e01-b0a5-8baf2bafbb0f-kube-api-access-vwzhv\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:38 crc kubenswrapper[4922]: I1122 03:12:38.542240 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3c79e72-6196-4e01-b0a5-8baf2bafbb0f-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:38 crc kubenswrapper[4922]: I1122 03:12:38.542256 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3c79e72-6196-4e01-b0a5-8baf2bafbb0f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:38 crc kubenswrapper[4922]: I1122 03:12:38.542269 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3c79e72-6196-4e01-b0a5-8baf2bafbb0f-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:38 crc kubenswrapper[4922]: I1122 03:12:38.943883 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vj8vq" event={"ID":"a3c79e72-6196-4e01-b0a5-8baf2bafbb0f","Type":"ContainerDied","Data":"0dc6b884b50b9b5f0e14b6a63079e7c085413e542ee8385639ff667def5da4d9"} Nov 22 03:12:38 crc kubenswrapper[4922]: I1122 03:12:38.943917 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0dc6b884b50b9b5f0e14b6a63079e7c085413e542ee8385639ff667def5da4d9" Nov 22 03:12:38 crc kubenswrapper[4922]: I1122 03:12:38.944032 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vj8vq" Nov 22 03:12:39 crc kubenswrapper[4922]: I1122 03:12:39.145464 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 03:12:39 crc kubenswrapper[4922]: I1122 03:12:39.146442 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="361c95c5-7eba-4a1a-9341-4e4da5eb8848" containerName="nova-scheduler-scheduler" containerID="cri-o://ab9b717fe37a855875b201e1c34e6d3706553b19c461bef0244fce0ec1b35ce7" gracePeriod=30 Nov 22 03:12:39 crc kubenswrapper[4922]: I1122 03:12:39.154536 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 22 03:12:39 crc kubenswrapper[4922]: I1122 03:12:39.154802 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d2a9696d-15a2-484f-b917-ecc9b30644ac" containerName="nova-api-log" containerID="cri-o://db9ad40a3e0493ed3f36c601d53c811baaf0cd0feb7ad8fa7fb9cad8bb2fb1f6" gracePeriod=30 Nov 22 03:12:39 crc kubenswrapper[4922]: I1122 03:12:39.154971 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d2a9696d-15a2-484f-b917-ecc9b30644ac" containerName="nova-api-api" containerID="cri-o://8544d059b2433686930f1d823355cbe1998c54dbdb4c39f3d8c5cc270dac7ec1" gracePeriod=30 Nov 22 03:12:39 crc kubenswrapper[4922]: I1122 03:12:39.217356 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 03:12:39 crc kubenswrapper[4922]: I1122 03:12:39.217908 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="696c4507-7b7c-4b17-8428-0ade9ffb17ed" containerName="nova-metadata-log" containerID="cri-o://c8dc6ebdde5299e8b9ae6461cb7344bc7a2f34f328a5df33c4d13432931e9ed5" gracePeriod=30 Nov 22 03:12:39 crc kubenswrapper[4922]: I1122 03:12:39.218415 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="696c4507-7b7c-4b17-8428-0ade9ffb17ed" containerName="nova-metadata-metadata" containerID="cri-o://d77fbf6725f3eaae7d31fc9019b32533f293e1dd190defb5563c01a36d37687c" gracePeriod=30 Nov 22 03:12:39 crc kubenswrapper[4922]: I1122 03:12:39.954246 4922 generic.go:334] "Generic (PLEG): container finished" podID="696c4507-7b7c-4b17-8428-0ade9ffb17ed" containerID="c8dc6ebdde5299e8b9ae6461cb7344bc7a2f34f328a5df33c4d13432931e9ed5" exitCode=143 Nov 22 03:12:39 crc kubenswrapper[4922]: I1122 03:12:39.954304 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"696c4507-7b7c-4b17-8428-0ade9ffb17ed","Type":"ContainerDied","Data":"c8dc6ebdde5299e8b9ae6461cb7344bc7a2f34f328a5df33c4d13432931e9ed5"} Nov 22 03:12:39 crc kubenswrapper[4922]: I1122 03:12:39.956326 4922 generic.go:334] "Generic (PLEG): container finished" podID="d2a9696d-15a2-484f-b917-ecc9b30644ac" containerID="db9ad40a3e0493ed3f36c601d53c811baaf0cd0feb7ad8fa7fb9cad8bb2fb1f6" exitCode=143 Nov 22 03:12:39 crc kubenswrapper[4922]: I1122 03:12:39.956347 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d2a9696d-15a2-484f-b917-ecc9b30644ac","Type":"ContainerDied","Data":"db9ad40a3e0493ed3f36c601d53c811baaf0cd0feb7ad8fa7fb9cad8bb2fb1f6"} Nov 22 03:12:40 crc kubenswrapper[4922]: E1122 03:12:40.685020 4922 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command 
Nov 22 03:12:40 crc kubenswrapper[4922]: E1122 03:12:40.687096 4922 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ab9b717fe37a855875b201e1c34e6d3706553b19c461bef0244fce0ec1b35ce7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Nov 22 03:12:40 crc kubenswrapper[4922]: E1122 03:12:40.689096 4922 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ab9b717fe37a855875b201e1c34e6d3706553b19c461bef0244fce0ec1b35ce7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Nov 22 03:12:40 crc kubenswrapper[4922]: E1122 03:12:40.689322 4922 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="361c95c5-7eba-4a1a-9341-4e4da5eb8848" containerName="nova-scheduler-scheduler"
Nov 22 03:12:42 crc kubenswrapper[4922]: I1122 03:12:42.690246 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="696c4507-7b7c-4b17-8428-0ade9ffb17ed" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.177:8775/\": read tcp 10.217.0.2:58476->10.217.0.177:8775: read: connection reset by peer"
Nov 22 03:12:42 crc kubenswrapper[4922]: I1122 03:12:42.690304 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="696c4507-7b7c-4b17-8428-0ade9ffb17ed" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.177:8775/\": read tcp 10.217.0.2:58478->10.217.0.177:8775: read: connection reset by peer"
Nov 22 03:12:42 crc kubenswrapper[4922]: I1122 03:12:42.994158 4922 generic.go:334] "Generic (PLEG): container finished" podID="696c4507-7b7c-4b17-8428-0ade9ffb17ed" containerID="d77fbf6725f3eaae7d31fc9019b32533f293e1dd190defb5563c01a36d37687c" exitCode=0
Nov 22 03:12:42 crc kubenswrapper[4922]: I1122 03:12:42.994222 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"696c4507-7b7c-4b17-8428-0ade9ffb17ed","Type":"ContainerDied","Data":"d77fbf6725f3eaae7d31fc9019b32533f293e1dd190defb5563c01a36d37687c"}
Nov 22 03:12:43 crc kubenswrapper[4922]: I1122 03:12:43.925838 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 03:12:43 crc kubenswrapper[4922]: I1122 03:12:43.952093 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/696c4507-7b7c-4b17-8428-0ade9ffb17ed-nova-metadata-tls-certs\") pod \"696c4507-7b7c-4b17-8428-0ade9ffb17ed\" (UID: \"696c4507-7b7c-4b17-8428-0ade9ffb17ed\") " Nov 22 03:12:43 crc kubenswrapper[4922]: I1122 03:12:43.952187 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/696c4507-7b7c-4b17-8428-0ade9ffb17ed-combined-ca-bundle\") pod \"696c4507-7b7c-4b17-8428-0ade9ffb17ed\" (UID: \"696c4507-7b7c-4b17-8428-0ade9ffb17ed\") " Nov 22 03:12:43 crc kubenswrapper[4922]: I1122 03:12:43.952252 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vlvf\" (UniqueName: \"kubernetes.io/projected/696c4507-7b7c-4b17-8428-0ade9ffb17ed-kube-api-access-4vlvf\") pod \"696c4507-7b7c-4b17-8428-0ade9ffb17ed\" (UID: \"696c4507-7b7c-4b17-8428-0ade9ffb17ed\") " Nov 22 03:12:43 crc kubenswrapper[4922]: I1122 03:12:43.952288 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/696c4507-7b7c-4b17-8428-0ade9ffb17ed-logs\") pod \"696c4507-7b7c-4b17-8428-0ade9ffb17ed\" (UID: \"696c4507-7b7c-4b17-8428-0ade9ffb17ed\") " Nov 22 03:12:43 crc kubenswrapper[4922]: I1122 03:12:43.952444 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/696c4507-7b7c-4b17-8428-0ade9ffb17ed-config-data\") pod \"696c4507-7b7c-4b17-8428-0ade9ffb17ed\" (UID: \"696c4507-7b7c-4b17-8428-0ade9ffb17ed\") " Nov 22 03:12:43 crc kubenswrapper[4922]: I1122 03:12:43.954814 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/696c4507-7b7c-4b17-8428-0ade9ffb17ed-logs" (OuterVolumeSpecName: "logs") pod "696c4507-7b7c-4b17-8428-0ade9ffb17ed" (UID: "696c4507-7b7c-4b17-8428-0ade9ffb17ed"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.002901 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/696c4507-7b7c-4b17-8428-0ade9ffb17ed-kube-api-access-4vlvf" (OuterVolumeSpecName: "kube-api-access-4vlvf") pod "696c4507-7b7c-4b17-8428-0ade9ffb17ed" (UID: "696c4507-7b7c-4b17-8428-0ade9ffb17ed"). InnerVolumeSpecName "kube-api-access-4vlvf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.022179 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/696c4507-7b7c-4b17-8428-0ade9ffb17ed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "696c4507-7b7c-4b17-8428-0ade9ffb17ed" (UID: "696c4507-7b7c-4b17-8428-0ade9ffb17ed"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.024510 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"696c4507-7b7c-4b17-8428-0ade9ffb17ed","Type":"ContainerDied","Data":"dbf589d0852d4e8495ae2976c778d12088adcac341536f267173d798e7dbf815"} Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.024611 4922 scope.go:117] "RemoveContainer" containerID="d77fbf6725f3eaae7d31fc9019b32533f293e1dd190defb5563c01a36d37687c" Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.024527 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.029591 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/696c4507-7b7c-4b17-8428-0ade9ffb17ed-config-data" (OuterVolumeSpecName: "config-data") pod "696c4507-7b7c-4b17-8428-0ade9ffb17ed" (UID: "696c4507-7b7c-4b17-8428-0ade9ffb17ed"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.036177 4922 generic.go:334] "Generic (PLEG): container finished" podID="d2a9696d-15a2-484f-b917-ecc9b30644ac" containerID="8544d059b2433686930f1d823355cbe1998c54dbdb4c39f3d8c5cc270dac7ec1" exitCode=0 Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.036223 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d2a9696d-15a2-484f-b917-ecc9b30644ac","Type":"ContainerDied","Data":"8544d059b2433686930f1d823355cbe1998c54dbdb4c39f3d8c5cc270dac7ec1"} Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.054931 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/696c4507-7b7c-4b17-8428-0ade9ffb17ed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.054956 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vlvf\" (UniqueName: \"kubernetes.io/projected/696c4507-7b7c-4b17-8428-0ade9ffb17ed-kube-api-access-4vlvf\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.054970 4922 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/696c4507-7b7c-4b17-8428-0ade9ffb17ed-logs\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.054980 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/696c4507-7b7c-4b17-8428-0ade9ffb17ed-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.057092 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/696c4507-7b7c-4b17-8428-0ade9ffb17ed-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "696c4507-7b7c-4b17-8428-0ade9ffb17ed" (UID: "696c4507-7b7c-4b17-8428-0ade9ffb17ed"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.123812 4922 scope.go:117] "RemoveContainer" containerID="c8dc6ebdde5299e8b9ae6461cb7344bc7a2f34f328a5df33c4d13432931e9ed5" Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.156151 4922 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/696c4507-7b7c-4b17-8428-0ade9ffb17ed-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.207899 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.257333 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2a9696d-15a2-484f-b917-ecc9b30644ac-combined-ca-bundle\") pod \"d2a9696d-15a2-484f-b917-ecc9b30644ac\" (UID: \"d2a9696d-15a2-484f-b917-ecc9b30644ac\") " Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.257705 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2a9696d-15a2-484f-b917-ecc9b30644ac-logs\") pod \"d2a9696d-15a2-484f-b917-ecc9b30644ac\" (UID: \"d2a9696d-15a2-484f-b917-ecc9b30644ac\") " Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.257767 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2a9696d-15a2-484f-b917-ecc9b30644ac-config-data\") pod \"d2a9696d-15a2-484f-b917-ecc9b30644ac\" (UID: \"d2a9696d-15a2-484f-b917-ecc9b30644ac\") " Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.257827 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2a9696d-15a2-484f-b917-ecc9b30644ac-public-tls-certs\") pod \"d2a9696d-15a2-484f-b917-ecc9b30644ac\" (UID: \"d2a9696d-15a2-484f-b917-ecc9b30644ac\") " Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.258152 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2a9696d-15a2-484f-b917-ecc9b30644ac-logs" (OuterVolumeSpecName: "logs") pod "d2a9696d-15a2-484f-b917-ecc9b30644ac" (UID: "d2a9696d-15a2-484f-b917-ecc9b30644ac"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.258259 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkf5h\" (UniqueName: \"kubernetes.io/projected/d2a9696d-15a2-484f-b917-ecc9b30644ac-kube-api-access-dkf5h\") pod \"d2a9696d-15a2-484f-b917-ecc9b30644ac\" (UID: \"d2a9696d-15a2-484f-b917-ecc9b30644ac\") " Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.258314 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2a9696d-15a2-484f-b917-ecc9b30644ac-internal-tls-certs\") pod \"d2a9696d-15a2-484f-b917-ecc9b30644ac\" (UID: \"d2a9696d-15a2-484f-b917-ecc9b30644ac\") " Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.258724 4922 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2a9696d-15a2-484f-b917-ecc9b30644ac-logs\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.261182 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2a9696d-15a2-484f-b917-ecc9b30644ac-kube-api-access-dkf5h" (OuterVolumeSpecName: "kube-api-access-dkf5h") pod "d2a9696d-15a2-484f-b917-ecc9b30644ac" (UID: "d2a9696d-15a2-484f-b917-ecc9b30644ac"). InnerVolumeSpecName "kube-api-access-dkf5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.286812 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2a9696d-15a2-484f-b917-ecc9b30644ac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d2a9696d-15a2-484f-b917-ecc9b30644ac" (UID: "d2a9696d-15a2-484f-b917-ecc9b30644ac"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.292676 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2a9696d-15a2-484f-b917-ecc9b30644ac-config-data" (OuterVolumeSpecName: "config-data") pod "d2a9696d-15a2-484f-b917-ecc9b30644ac" (UID: "d2a9696d-15a2-484f-b917-ecc9b30644ac"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.304596 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2a9696d-15a2-484f-b917-ecc9b30644ac-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d2a9696d-15a2-484f-b917-ecc9b30644ac" (UID: "d2a9696d-15a2-484f-b917-ecc9b30644ac"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.315265 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2a9696d-15a2-484f-b917-ecc9b30644ac-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d2a9696d-15a2-484f-b917-ecc9b30644ac" (UID: "d2a9696d-15a2-484f-b917-ecc9b30644ac"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.352211 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.358246 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.360267 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2a9696d-15a2-484f-b917-ecc9b30644ac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.360296 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2a9696d-15a2-484f-b917-ecc9b30644ac-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.360306 4922 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2a9696d-15a2-484f-b917-ecc9b30644ac-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.360315 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkf5h\" (UniqueName: \"kubernetes.io/projected/d2a9696d-15a2-484f-b917-ecc9b30644ac-kube-api-access-dkf5h\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.360324 4922 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2a9696d-15a2-484f-b917-ecc9b30644ac-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.388162 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 22 03:12:44 crc kubenswrapper[4922]: E1122 03:12:44.388801 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3c79e72-6196-4e01-b0a5-8baf2bafbb0f" containerName="nova-manage" Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.388831 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3c79e72-6196-4e01-b0a5-8baf2bafbb0f" containerName="nova-manage" Nov 22 03:12:44 crc kubenswrapper[4922]: E1122 03:12:44.388893 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="696c4507-7b7c-4b17-8428-0ade9ffb17ed" containerName="nova-metadata-metadata" Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.388906 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="696c4507-7b7c-4b17-8428-0ade9ffb17ed" containerName="nova-metadata-metadata" Nov 22 03:12:44 crc kubenswrapper[4922]: E1122 03:12:44.388929 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="696c4507-7b7c-4b17-8428-0ade9ffb17ed" containerName="nova-metadata-log" Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.388941 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="696c4507-7b7c-4b17-8428-0ade9ffb17ed" containerName="nova-metadata-log" Nov 22 03:12:44 crc kubenswrapper[4922]: E1122 03:12:44.388960 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2a9696d-15a2-484f-b917-ecc9b30644ac" containerName="nova-api-log" Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.388972 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2a9696d-15a2-484f-b917-ecc9b30644ac" containerName="nova-api-log" Nov 22 03:12:44 crc kubenswrapper[4922]: E1122 03:12:44.389001 4922 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="d2a9696d-15a2-484f-b917-ecc9b30644ac" containerName="nova-api-api" Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.389013 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2a9696d-15a2-484f-b917-ecc9b30644ac" containerName="nova-api-api" Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.389330 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2a9696d-15a2-484f-b917-ecc9b30644ac" containerName="nova-api-api" Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.389367 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2a9696d-15a2-484f-b917-ecc9b30644ac" containerName="nova-api-log" Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.389395 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="696c4507-7b7c-4b17-8428-0ade9ffb17ed" containerName="nova-metadata-metadata" Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.389418 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3c79e72-6196-4e01-b0a5-8baf2bafbb0f" containerName="nova-manage" Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.389462 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="696c4507-7b7c-4b17-8428-0ade9ffb17ed" containerName="nova-metadata-log" Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.391097 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.393194 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.393218 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.397321 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.462247 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlvnq\" (UniqueName: \"kubernetes.io/projected/4f4f2580-fec5-4d83-80a3-225f0bcc355a-kube-api-access-jlvnq\") pod \"nova-metadata-0\" (UID: \"4f4f2580-fec5-4d83-80a3-225f0bcc355a\") " pod="openstack/nova-metadata-0" Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.462742 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f4f2580-fec5-4d83-80a3-225f0bcc355a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4f4f2580-fec5-4d83-80a3-225f0bcc355a\") " pod="openstack/nova-metadata-0" Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.462947 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f4f2580-fec5-4d83-80a3-225f0bcc355a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4f4f2580-fec5-4d83-80a3-225f0bcc355a\") " pod="openstack/nova-metadata-0" Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.463129 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f4f2580-fec5-4d83-80a3-225f0bcc355a-config-data\") pod \"nova-metadata-0\" (UID: \"4f4f2580-fec5-4d83-80a3-225f0bcc355a\") " pod="openstack/nova-metadata-0" Nov 22 03:12:44 
Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.566041 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f4f2580-fec5-4d83-80a3-225f0bcc355a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4f4f2580-fec5-4d83-80a3-225f0bcc355a\") " pod="openstack/nova-metadata-0"
Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.566229 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f4f2580-fec5-4d83-80a3-225f0bcc355a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4f4f2580-fec5-4d83-80a3-225f0bcc355a\") " pod="openstack/nova-metadata-0"
Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.566336 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f4f2580-fec5-4d83-80a3-225f0bcc355a-logs\") pod \"nova-metadata-0\" (UID: \"4f4f2580-fec5-4d83-80a3-225f0bcc355a\") " pod="openstack/nova-metadata-0"
Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.566374 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f4f2580-fec5-4d83-80a3-225f0bcc355a-config-data\") pod \"nova-metadata-0\" (UID: \"4f4f2580-fec5-4d83-80a3-225f0bcc355a\") " pod="openstack/nova-metadata-0"
Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.566447 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlvnq\" (UniqueName: \"kubernetes.io/projected/4f4f2580-fec5-4d83-80a3-225f0bcc355a-kube-api-access-jlvnq\") pod \"nova-metadata-0\" (UID: \"4f4f2580-fec5-4d83-80a3-225f0bcc355a\") " pod="openstack/nova-metadata-0"
Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.567157 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f4f2580-fec5-4d83-80a3-225f0bcc355a-logs\") pod \"nova-metadata-0\" (UID: \"4f4f2580-fec5-4d83-80a3-225f0bcc355a\") " pod="openstack/nova-metadata-0"
Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.571521 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f4f2580-fec5-4d83-80a3-225f0bcc355a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4f4f2580-fec5-4d83-80a3-225f0bcc355a\") " pod="openstack/nova-metadata-0"
Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.571653 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f4f2580-fec5-4d83-80a3-225f0bcc355a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4f4f2580-fec5-4d83-80a3-225f0bcc355a\") " pod="openstack/nova-metadata-0"
Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.573234 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f4f2580-fec5-4d83-80a3-225f0bcc355a-config-data\") pod \"nova-metadata-0\" (UID: \"4f4f2580-fec5-4d83-80a3-225f0bcc355a\") " pod="openstack/nova-metadata-0"
pod="openstack/nova-metadata-0" Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.596477 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlvnq\" (UniqueName: \"kubernetes.io/projected/4f4f2580-fec5-4d83-80a3-225f0bcc355a-kube-api-access-jlvnq\") pod \"nova-metadata-0\" (UID: \"4f4f2580-fec5-4d83-80a3-225f0bcc355a\") " pod="openstack/nova-metadata-0" Nov 22 03:12:44 crc kubenswrapper[4922]: I1122 03:12:44.715680 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 03:12:45 crc kubenswrapper[4922]: I1122 03:12:45.057246 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d2a9696d-15a2-484f-b917-ecc9b30644ac","Type":"ContainerDied","Data":"87d31ca44cbd41352dee3c5109243ecebb40e114358818555b1c0fd162789f04"} Nov 22 03:12:45 crc kubenswrapper[4922]: I1122 03:12:45.057682 4922 scope.go:117] "RemoveContainer" containerID="8544d059b2433686930f1d823355cbe1998c54dbdb4c39f3d8c5cc270dac7ec1" Nov 22 03:12:45 crc kubenswrapper[4922]: I1122 03:12:45.057287 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 22 03:12:45 crc kubenswrapper[4922]: I1122 03:12:45.061668 4922 generic.go:334] "Generic (PLEG): container finished" podID="361c95c5-7eba-4a1a-9341-4e4da5eb8848" containerID="ab9b717fe37a855875b201e1c34e6d3706553b19c461bef0244fce0ec1b35ce7" exitCode=0 Nov 22 03:12:45 crc kubenswrapper[4922]: I1122 03:12:45.061712 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"361c95c5-7eba-4a1a-9341-4e4da5eb8848","Type":"ContainerDied","Data":"ab9b717fe37a855875b201e1c34e6d3706553b19c461bef0244fce0ec1b35ce7"} Nov 22 03:12:45 crc kubenswrapper[4922]: I1122 03:12:45.108032 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 22 03:12:45 crc kubenswrapper[4922]: I1122 03:12:45.124273 4922 scope.go:117] "RemoveContainer" containerID="db9ad40a3e0493ed3f36c601d53c811baaf0cd0feb7ad8fa7fb9cad8bb2fb1f6" Nov 22 03:12:45 crc kubenswrapper[4922]: I1122 03:12:45.132670 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 22 03:12:45 crc kubenswrapper[4922]: I1122 03:12:45.151766 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 22 03:12:45 crc kubenswrapper[4922]: I1122 03:12:45.156655 4922 util.go:30] "No sandbox for pod can be found. 
Nov 22 03:12:45 crc kubenswrapper[4922]: I1122 03:12:45.159873 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Nov 22 03:12:45 crc kubenswrapper[4922]: I1122 03:12:45.160223 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Nov 22 03:12:45 crc kubenswrapper[4922]: I1122 03:12:45.162386 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Nov 22 03:12:45 crc kubenswrapper[4922]: I1122 03:12:45.163116 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Nov 22 03:12:45 crc kubenswrapper[4922]: I1122 03:12:45.244696 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Nov 22 03:12:45 crc kubenswrapper[4922]: I1122 03:12:45.282737 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0bccfb44-be98-4a57-9b1d-4e6d11cee15d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0bccfb44-be98-4a57-9b1d-4e6d11cee15d\") " pod="openstack/nova-api-0"
Nov 22 03:12:45 crc kubenswrapper[4922]: I1122 03:12:45.282792 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bccfb44-be98-4a57-9b1d-4e6d11cee15d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0bccfb44-be98-4a57-9b1d-4e6d11cee15d\") " pod="openstack/nova-api-0"
Nov 22 03:12:45 crc kubenswrapper[4922]: I1122 03:12:45.282830 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hljzq\" (UniqueName: \"kubernetes.io/projected/0bccfb44-be98-4a57-9b1d-4e6d11cee15d-kube-api-access-hljzq\") pod \"nova-api-0\" (UID: \"0bccfb44-be98-4a57-9b1d-4e6d11cee15d\") " pod="openstack/nova-api-0"
Nov 22 03:12:45 crc kubenswrapper[4922]: I1122 03:12:45.282882 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bccfb44-be98-4a57-9b1d-4e6d11cee15d-config-data\") pod \"nova-api-0\" (UID: \"0bccfb44-be98-4a57-9b1d-4e6d11cee15d\") " pod="openstack/nova-api-0"
Nov 22 03:12:45 crc kubenswrapper[4922]: I1122 03:12:45.282976 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0bccfb44-be98-4a57-9b1d-4e6d11cee15d-public-tls-certs\") pod \"nova-api-0\" (UID: \"0bccfb44-be98-4a57-9b1d-4e6d11cee15d\") " pod="openstack/nova-api-0"
Nov 22 03:12:45 crc kubenswrapper[4922]: I1122 03:12:45.283048 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0bccfb44-be98-4a57-9b1d-4e6d11cee15d-logs\") pod \"nova-api-0\" (UID: \"0bccfb44-be98-4a57-9b1d-4e6d11cee15d\") " pod="openstack/nova-api-0"
Nov 22 03:12:45 crc kubenswrapper[4922]: I1122 03:12:45.316756 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="696c4507-7b7c-4b17-8428-0ade9ffb17ed" path="/var/lib/kubelet/pods/696c4507-7b7c-4b17-8428-0ade9ffb17ed/volumes"
Nov 22 03:12:45 crc kubenswrapper[4922]: I1122 03:12:45.318042 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2a9696d-15a2-484f-b917-ecc9b30644ac" path="/var/lib/kubelet/pods/d2a9696d-15a2-484f-b917-ecc9b30644ac/volumes"
Nov 22 03:12:45 crc kubenswrapper[4922]: I1122 03:12:45.384928 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bccfb44-be98-4a57-9b1d-4e6d11cee15d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0bccfb44-be98-4a57-9b1d-4e6d11cee15d\") " pod="openstack/nova-api-0"
Nov 22 03:12:45 crc kubenswrapper[4922]: I1122 03:12:45.385068 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hljzq\" (UniqueName: \"kubernetes.io/projected/0bccfb44-be98-4a57-9b1d-4e6d11cee15d-kube-api-access-hljzq\") pod \"nova-api-0\" (UID: \"0bccfb44-be98-4a57-9b1d-4e6d11cee15d\") " pod="openstack/nova-api-0"
Nov 22 03:12:45 crc kubenswrapper[4922]: I1122 03:12:45.385113 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bccfb44-be98-4a57-9b1d-4e6d11cee15d-config-data\") pod \"nova-api-0\" (UID: \"0bccfb44-be98-4a57-9b1d-4e6d11cee15d\") " pod="openstack/nova-api-0"
Nov 22 03:12:45 crc kubenswrapper[4922]: I1122 03:12:45.385269 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0bccfb44-be98-4a57-9b1d-4e6d11cee15d-public-tls-certs\") pod \"nova-api-0\" (UID: \"0bccfb44-be98-4a57-9b1d-4e6d11cee15d\") " pod="openstack/nova-api-0"
Nov 22 03:12:45 crc kubenswrapper[4922]: I1122 03:12:45.385334 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0bccfb44-be98-4a57-9b1d-4e6d11cee15d-logs\") pod \"nova-api-0\" (UID: \"0bccfb44-be98-4a57-9b1d-4e6d11cee15d\") " pod="openstack/nova-api-0"
Nov 22 03:12:45 crc kubenswrapper[4922]: I1122 03:12:45.385359 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0bccfb44-be98-4a57-9b1d-4e6d11cee15d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0bccfb44-be98-4a57-9b1d-4e6d11cee15d\") " pod="openstack/nova-api-0"
Nov 22 03:12:45 crc kubenswrapper[4922]: I1122 03:12:45.386713 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0bccfb44-be98-4a57-9b1d-4e6d11cee15d-logs\") pod \"nova-api-0\" (UID: \"0bccfb44-be98-4a57-9b1d-4e6d11cee15d\") " pod="openstack/nova-api-0"
Nov 22 03:12:45 crc kubenswrapper[4922]: I1122 03:12:45.391265 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Nov 22 03:12:45 crc kubenswrapper[4922]: I1122 03:12:45.391573 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Nov 22 03:12:45 crc kubenswrapper[4922]: I1122 03:12:45.391908 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Nov 22 03:12:45 crc kubenswrapper[4922]: I1122 03:12:45.391914 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bccfb44-be98-4a57-9b1d-4e6d11cee15d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0bccfb44-be98-4a57-9b1d-4e6d11cee15d\") " pod="openstack/nova-api-0"
Nov 22 03:12:45 crc kubenswrapper[4922]: I1122 03:12:45.401139 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0bccfb44-be98-4a57-9b1d-4e6d11cee15d-public-tls-certs\") pod \"nova-api-0\" (UID: \"0bccfb44-be98-4a57-9b1d-4e6d11cee15d\") " pod="openstack/nova-api-0"
Nov 22 03:12:45 crc kubenswrapper[4922]: I1122 03:12:45.402758 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bccfb44-be98-4a57-9b1d-4e6d11cee15d-config-data\") pod \"nova-api-0\" (UID: \"0bccfb44-be98-4a57-9b1d-4e6d11cee15d\") " pod="openstack/nova-api-0"
Nov 22 03:12:45 crc kubenswrapper[4922]: I1122 03:12:45.403235 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0bccfb44-be98-4a57-9b1d-4e6d11cee15d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0bccfb44-be98-4a57-9b1d-4e6d11cee15d\") " pod="openstack/nova-api-0"
Nov 22 03:12:45 crc kubenswrapper[4922]: I1122 03:12:45.407750 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hljzq\" (UniqueName: \"kubernetes.io/projected/0bccfb44-be98-4a57-9b1d-4e6d11cee15d-kube-api-access-hljzq\") pod \"nova-api-0\" (UID: \"0bccfb44-be98-4a57-9b1d-4e6d11cee15d\") " pod="openstack/nova-api-0"
Nov 22 03:12:45 crc kubenswrapper[4922]: I1122 03:12:45.485106 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Nov 22 03:12:45 crc kubenswrapper[4922]: I1122 03:12:45.621918 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Nov 22 03:12:45 crc kubenswrapper[4922]: I1122 03:12:45.690994 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/361c95c5-7eba-4a1a-9341-4e4da5eb8848-config-data\") pod \"361c95c5-7eba-4a1a-9341-4e4da5eb8848\" (UID: \"361c95c5-7eba-4a1a-9341-4e4da5eb8848\") "
Nov 22 03:12:45 crc kubenswrapper[4922]: I1122 03:12:45.691054 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361c95c5-7eba-4a1a-9341-4e4da5eb8848-combined-ca-bundle\") pod \"361c95c5-7eba-4a1a-9341-4e4da5eb8848\" (UID: \"361c95c5-7eba-4a1a-9341-4e4da5eb8848\") "
Nov 22 03:12:45 crc kubenswrapper[4922]: I1122 03:12:45.691081 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcb5l\" (UniqueName: \"kubernetes.io/projected/361c95c5-7eba-4a1a-9341-4e4da5eb8848-kube-api-access-hcb5l\") pod \"361c95c5-7eba-4a1a-9341-4e4da5eb8848\" (UID: \"361c95c5-7eba-4a1a-9341-4e4da5eb8848\") "
Nov 22 03:12:45 crc kubenswrapper[4922]: I1122 03:12:45.696801 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/361c95c5-7eba-4a1a-9341-4e4da5eb8848-kube-api-access-hcb5l" (OuterVolumeSpecName: "kube-api-access-hcb5l") pod "361c95c5-7eba-4a1a-9341-4e4da5eb8848" (UID: "361c95c5-7eba-4a1a-9341-4e4da5eb8848"). InnerVolumeSpecName "kube-api-access-hcb5l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 03:12:45 crc kubenswrapper[4922]: I1122 03:12:45.721908 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/361c95c5-7eba-4a1a-9341-4e4da5eb8848-config-data" (OuterVolumeSpecName: "config-data") pod "361c95c5-7eba-4a1a-9341-4e4da5eb8848" (UID: "361c95c5-7eba-4a1a-9341-4e4da5eb8848"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:12:45 crc kubenswrapper[4922]: I1122 03:12:45.722799 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/361c95c5-7eba-4a1a-9341-4e4da5eb8848-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "361c95c5-7eba-4a1a-9341-4e4da5eb8848" (UID: "361c95c5-7eba-4a1a-9341-4e4da5eb8848"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:12:45 crc kubenswrapper[4922]: I1122 03:12:45.794077 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/361c95c5-7eba-4a1a-9341-4e4da5eb8848-config-data\") on node \"crc\" DevicePath \"\""
Nov 22 03:12:45 crc kubenswrapper[4922]: I1122 03:12:45.794124 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361c95c5-7eba-4a1a-9341-4e4da5eb8848-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 03:12:45 crc kubenswrapper[4922]: I1122 03:12:45.794145 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcb5l\" (UniqueName: \"kubernetes.io/projected/361c95c5-7eba-4a1a-9341-4e4da5eb8848-kube-api-access-hcb5l\") on node \"crc\" DevicePath \"\""
Nov 22 03:12:45 crc kubenswrapper[4922]: I1122 03:12:45.950673 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Nov 22 03:12:45 crc kubenswrapper[4922]: W1122 03:12:45.955943 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0bccfb44_be98_4a57_9b1d_4e6d11cee15d.slice/crio-b4a20fbc6ac0a87cb208da03fcb5dfa05f1cd5139590794b2e19fe32d7c9d2fe WatchSource:0}: Error finding container b4a20fbc6ac0a87cb208da03fcb5dfa05f1cd5139590794b2e19fe32d7c9d2fe: Status 404 returned error can't find the container with id b4a20fbc6ac0a87cb208da03fcb5dfa05f1cd5139590794b2e19fe32d7c9d2fe
Nov 22 03:12:46 crc kubenswrapper[4922]: I1122 03:12:46.073253 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"361c95c5-7eba-4a1a-9341-4e4da5eb8848","Type":"ContainerDied","Data":"d463e2736be649724d39b1fb715dbfc729a5b241390c1ee4030c5fb8c42e83d6"}
Nov 22 03:12:46 crc kubenswrapper[4922]: I1122 03:12:46.073324 4922 scope.go:117] "RemoveContainer" containerID="ab9b717fe37a855875b201e1c34e6d3706553b19c461bef0244fce0ec1b35ce7"
Nov 22 03:12:46 crc kubenswrapper[4922]: I1122 03:12:46.073272 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Nov 22 03:12:46 crc kubenswrapper[4922]: I1122 03:12:46.080459 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4f4f2580-fec5-4d83-80a3-225f0bcc355a","Type":"ContainerStarted","Data":"b3ff560221dc0b57befc4872b1b84d9b5afdec4626ab18afb9867590ac106110"}
Nov 22 03:12:46 crc kubenswrapper[4922]: I1122 03:12:46.080505 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4f4f2580-fec5-4d83-80a3-225f0bcc355a","Type":"ContainerStarted","Data":"7e4fcf9c639c8b5f0ef528194591ad16fb301e9385c6a3189429ef675a423c95"}
Nov 22 03:12:46 crc kubenswrapper[4922]: I1122 03:12:46.080517 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4f4f2580-fec5-4d83-80a3-225f0bcc355a","Type":"ContainerStarted","Data":"11f33dbf1a843fe7efc8fd07ba8d4c45d9e544669496b6e398d51a11e2624ea5"}
Nov 22 03:12:46 crc kubenswrapper[4922]: I1122 03:12:46.084225 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0bccfb44-be98-4a57-9b1d-4e6d11cee15d","Type":"ContainerStarted","Data":"b4a20fbc6ac0a87cb208da03fcb5dfa05f1cd5139590794b2e19fe32d7c9d2fe"}
Nov 22 03:12:46 crc kubenswrapper[4922]: I1122 03:12:46.116137 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Nov 22 03:12:46 crc kubenswrapper[4922]: I1122 03:12:46.130373 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Nov 22 03:12:46 crc kubenswrapper[4922]: I1122 03:12:46.146722 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Nov 22 03:12:46 crc kubenswrapper[4922]: E1122 03:12:46.147432 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="361c95c5-7eba-4a1a-9341-4e4da5eb8848" containerName="nova-scheduler-scheduler"
Nov 22 03:12:46 crc kubenswrapper[4922]: I1122 03:12:46.147466 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="361c95c5-7eba-4a1a-9341-4e4da5eb8848" containerName="nova-scheduler-scheduler"
Nov 22 03:12:46 crc kubenswrapper[4922]: I1122 03:12:46.147808 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="361c95c5-7eba-4a1a-9341-4e4da5eb8848" containerName="nova-scheduler-scheduler"
Nov 22 03:12:46 crc kubenswrapper[4922]: I1122 03:12:46.150127 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Nov 22 03:12:46 crc kubenswrapper[4922]: I1122 03:12:46.154006 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Nov 22 03:12:46 crc kubenswrapper[4922]: I1122 03:12:46.160539 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Nov 22 03:12:46 crc kubenswrapper[4922]: I1122 03:12:46.201933 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7d9d\" (UniqueName: \"kubernetes.io/projected/cd673a7b-3830-4c59-bf58-9d6d675f6e40-kube-api-access-x7d9d\") pod \"nova-scheduler-0\" (UID: \"cd673a7b-3830-4c59-bf58-9d6d675f6e40\") " pod="openstack/nova-scheduler-0"
Nov 22 03:12:46 crc kubenswrapper[4922]: I1122 03:12:46.202043 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd673a7b-3830-4c59-bf58-9d6d675f6e40-config-data\") pod \"nova-scheduler-0\" (UID: \"cd673a7b-3830-4c59-bf58-9d6d675f6e40\") " pod="openstack/nova-scheduler-0"
Nov 22 03:12:46 crc kubenswrapper[4922]: I1122 03:12:46.202216 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd673a7b-3830-4c59-bf58-9d6d675f6e40-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"cd673a7b-3830-4c59-bf58-9d6d675f6e40\") " pod="openstack/nova-scheduler-0"
Nov 22 03:12:46 crc kubenswrapper[4922]: I1122 03:12:46.303646 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd673a7b-3830-4c59-bf58-9d6d675f6e40-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"cd673a7b-3830-4c59-bf58-9d6d675f6e40\") " pod="openstack/nova-scheduler-0"
Nov 22 03:12:46 crc kubenswrapper[4922]: I1122 03:12:46.303772 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7d9d\" (UniqueName: \"kubernetes.io/projected/cd673a7b-3830-4c59-bf58-9d6d675f6e40-kube-api-access-x7d9d\") pod \"nova-scheduler-0\" (UID: \"cd673a7b-3830-4c59-bf58-9d6d675f6e40\") " pod="openstack/nova-scheduler-0"
Nov 22 03:12:46 crc kubenswrapper[4922]: I1122 03:12:46.303815 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd673a7b-3830-4c59-bf58-9d6d675f6e40-config-data\") pod \"nova-scheduler-0\" (UID: \"cd673a7b-3830-4c59-bf58-9d6d675f6e40\") " pod="openstack/nova-scheduler-0"
Nov 22 03:12:46 crc kubenswrapper[4922]: I1122 03:12:46.307203 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd673a7b-3830-4c59-bf58-9d6d675f6e40-config-data\") pod \"nova-scheduler-0\" (UID: \"cd673a7b-3830-4c59-bf58-9d6d675f6e40\") " pod="openstack/nova-scheduler-0"
Nov 22 03:12:46 crc kubenswrapper[4922]: I1122 03:12:46.309400 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd673a7b-3830-4c59-bf58-9d6d675f6e40-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"cd673a7b-3830-4c59-bf58-9d6d675f6e40\") " pod="openstack/nova-scheduler-0"
Nov 22 03:12:46 crc kubenswrapper[4922]: I1122 03:12:46.321320 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7d9d\" (UniqueName: \"kubernetes.io/projected/cd673a7b-3830-4c59-bf58-9d6d675f6e40-kube-api-access-x7d9d\") pod \"nova-scheduler-0\" (UID: \"cd673a7b-3830-4c59-bf58-9d6d675f6e40\") " pod="openstack/nova-scheduler-0"
Nov 22 03:12:46 crc kubenswrapper[4922]: I1122 03:12:46.481821 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Nov 22 03:12:47 crc kubenswrapper[4922]: I1122 03:12:47.027160 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Nov 22 03:12:47 crc kubenswrapper[4922]: I1122 03:12:47.102807 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cd673a7b-3830-4c59-bf58-9d6d675f6e40","Type":"ContainerStarted","Data":"b7ea18a2e5eb2c45df4db57d08995fb913683a72a634845a44b30d75ffe77e3d"}
Nov 22 03:12:47 crc kubenswrapper[4922]: I1122 03:12:47.107262 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0bccfb44-be98-4a57-9b1d-4e6d11cee15d","Type":"ContainerStarted","Data":"80b5131c38bb3f546f8af4974aad7a59c603a6d06d21fe9e23400b3d2fd014e7"}
Nov 22 03:12:47 crc kubenswrapper[4922]: I1122 03:12:47.107290 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0bccfb44-be98-4a57-9b1d-4e6d11cee15d","Type":"ContainerStarted","Data":"4a04a8e01dc4a3758fc481a6ceef64a4e2f4ecd9c629af9c5e575e48c6da1776"}
Nov 22 03:12:47 crc kubenswrapper[4922]: I1122 03:12:47.157541 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.157516663 podStartE2EDuration="2.157516663s" podCreationTimestamp="2025-11-22 03:12:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:12:47.127903284 +0000 UTC m=+1203.166425186" watchObservedRunningTime="2025-11-22 03:12:47.157516663 +0000 UTC m=+1203.196038565"
Nov 22 03:12:47 crc kubenswrapper[4922]: I1122 03:12:47.178676 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.178651279 podStartE2EDuration="3.178651279s" podCreationTimestamp="2025-11-22 03:12:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:12:47.165652558 +0000 UTC m=+1203.204174450" watchObservedRunningTime="2025-11-22 03:12:47.178651279 +0000 UTC m=+1203.217173211"
Nov 22 03:12:47 crc kubenswrapper[4922]: I1122 03:12:47.326834 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="361c95c5-7eba-4a1a-9341-4e4da5eb8848" path="/var/lib/kubelet/pods/361c95c5-7eba-4a1a-9341-4e4da5eb8848/volumes"
Nov 22 03:12:48 crc kubenswrapper[4922]: I1122 03:12:48.117294 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cd673a7b-3830-4c59-bf58-9d6d675f6e40","Type":"ContainerStarted","Data":"d9c147ca3cfc53626440e99996b1e541662573b5e20d42d07c0fae7c4dac468d"}
Nov 22 03:12:48 crc kubenswrapper[4922]: I1122 03:12:48.135407 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.135388736 podStartE2EDuration="2.135388736s" podCreationTimestamp="2025-11-22 03:12:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:12:48.133113941 +0000 UTC m=+1204.171635843" watchObservedRunningTime="2025-11-22 03:12:48.135388736 +0000 UTC m=+1204.173910638"
Nov 22 03:12:49 crc kubenswrapper[4922]: I1122 03:12:49.716213 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Nov 22 03:12:49 crc kubenswrapper[4922]: I1122 03:12:49.716523 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Nov 22 03:12:51 crc kubenswrapper[4922]: I1122 03:12:51.482659 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Nov 22 03:12:54 crc kubenswrapper[4922]: I1122 03:12:54.716465 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Nov 22 03:12:54 crc kubenswrapper[4922]: I1122 03:12:54.716949 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Nov 22 03:12:55 crc kubenswrapper[4922]: I1122 03:12:55.485808 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Nov 22 03:12:55 crc kubenswrapper[4922]: I1122 03:12:55.485897 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Nov 22 03:12:55 crc kubenswrapper[4922]: I1122 03:12:55.729002 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="4f4f2580-fec5-4d83-80a3-225f0bcc355a" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.187:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Nov 22 03:12:55 crc kubenswrapper[4922]: I1122 03:12:55.729162 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="4f4f2580-fec5-4d83-80a3-225f0bcc355a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.187:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Nov 22 03:12:56 crc kubenswrapper[4922]: I1122 03:12:56.483199 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Nov 22 03:12:56 crc kubenswrapper[4922]: I1122 03:12:56.500152 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0bccfb44-be98-4a57-9b1d-4e6d11cee15d" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.188:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Nov 22 03:12:56 crc kubenswrapper[4922]: I1122 03:12:56.500572 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0bccfb44-be98-4a57-9b1d-4e6d11cee15d" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.188:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Nov 22 03:12:56 crc kubenswrapper[4922]: I1122 03:12:56.532127 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Nov 22 03:12:57 crc kubenswrapper[4922]: I1122 03:12:57.221266 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Nov 22 03:12:57 crc kubenswrapper[4922]: I1122 03:12:57.292402 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Nov 22 03:13:04 crc kubenswrapper[4922]: I1122 03:13:04.727985 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Nov 22 03:13:04 crc kubenswrapper[4922]: I1122 03:13:04.728466 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Nov 22 03:13:04 crc kubenswrapper[4922]: I1122 03:13:04.741407 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Nov 22 03:13:04 crc kubenswrapper[4922]: I1122 03:13:04.742304 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Nov 22 03:13:05 crc kubenswrapper[4922]: I1122 03:13:05.571345 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Nov 22 03:13:05 crc kubenswrapper[4922]: I1122 03:13:05.571974 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Nov 22 03:13:05 crc kubenswrapper[4922]: I1122 03:13:05.573662 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Nov 22 03:13:05 crc kubenswrapper[4922]: I1122 03:13:05.576099 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Nov 22 03:13:06 crc kubenswrapper[4922]: I1122 03:13:06.337772 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Nov 22 03:13:06 crc kubenswrapper[4922]: I1122 03:13:06.347267 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Nov 22 03:13:14 crc kubenswrapper[4922]: I1122 03:13:14.595702 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Nov 22 03:13:16 crc kubenswrapper[4922]: I1122 03:13:16.096406 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Nov 22 03:13:18 crc kubenswrapper[4922]: I1122 03:13:18.874819 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="161bd1ea-2276-4a95-b0ad-304cc807d13f" containerName="rabbitmq" containerID="cri-o://52cf84fde267e2a743a54be053b20b4f49a217e36e08becc55e6ca0ac57e87d3" gracePeriod=604796
Nov 22 03:13:19 crc kubenswrapper[4922]: I1122 03:13:19.708174 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="161bd1ea-2276-4a95-b0ad-304cc807d13f" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.100:5671: connect: connection refused"
Nov 22 03:13:20 crc kubenswrapper[4922]: I1122 03:13:20.125928 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="11bc873c-bbc5-4033-ad7a-9569c2b6aa76" containerName="rabbitmq" containerID="cri-o://d857507103cdb99da8d8bbf0dbe5fed4149199c4205cfc36c52e86df1b3de131" gracePeriod=604796
Nov 22 03:13:25 crc kubenswrapper[4922]: I1122 03:13:25.528516 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Nov 22 03:13:25 crc kubenswrapper[4922]: I1122 03:13:25.532143 4922 generic.go:334] "Generic (PLEG): container finished" podID="161bd1ea-2276-4a95-b0ad-304cc807d13f" containerID="52cf84fde267e2a743a54be053b20b4f49a217e36e08becc55e6ca0ac57e87d3" exitCode=0
Nov 22 03:13:25 crc kubenswrapper[4922]: I1122 03:13:25.532236 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"161bd1ea-2276-4a95-b0ad-304cc807d13f","Type":"ContainerDied","Data":"52cf84fde267e2a743a54be053b20b4f49a217e36e08becc55e6ca0ac57e87d3"}
Nov 22 03:13:25 crc kubenswrapper[4922]: I1122 03:13:25.532277 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"161bd1ea-2276-4a95-b0ad-304cc807d13f","Type":"ContainerDied","Data":"921d46bffde48af9b5e317203caf31711b2506c93ad522372d7afd13c5369d38"}
Nov 22 03:13:25 crc kubenswrapper[4922]: I1122 03:13:25.532304 4922 scope.go:117] "RemoveContainer" containerID="52cf84fde267e2a743a54be053b20b4f49a217e36e08becc55e6ca0ac57e87d3"
Nov 22 03:13:25 crc kubenswrapper[4922]: I1122 03:13:25.565537 4922 scope.go:117] "RemoveContainer" containerID="4fcb44b5034d155200f58d1b2c38a0ff75e5cc566eceb71c9cb60aa7ee6249cd"
Nov 22 03:13:25 crc kubenswrapper[4922]: I1122 03:13:25.635008 4922 scope.go:117] "RemoveContainer" containerID="52cf84fde267e2a743a54be053b20b4f49a217e36e08becc55e6ca0ac57e87d3"
Nov 22 03:13:25 crc kubenswrapper[4922]: E1122 03:13:25.635503 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52cf84fde267e2a743a54be053b20b4f49a217e36e08becc55e6ca0ac57e87d3\": container with ID starting with 52cf84fde267e2a743a54be053b20b4f49a217e36e08becc55e6ca0ac57e87d3 not found: ID does not exist" containerID="52cf84fde267e2a743a54be053b20b4f49a217e36e08becc55e6ca0ac57e87d3"
Nov 22 03:13:25 crc kubenswrapper[4922]: I1122 03:13:25.635567 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52cf84fde267e2a743a54be053b20b4f49a217e36e08becc55e6ca0ac57e87d3"} err="failed to get container status \"52cf84fde267e2a743a54be053b20b4f49a217e36e08becc55e6ca0ac57e87d3\": rpc error: code = NotFound desc = could not find container \"52cf84fde267e2a743a54be053b20b4f49a217e36e08becc55e6ca0ac57e87d3\": container with ID starting with 52cf84fde267e2a743a54be053b20b4f49a217e36e08becc55e6ca0ac57e87d3 not found: ID does not exist"
Nov 22 03:13:25 crc kubenswrapper[4922]: I1122 03:13:25.635606 4922 scope.go:117] "RemoveContainer" containerID="4fcb44b5034d155200f58d1b2c38a0ff75e5cc566eceb71c9cb60aa7ee6249cd"
Nov 22 03:13:25 crc kubenswrapper[4922]: E1122 03:13:25.635992 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fcb44b5034d155200f58d1b2c38a0ff75e5cc566eceb71c9cb60aa7ee6249cd\": container with ID starting with 4fcb44b5034d155200f58d1b2c38a0ff75e5cc566eceb71c9cb60aa7ee6249cd not found: ID does not exist" containerID="4fcb44b5034d155200f58d1b2c38a0ff75e5cc566eceb71c9cb60aa7ee6249cd"
Nov 22 03:13:25 crc kubenswrapper[4922]: I1122 03:13:25.636020 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fcb44b5034d155200f58d1b2c38a0ff75e5cc566eceb71c9cb60aa7ee6249cd"} err="failed to get container status \"4fcb44b5034d155200f58d1b2c38a0ff75e5cc566eceb71c9cb60aa7ee6249cd\": rpc error: code = NotFound desc = could not find container \"4fcb44b5034d155200f58d1b2c38a0ff75e5cc566eceb71c9cb60aa7ee6249cd\": container with ID starting with 4fcb44b5034d155200f58d1b2c38a0ff75e5cc566eceb71c9cb60aa7ee6249cd not found: ID does not exist"
Nov 22 03:13:25 crc kubenswrapper[4922]: I1122 03:13:25.647928 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/161bd1ea-2276-4a95-b0ad-304cc807d13f-rabbitmq-tls\") pod \"161bd1ea-2276-4a95-b0ad-304cc807d13f\" (UID: \"161bd1ea-2276-4a95-b0ad-304cc807d13f\") "
Nov 22 03:13:25 crc kubenswrapper[4922]: I1122 03:13:25.648004 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/161bd1ea-2276-4a95-b0ad-304cc807d13f-rabbitmq-erlang-cookie\") pod \"161bd1ea-2276-4a95-b0ad-304cc807d13f\" (UID: \"161bd1ea-2276-4a95-b0ad-304cc807d13f\") "
Nov 22 03:13:25 crc kubenswrapper[4922]: I1122 03:13:25.648554 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/161bd1ea-2276-4a95-b0ad-304cc807d13f-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "161bd1ea-2276-4a95-b0ad-304cc807d13f" (UID: "161bd1ea-2276-4a95-b0ad-304cc807d13f"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 03:13:25 crc kubenswrapper[4922]: I1122 03:13:25.648592 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/161bd1ea-2276-4a95-b0ad-304cc807d13f-rabbitmq-confd\") pod \"161bd1ea-2276-4a95-b0ad-304cc807d13f\" (UID: \"161bd1ea-2276-4a95-b0ad-304cc807d13f\") "
Nov 22 03:13:25 crc kubenswrapper[4922]: I1122 03:13:25.648622 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/161bd1ea-2276-4a95-b0ad-304cc807d13f-pod-info\") pod \"161bd1ea-2276-4a95-b0ad-304cc807d13f\" (UID: \"161bd1ea-2276-4a95-b0ad-304cc807d13f\") "
Nov 22 03:13:25 crc kubenswrapper[4922]: I1122 03:13:25.648650 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/161bd1ea-2276-4a95-b0ad-304cc807d13f-erlang-cookie-secret\") pod \"161bd1ea-2276-4a95-b0ad-304cc807d13f\" (UID: \"161bd1ea-2276-4a95-b0ad-304cc807d13f\") "
Nov 22 03:13:25 crc kubenswrapper[4922]: I1122 03:13:25.648667 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"161bd1ea-2276-4a95-b0ad-304cc807d13f\" (UID: \"161bd1ea-2276-4a95-b0ad-304cc807d13f\") "
Nov 22 03:13:25 crc kubenswrapper[4922]: I1122 03:13:25.648713 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/161bd1ea-2276-4a95-b0ad-304cc807d13f-server-conf\") pod \"161bd1ea-2276-4a95-b0ad-304cc807d13f\" (UID: \"161bd1ea-2276-4a95-b0ad-304cc807d13f\") "
Nov 22 03:13:25 crc kubenswrapper[4922]: I1122 03:13:25.648729 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjtd8\" (UniqueName: \"kubernetes.io/projected/161bd1ea-2276-4a95-b0ad-304cc807d13f-kube-api-access-wjtd8\") pod \"161bd1ea-2276-4a95-b0ad-304cc807d13f\" (UID: \"161bd1ea-2276-4a95-b0ad-304cc807d13f\") "
Nov 22 03:13:25 crc kubenswrapper[4922]: I1122 03:13:25.648771 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/161bd1ea-2276-4a95-b0ad-304cc807d13f-config-data\") pod \"161bd1ea-2276-4a95-b0ad-304cc807d13f\" (UID: \"161bd1ea-2276-4a95-b0ad-304cc807d13f\") "
Nov 22 03:13:25 crc kubenswrapper[4922]: I1122 03:13:25.648789 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/161bd1ea-2276-4a95-b0ad-304cc807d13f-rabbitmq-plugins\") pod \"161bd1ea-2276-4a95-b0ad-304cc807d13f\" (UID: \"161bd1ea-2276-4a95-b0ad-304cc807d13f\") "
Nov 22 03:13:25 crc kubenswrapper[4922]: I1122 03:13:25.648862 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/161bd1ea-2276-4a95-b0ad-304cc807d13f-plugins-conf\") pod \"161bd1ea-2276-4a95-b0ad-304cc807d13f\" (UID: \"161bd1ea-2276-4a95-b0ad-304cc807d13f\") "
Nov 22 03:13:25 crc kubenswrapper[4922]: I1122 03:13:25.649199 4922 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/161bd1ea-2276-4a95-b0ad-304cc807d13f-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Nov 22 03:13:25 crc kubenswrapper[4922]: I1122 03:13:25.649555 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/161bd1ea-2276-4a95-b0ad-304cc807d13f-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "161bd1ea-2276-4a95-b0ad-304cc807d13f" (UID: "161bd1ea-2276-4a95-b0ad-304cc807d13f"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 03:13:25 crc kubenswrapper[4922]: I1122 03:13:25.654795 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/161bd1ea-2276-4a95-b0ad-304cc807d13f-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "161bd1ea-2276-4a95-b0ad-304cc807d13f" (UID: "161bd1ea-2276-4a95-b0ad-304cc807d13f"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 03:13:25 crc kubenswrapper[4922]: I1122 03:13:25.658771 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/161bd1ea-2276-4a95-b0ad-304cc807d13f-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "161bd1ea-2276-4a95-b0ad-304cc807d13f" (UID: "161bd1ea-2276-4a95-b0ad-304cc807d13f"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 03:13:25 crc kubenswrapper[4922]: I1122 03:13:25.660383 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/161bd1ea-2276-4a95-b0ad-304cc807d13f-kube-api-access-wjtd8" (OuterVolumeSpecName: "kube-api-access-wjtd8") pod "161bd1ea-2276-4a95-b0ad-304cc807d13f" (UID: "161bd1ea-2276-4a95-b0ad-304cc807d13f"). InnerVolumeSpecName "kube-api-access-wjtd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 03:13:25 crc kubenswrapper[4922]: I1122 03:13:25.662948 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/161bd1ea-2276-4a95-b0ad-304cc807d13f-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "161bd1ea-2276-4a95-b0ad-304cc807d13f" (UID: "161bd1ea-2276-4a95-b0ad-304cc807d13f"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:13:25 crc kubenswrapper[4922]: I1122 03:13:25.675637 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "161bd1ea-2276-4a95-b0ad-304cc807d13f" (UID: "161bd1ea-2276-4a95-b0ad-304cc807d13f"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Nov 22 03:13:25 crc kubenswrapper[4922]: I1122 03:13:25.675643 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/161bd1ea-2276-4a95-b0ad-304cc807d13f-pod-info" (OuterVolumeSpecName: "pod-info") pod "161bd1ea-2276-4a95-b0ad-304cc807d13f" (UID: "161bd1ea-2276-4a95-b0ad-304cc807d13f"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Nov 22 03:13:25 crc kubenswrapper[4922]: I1122 03:13:25.697950 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/161bd1ea-2276-4a95-b0ad-304cc807d13f-config-data" (OuterVolumeSpecName: "config-data") pod "161bd1ea-2276-4a95-b0ad-304cc807d13f" (UID: "161bd1ea-2276-4a95-b0ad-304cc807d13f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 03:13:25 crc kubenswrapper[4922]: I1122 03:13:25.713406 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/161bd1ea-2276-4a95-b0ad-304cc807d13f-server-conf" (OuterVolumeSpecName: "server-conf") pod "161bd1ea-2276-4a95-b0ad-304cc807d13f" (UID: "161bd1ea-2276-4a95-b0ad-304cc807d13f"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 03:13:25 crc kubenswrapper[4922]: I1122 03:13:25.763924 4922 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/161bd1ea-2276-4a95-b0ad-304cc807d13f-server-conf\") on node \"crc\" DevicePath \"\""
Nov 22 03:13:25 crc kubenswrapper[4922]: I1122 03:13:25.763964 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjtd8\" (UniqueName: \"kubernetes.io/projected/161bd1ea-2276-4a95-b0ad-304cc807d13f-kube-api-access-wjtd8\") on node \"crc\" DevicePath \"\""
Nov 22 03:13:25 crc kubenswrapper[4922]: I1122 03:13:25.763979 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/161bd1ea-2276-4a95-b0ad-304cc807d13f-config-data\") on node \"crc\" DevicePath \"\""
Nov 22 03:13:25 crc kubenswrapper[4922]: I1122 03:13:25.763990 4922 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/161bd1ea-2276-4a95-b0ad-304cc807d13f-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Nov 22 03:13:25 crc kubenswrapper[4922]: I1122 03:13:25.764002 4922 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/161bd1ea-2276-4a95-b0ad-304cc807d13f-plugins-conf\") on node \"crc\" DevicePath \"\""
Nov 22 03:13:25 crc kubenswrapper[4922]: I1122 03:13:25.764012 4922 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/161bd1ea-2276-4a95-b0ad-304cc807d13f-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Nov 22 03:13:25 crc kubenswrapper[4922]: I1122 03:13:25.764023 4922 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/161bd1ea-2276-4a95-b0ad-304cc807d13f-pod-info\") on node \"crc\" DevicePath \"\""
Nov 22 03:13:25 crc kubenswrapper[4922]: I1122 03:13:25.764034 4922 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/161bd1ea-2276-4a95-b0ad-304cc807d13f-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Nov 22 03:13:25 crc kubenswrapper[4922]: I1122 03:13:25.764069 4922 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" "
Nov 22 03:13:25 crc kubenswrapper[4922]: I1122 03:13:25.789378 4922 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc"
Nov 22 03:13:25 crc kubenswrapper[4922]: I1122 03:13:25.791786 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/161bd1ea-2276-4a95-b0ad-304cc807d13f-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "161bd1ea-2276-4a95-b0ad-304cc807d13f" (UID: "161bd1ea-2276-4a95-b0ad-304cc807d13f"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 03:13:25 crc kubenswrapper[4922]: I1122 03:13:25.866155 4922 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/161bd1ea-2276-4a95-b0ad-304cc807d13f-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Nov 22 03:13:25 crc kubenswrapper[4922]: I1122 03:13:25.866403 4922 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\""
Nov 22 03:13:26 crc kubenswrapper[4922]: I1122 03:13:26.547454 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Nov 22 03:13:26 crc kubenswrapper[4922]: I1122 03:13:26.631035 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Nov 22 03:13:26 crc kubenswrapper[4922]: I1122 03:13:26.642169 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"]
Nov 22 03:13:26 crc kubenswrapper[4922]: I1122 03:13:26.680161 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Nov 22 03:13:26 crc kubenswrapper[4922]: E1122 03:13:26.680590 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="161bd1ea-2276-4a95-b0ad-304cc807d13f" containerName="rabbitmq"
Nov 22 03:13:26 crc kubenswrapper[4922]: I1122 03:13:26.680614 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="161bd1ea-2276-4a95-b0ad-304cc807d13f" containerName="rabbitmq"
Nov 22 03:13:26 crc kubenswrapper[4922]: E1122 03:13:26.680666 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="161bd1ea-2276-4a95-b0ad-304cc807d13f" containerName="setup-container"
Nov 22 03:13:26 crc kubenswrapper[4922]: I1122 03:13:26.680675 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="161bd1ea-2276-4a95-b0ad-304cc807d13f" containerName="setup-container"
Nov 22 03:13:26 crc kubenswrapper[4922]: I1122 03:13:26.680877 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="161bd1ea-2276-4a95-b0ad-304cc807d13f" containerName="rabbitmq"
Nov 22 03:13:26 crc kubenswrapper[4922]: I1122 03:13:26.681815 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Nov 22 03:13:26 crc kubenswrapper[4922]: I1122 03:13:26.684458 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Nov 22 03:13:26 crc kubenswrapper[4922]: I1122 03:13:26.691335 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Nov 22 03:13:26 crc kubenswrapper[4922]: I1122 03:13:26.697679 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Nov 22 03:13:26 crc kubenswrapper[4922]: I1122 03:13:26.697786 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Nov 22 03:13:26 crc kubenswrapper[4922]: I1122 03:13:26.697901 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-28fkg"
Nov 22 03:13:26 crc kubenswrapper[4922]: I1122 03:13:26.697681 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Nov 22 03:13:26 crc kubenswrapper[4922]: I1122 03:13:26.698032 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Nov 22 03:13:26 crc kubenswrapper[4922]: I1122 03:13:26.709405 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Nov 22 03:13:26 crc kubenswrapper[4922]: I1122 03:13:26.792000 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"22048c2c-fb84-4f52-9868-c6f6074fab42\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:13:26 crc kubenswrapper[4922]: I1122 03:13:26.792058 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/22048c2c-fb84-4f52-9868-c6f6074fab42-config-data\") pod \"rabbitmq-server-0\" (UID: \"22048c2c-fb84-4f52-9868-c6f6074fab42\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:13:26 crc kubenswrapper[4922]: I1122 03:13:26.792124 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/22048c2c-fb84-4f52-9868-c6f6074fab42-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"22048c2c-fb84-4f52-9868-c6f6074fab42\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:13:26 crc kubenswrapper[4922]: I1122 03:13:26.792147 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/22048c2c-fb84-4f52-9868-c6f6074fab42-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"22048c2c-fb84-4f52-9868-c6f6074fab42\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:13:26 crc kubenswrapper[4922]: I1122 03:13:26.792204 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/22048c2c-fb84-4f52-9868-c6f6074fab42-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"22048c2c-fb84-4f52-9868-c6f6074fab42\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:13:26 crc kubenswrapper[4922]: I1122 03:13:26.792228 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/22048c2c-fb84-4f52-9868-c6f6074fab42-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"22048c2c-fb84-4f52-9868-c6f6074fab42\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:13:26 crc kubenswrapper[4922]: I1122 03:13:26.792252 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/22048c2c-fb84-4f52-9868-c6f6074fab42-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"22048c2c-fb84-4f52-9868-c6f6074fab42\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:13:26 crc kubenswrapper[4922]: I1122 03:13:26.792275 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/22048c2c-fb84-4f52-9868-c6f6074fab42-pod-info\") pod \"rabbitmq-server-0\" (UID: \"22048c2c-fb84-4f52-9868-c6f6074fab42\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:13:26 crc kubenswrapper[4922]: I1122 03:13:26.792297 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d9dm\" (UniqueName: \"kubernetes.io/projected/22048c2c-fb84-4f52-9868-c6f6074fab42-kube-api-access-8d9dm\") pod \"rabbitmq-server-0\" (UID: \"22048c2c-fb84-4f52-9868-c6f6074fab42\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:13:26 crc kubenswrapper[4922]: I1122 03:13:26.792371 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/22048c2c-fb84-4f52-9868-c6f6074fab42-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"22048c2c-fb84-4f52-9868-c6f6074fab42\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:13:26 crc kubenswrapper[4922]: I1122 03:13:26.792437 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/22048c2c-fb84-4f52-9868-c6f6074fab42-server-conf\") pod \"rabbitmq-server-0\" (UID: \"22048c2c-fb84-4f52-9868-c6f6074fab42\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:13:26 crc kubenswrapper[4922]: E1122 03:13:26.807202 4922 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11bc873c_bbc5_4033_ad7a_9569c2b6aa76.slice/crio-conmon-d857507103cdb99da8d8bbf0dbe5fed4149199c4205cfc36c52e86df1b3de131.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod161bd1ea_2276_4a95_b0ad_304cc807d13f.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod161bd1ea_2276_4a95_b0ad_304cc807d13f.slice/crio-921d46bffde48af9b5e317203caf31711b2506c93ad522372d7afd13c5369d38\": RecentStats: unable to find data in memory cache]"
Nov 22 03:13:26 crc kubenswrapper[4922]: I1122 03:13:26.895780 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"22048c2c-fb84-4f52-9868-c6f6074fab42\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:13:26 crc kubenswrapper[4922]: I1122 03:13:26.896754 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/22048c2c-fb84-4f52-9868-c6f6074fab42-config-data\") pod \"rabbitmq-server-0\" (UID: \"22048c2c-fb84-4f52-9868-c6f6074fab42\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:13:26 crc kubenswrapper[4922]: I1122 03:13:26.896891 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/22048c2c-fb84-4f52-9868-c6f6074fab42-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"22048c2c-fb84-4f52-9868-c6f6074fab42\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:13:26 crc kubenswrapper[4922]: I1122 03:13:26.896961 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/22048c2c-fb84-4f52-9868-c6f6074fab42-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"22048c2c-fb84-4f52-9868-c6f6074fab42\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:13:26 crc kubenswrapper[4922]: I1122 03:13:26.897063 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/22048c2c-fb84-4f52-9868-c6f6074fab42-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"22048c2c-fb84-4f52-9868-c6f6074fab42\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:13:26 crc kubenswrapper[4922]: I1122 03:13:26.897131 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/22048c2c-fb84-4f52-9868-c6f6074fab42-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"22048c2c-fb84-4f52-9868-c6f6074fab42\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:13:26 crc kubenswrapper[4922]: I1122 03:13:26.897201 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/22048c2c-fb84-4f52-9868-c6f6074fab42-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"22048c2c-fb84-4f52-9868-c6f6074fab42\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:13:26 crc kubenswrapper[4922]: I1122 03:13:26.897277 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/22048c2c-fb84-4f52-9868-c6f6074fab42-pod-info\") pod \"rabbitmq-server-0\" (UID: \"22048c2c-fb84-4f52-9868-c6f6074fab42\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:13:26 crc kubenswrapper[4922]: I1122 03:13:26.897349 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8d9dm\" (UniqueName: \"kubernetes.io/projected/22048c2c-fb84-4f52-9868-c6f6074fab42-kube-api-access-8d9dm\") pod \"rabbitmq-server-0\" (UID: \"22048c2c-fb84-4f52-9868-c6f6074fab42\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:13:26 crc kubenswrapper[4922]: I1122 03:13:26.897611 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/22048c2c-fb84-4f52-9868-c6f6074fab42-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"22048c2c-fb84-4f52-9868-c6f6074fab42\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:13:26 crc kubenswrapper[4922]: I1122 03:13:26.896717 4922 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"22048c2c-fb84-4f52-9868-c6f6074fab42\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0"
Nov 22 03:13:26 crc kubenswrapper[4922]: I1122 03:13:26.897892 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/22048c2c-fb84-4f52-9868-c6f6074fab42-server-conf\") pod \"rabbitmq-server-0\" (UID: \"22048c2c-fb84-4f52-9868-c6f6074fab42\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:13:26 crc kubenswrapper[4922]: I1122 03:13:26.898288 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/22048c2c-fb84-4f52-9868-c6f6074fab42-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"22048c2c-fb84-4f52-9868-c6f6074fab42\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:13:26 crc kubenswrapper[4922]: I1122 03:13:26.899161 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/22048c2c-fb84-4f52-9868-c6f6074fab42-config-data\") pod \"rabbitmq-server-0\" (UID: \"22048c2c-fb84-4f52-9868-c6f6074fab42\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:13:26 crc kubenswrapper[4922]: I1122 03:13:26.901432 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/22048c2c-fb84-4f52-9868-c6f6074fab42-server-conf\") pod \"rabbitmq-server-0\" (UID: \"22048c2c-fb84-4f52-9868-c6f6074fab42\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:13:26 crc kubenswrapper[4922]: I1122 03:13:26.901668 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/22048c2c-fb84-4f52-9868-c6f6074fab42-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"22048c2c-fb84-4f52-9868-c6f6074fab42\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:13:26 crc kubenswrapper[4922]: I1122 03:13:26.903026 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/22048c2c-fb84-4f52-9868-c6f6074fab42-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"22048c2c-fb84-4f52-9868-c6f6074fab42\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:13:26 crc kubenswrapper[4922]: I1122 03:13:26.905640 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/22048c2c-fb84-4f52-9868-c6f6074fab42-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"22048c2c-fb84-4f52-9868-c6f6074fab42\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:13:26 crc kubenswrapper[4922]: I1122 03:13:26.905728 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/22048c2c-fb84-4f52-9868-c6f6074fab42-pod-info\") pod \"rabbitmq-server-0\" (UID: \"22048c2c-fb84-4f52-9868-c6f6074fab42\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:13:26 crc kubenswrapper[4922]: I1122 03:13:26.913993 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/22048c2c-fb84-4f52-9868-c6f6074fab42-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"22048c2c-fb84-4f52-9868-c6f6074fab42\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:13:26 crc kubenswrapper[4922]: I1122 03:13:26.916882 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/22048c2c-fb84-4f52-9868-c6f6074fab42-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"22048c2c-fb84-4f52-9868-c6f6074fab42\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:13:26 crc kubenswrapper[4922]: I1122 03:13:26.919492 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d9dm\" (UniqueName: \"kubernetes.io/projected/22048c2c-fb84-4f52-9868-c6f6074fab42-kube-api-access-8d9dm\") pod \"rabbitmq-server-0\" (UID: \"22048c2c-fb84-4f52-9868-c6f6074fab42\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:13:26 crc kubenswrapper[4922]: I1122 03:13:26.929672 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"22048c2c-fb84-4f52-9868-c6f6074fab42\") " pod="openstack/rabbitmq-server-0"
Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.009755 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.040532 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.101414 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xg82p\" (UniqueName: \"kubernetes.io/projected/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-kube-api-access-xg82p\") pod \"11bc873c-bbc5-4033-ad7a-9569c2b6aa76\" (UID: \"11bc873c-bbc5-4033-ad7a-9569c2b6aa76\") "
Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.101462 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-server-conf\") pod \"11bc873c-bbc5-4033-ad7a-9569c2b6aa76\" (UID: \"11bc873c-bbc5-4033-ad7a-9569c2b6aa76\") "
Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.101532 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-rabbitmq-tls\") pod \"11bc873c-bbc5-4033-ad7a-9569c2b6aa76\" (UID: \"11bc873c-bbc5-4033-ad7a-9569c2b6aa76\") "
Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.101593 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-config-data\") pod \"11bc873c-bbc5-4033-ad7a-9569c2b6aa76\" (UID: \"11bc873c-bbc5-4033-ad7a-9569c2b6aa76\") "
Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.101627 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-rabbitmq-erlang-cookie\") pod \"11bc873c-bbc5-4033-ad7a-9569c2b6aa76\" (UID: \"11bc873c-bbc5-4033-ad7a-9569c2b6aa76\") "
Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.101648 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-rabbitmq-plugins\") pod \"11bc873c-bbc5-4033-ad7a-9569c2b6aa76\" (UID: \"11bc873c-bbc5-4033-ad7a-9569c2b6aa76\") "
Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.101703 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-plugins-conf\") pod \"11bc873c-bbc5-4033-ad7a-9569c2b6aa76\" (UID: \"11bc873c-bbc5-4033-ad7a-9569c2b6aa76\") "
Nov 22 03:13:27 crc kubenswrapper[4922]:
I1122 03:13:27.101748 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"11bc873c-bbc5-4033-ad7a-9569c2b6aa76\" (UID: \"11bc873c-bbc5-4033-ad7a-9569c2b6aa76\") " Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.101770 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-rabbitmq-confd\") pod \"11bc873c-bbc5-4033-ad7a-9569c2b6aa76\" (UID: \"11bc873c-bbc5-4033-ad7a-9569c2b6aa76\") " Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.101803 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-erlang-cookie-secret\") pod \"11bc873c-bbc5-4033-ad7a-9569c2b6aa76\" (UID: \"11bc873c-bbc5-4033-ad7a-9569c2b6aa76\") " Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.101884 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-pod-info\") pod \"11bc873c-bbc5-4033-ad7a-9569c2b6aa76\" (UID: \"11bc873c-bbc5-4033-ad7a-9569c2b6aa76\") " Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.106143 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "11bc873c-bbc5-4033-ad7a-9569c2b6aa76" (UID: "11bc873c-bbc5-4033-ad7a-9569c2b6aa76"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.107551 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "11bc873c-bbc5-4033-ad7a-9569c2b6aa76" (UID: "11bc873c-bbc5-4033-ad7a-9569c2b6aa76"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.108412 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "11bc873c-bbc5-4033-ad7a-9569c2b6aa76" (UID: "11bc873c-bbc5-4033-ad7a-9569c2b6aa76"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.108785 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "11bc873c-bbc5-4033-ad7a-9569c2b6aa76" (UID: "11bc873c-bbc5-4033-ad7a-9569c2b6aa76"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.109251 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-pod-info" (OuterVolumeSpecName: "pod-info") pod "11bc873c-bbc5-4033-ad7a-9569c2b6aa76" (UID: "11bc873c-bbc5-4033-ad7a-9569c2b6aa76"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.111958 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "persistence") pod "11bc873c-bbc5-4033-ad7a-9569c2b6aa76" (UID: "11bc873c-bbc5-4033-ad7a-9569c2b6aa76"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.122209 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "11bc873c-bbc5-4033-ad7a-9569c2b6aa76" (UID: "11bc873c-bbc5-4033-ad7a-9569c2b6aa76"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.122453 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-kube-api-access-xg82p" (OuterVolumeSpecName: "kube-api-access-xg82p") pod "11bc873c-bbc5-4033-ad7a-9569c2b6aa76" (UID: "11bc873c-bbc5-4033-ad7a-9569c2b6aa76"). InnerVolumeSpecName "kube-api-access-xg82p". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.135563 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-config-data" (OuterVolumeSpecName: "config-data") pod "11bc873c-bbc5-4033-ad7a-9569c2b6aa76" (UID: "11bc873c-bbc5-4033-ad7a-9569c2b6aa76"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.204255 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xg82p\" (UniqueName: \"kubernetes.io/projected/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-kube-api-access-xg82p\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.204409 4922 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.204469 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.204542 4922 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.204597 4922 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.204674 4922 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-plugins-conf\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.204770 4922 reconciler_common.go:286] 
"operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.204878 4922 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.204971 4922 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-pod-info\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.208555 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-server-conf" (OuterVolumeSpecName: "server-conf") pod "11bc873c-bbc5-4033-ad7a-9569c2b6aa76" (UID: "11bc873c-bbc5-4033-ad7a-9569c2b6aa76"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.252171 4922 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.293043 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "11bc873c-bbc5-4033-ad7a-9569c2b6aa76" (UID: "11bc873c-bbc5-4033-ad7a-9569c2b6aa76"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.306075 4922 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.306097 4922 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.306107 4922 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/11bc873c-bbc5-4033-ad7a-9569c2b6aa76-server-conf\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.310030 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="161bd1ea-2276-4a95-b0ad-304cc807d13f" path="/var/lib/kubelet/pods/161bd1ea-2276-4a95-b0ad-304cc807d13f/volumes" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.544820 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.565525 4922 generic.go:334] "Generic (PLEG): container finished" podID="11bc873c-bbc5-4033-ad7a-9569c2b6aa76" containerID="d857507103cdb99da8d8bbf0dbe5fed4149199c4205cfc36c52e86df1b3de131" exitCode=0 Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.565598 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"11bc873c-bbc5-4033-ad7a-9569c2b6aa76","Type":"ContainerDied","Data":"d857507103cdb99da8d8bbf0dbe5fed4149199c4205cfc36c52e86df1b3de131"} Nov 22 
03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.565629 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"11bc873c-bbc5-4033-ad7a-9569c2b6aa76","Type":"ContainerDied","Data":"a878f53d0730a22e8676b2e0cecbd3e22ff2485d0786027ab83da96b9d2a037b"} Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.565650 4922 scope.go:117] "RemoveContainer" containerID="d857507103cdb99da8d8bbf0dbe5fed4149199c4205cfc36c52e86df1b3de131" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.565796 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.566780 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"22048c2c-fb84-4f52-9868-c6f6074fab42","Type":"ContainerStarted","Data":"1409fc6625bd636594163fe1118353e072795fbf1da845726795ac69e51205c2"} Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.597681 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.610823 4922 scope.go:117] "RemoveContainer" containerID="f7a1c30d9aae609ad3015b8af50f4e93aa51fc1a0aeee4139b3ab8b557827c6a" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.614889 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.626671 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 22 03:13:27 crc kubenswrapper[4922]: E1122 03:13:27.629421 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11bc873c-bbc5-4033-ad7a-9569c2b6aa76" containerName="rabbitmq" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.629456 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="11bc873c-bbc5-4033-ad7a-9569c2b6aa76" containerName="rabbitmq" Nov 22 03:13:27 crc kubenswrapper[4922]: E1122 03:13:27.629492 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11bc873c-bbc5-4033-ad7a-9569c2b6aa76" containerName="setup-container" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.629503 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="11bc873c-bbc5-4033-ad7a-9569c2b6aa76" containerName="setup-container" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.629716 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="11bc873c-bbc5-4033-ad7a-9569c2b6aa76" containerName="rabbitmq" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.631193 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.633588 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.634875 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.635140 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.635287 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.635399 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-lm5pq" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.637543 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.637679 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.637714 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.655681 4922 scope.go:117] "RemoveContainer" containerID="d857507103cdb99da8d8bbf0dbe5fed4149199c4205cfc36c52e86df1b3de131" Nov 22 03:13:27 crc kubenswrapper[4922]: E1122 03:13:27.660237 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d857507103cdb99da8d8bbf0dbe5fed4149199c4205cfc36c52e86df1b3de131\": container with ID starting with d857507103cdb99da8d8bbf0dbe5fed4149199c4205cfc36c52e86df1b3de131 not found: ID does not exist" containerID="d857507103cdb99da8d8bbf0dbe5fed4149199c4205cfc36c52e86df1b3de131" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.660316 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d857507103cdb99da8d8bbf0dbe5fed4149199c4205cfc36c52e86df1b3de131"} err="failed to get container status \"d857507103cdb99da8d8bbf0dbe5fed4149199c4205cfc36c52e86df1b3de131\": rpc error: code = NotFound desc = could not find container \"d857507103cdb99da8d8bbf0dbe5fed4149199c4205cfc36c52e86df1b3de131\": container with ID starting with d857507103cdb99da8d8bbf0dbe5fed4149199c4205cfc36c52e86df1b3de131 not found: ID does not exist" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.660348 4922 scope.go:117] "RemoveContainer" containerID="f7a1c30d9aae609ad3015b8af50f4e93aa51fc1a0aeee4139b3ab8b557827c6a" Nov 22 03:13:27 crc kubenswrapper[4922]: E1122 03:13:27.661330 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7a1c30d9aae609ad3015b8af50f4e93aa51fc1a0aeee4139b3ab8b557827c6a\": container with ID starting with f7a1c30d9aae609ad3015b8af50f4e93aa51fc1a0aeee4139b3ab8b557827c6a not found: ID does not exist" containerID="f7a1c30d9aae609ad3015b8af50f4e93aa51fc1a0aeee4139b3ab8b557827c6a" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.661417 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7a1c30d9aae609ad3015b8af50f4e93aa51fc1a0aeee4139b3ab8b557827c6a"} 
err="failed to get container status \"f7a1c30d9aae609ad3015b8af50f4e93aa51fc1a0aeee4139b3ab8b557827c6a\": rpc error: code = NotFound desc = could not find container \"f7a1c30d9aae609ad3015b8af50f4e93aa51fc1a0aeee4139b3ab8b557827c6a\": container with ID starting with f7a1c30d9aae609ad3015b8af50f4e93aa51fc1a0aeee4139b3ab8b557827c6a not found: ID does not exist" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.713868 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"57a51c5d-616f-49ef-b320-e3ad9238cf44\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.713932 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/57a51c5d-616f-49ef-b320-e3ad9238cf44-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"57a51c5d-616f-49ef-b320-e3ad9238cf44\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.713964 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/57a51c5d-616f-49ef-b320-e3ad9238cf44-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"57a51c5d-616f-49ef-b320-e3ad9238cf44\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.713999 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6k56\" (UniqueName: \"kubernetes.io/projected/57a51c5d-616f-49ef-b320-e3ad9238cf44-kube-api-access-k6k56\") pod \"rabbitmq-cell1-server-0\" (UID: \"57a51c5d-616f-49ef-b320-e3ad9238cf44\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.714085 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/57a51c5d-616f-49ef-b320-e3ad9238cf44-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"57a51c5d-616f-49ef-b320-e3ad9238cf44\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.714120 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/57a51c5d-616f-49ef-b320-e3ad9238cf44-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"57a51c5d-616f-49ef-b320-e3ad9238cf44\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.714158 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/57a51c5d-616f-49ef-b320-e3ad9238cf44-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"57a51c5d-616f-49ef-b320-e3ad9238cf44\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.714221 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/57a51c5d-616f-49ef-b320-e3ad9238cf44-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"57a51c5d-616f-49ef-b320-e3ad9238cf44\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 
03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.714266 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/57a51c5d-616f-49ef-b320-e3ad9238cf44-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"57a51c5d-616f-49ef-b320-e3ad9238cf44\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.714309 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/57a51c5d-616f-49ef-b320-e3ad9238cf44-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"57a51c5d-616f-49ef-b320-e3ad9238cf44\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.714351 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/57a51c5d-616f-49ef-b320-e3ad9238cf44-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"57a51c5d-616f-49ef-b320-e3ad9238cf44\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.815878 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6k56\" (UniqueName: \"kubernetes.io/projected/57a51c5d-616f-49ef-b320-e3ad9238cf44-kube-api-access-k6k56\") pod \"rabbitmq-cell1-server-0\" (UID: \"57a51c5d-616f-49ef-b320-e3ad9238cf44\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.815921 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/57a51c5d-616f-49ef-b320-e3ad9238cf44-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"57a51c5d-616f-49ef-b320-e3ad9238cf44\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.815942 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/57a51c5d-616f-49ef-b320-e3ad9238cf44-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"57a51c5d-616f-49ef-b320-e3ad9238cf44\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.815975 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/57a51c5d-616f-49ef-b320-e3ad9238cf44-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"57a51c5d-616f-49ef-b320-e3ad9238cf44\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.816005 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/57a51c5d-616f-49ef-b320-e3ad9238cf44-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"57a51c5d-616f-49ef-b320-e3ad9238cf44\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.816033 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/57a51c5d-616f-49ef-b320-e3ad9238cf44-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"57a51c5d-616f-49ef-b320-e3ad9238cf44\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.816067 
4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/57a51c5d-616f-49ef-b320-e3ad9238cf44-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"57a51c5d-616f-49ef-b320-e3ad9238cf44\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.816097 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/57a51c5d-616f-49ef-b320-e3ad9238cf44-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"57a51c5d-616f-49ef-b320-e3ad9238cf44\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.816116 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"57a51c5d-616f-49ef-b320-e3ad9238cf44\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.816149 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/57a51c5d-616f-49ef-b320-e3ad9238cf44-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"57a51c5d-616f-49ef-b320-e3ad9238cf44\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.816175 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/57a51c5d-616f-49ef-b320-e3ad9238cf44-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"57a51c5d-616f-49ef-b320-e3ad9238cf44\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.817051 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/57a51c5d-616f-49ef-b320-e3ad9238cf44-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"57a51c5d-616f-49ef-b320-e3ad9238cf44\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.817447 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/57a51c5d-616f-49ef-b320-e3ad9238cf44-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"57a51c5d-616f-49ef-b320-e3ad9238cf44\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.817591 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/57a51c5d-616f-49ef-b320-e3ad9238cf44-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"57a51c5d-616f-49ef-b320-e3ad9238cf44\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.817801 4922 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"57a51c5d-616f-49ef-b320-e3ad9238cf44\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.818306 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/57a51c5d-616f-49ef-b320-e3ad9238cf44-config-data\") 
pod \"rabbitmq-cell1-server-0\" (UID: \"57a51c5d-616f-49ef-b320-e3ad9238cf44\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.819568 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/57a51c5d-616f-49ef-b320-e3ad9238cf44-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"57a51c5d-616f-49ef-b320-e3ad9238cf44\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.820369 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/57a51c5d-616f-49ef-b320-e3ad9238cf44-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"57a51c5d-616f-49ef-b320-e3ad9238cf44\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.820374 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/57a51c5d-616f-49ef-b320-e3ad9238cf44-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"57a51c5d-616f-49ef-b320-e3ad9238cf44\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.822504 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/57a51c5d-616f-49ef-b320-e3ad9238cf44-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"57a51c5d-616f-49ef-b320-e3ad9238cf44\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.826086 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/57a51c5d-616f-49ef-b320-e3ad9238cf44-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"57a51c5d-616f-49ef-b320-e3ad9238cf44\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.836318 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6k56\" (UniqueName: \"kubernetes.io/projected/57a51c5d-616f-49ef-b320-e3ad9238cf44-kube-api-access-k6k56\") pod \"rabbitmq-cell1-server-0\" (UID: \"57a51c5d-616f-49ef-b320-e3ad9238cf44\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.866738 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"57a51c5d-616f-49ef-b320-e3ad9238cf44\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:13:27 crc kubenswrapper[4922]: I1122 03:13:27.976694 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:13:28 crc kubenswrapper[4922]: I1122 03:13:28.600438 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 22 03:13:29 crc kubenswrapper[4922]: I1122 03:13:29.315681 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11bc873c-bbc5-4033-ad7a-9569c2b6aa76" path="/var/lib/kubelet/pods/11bc873c-bbc5-4033-ad7a-9569c2b6aa76/volumes" Nov 22 03:13:29 crc kubenswrapper[4922]: I1122 03:13:29.597542 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"57a51c5d-616f-49ef-b320-e3ad9238cf44","Type":"ContainerStarted","Data":"9a7659731cbeefeaad2fa494aa3d10a077fa0e98b43e5ee8a70b1630cd0fdb57"} Nov 22 03:13:29 crc kubenswrapper[4922]: I1122 03:13:29.599681 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"22048c2c-fb84-4f52-9868-c6f6074fab42","Type":"ContainerStarted","Data":"16482de708a19472c616c8b2f85c6924bd85948fb2c21b4faf7ae9fe39902401"} Nov 22 03:13:31 crc kubenswrapper[4922]: I1122 03:13:31.626797 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"57a51c5d-616f-49ef-b320-e3ad9238cf44","Type":"ContainerStarted","Data":"ca69ebfa3af28c109835979389d08dd50e6bfcfa018a4fd7e40dabd75f7f2ace"} Nov 22 03:13:31 crc kubenswrapper[4922]: I1122 03:13:31.761040 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-98d42"] Nov 22 03:13:31 crc kubenswrapper[4922]: I1122 03:13:31.762807 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-98d42" Nov 22 03:13:31 crc kubenswrapper[4922]: I1122 03:13:31.765766 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Nov 22 03:13:31 crc kubenswrapper[4922]: I1122 03:13:31.777617 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-98d42"] Nov 22 03:13:31 crc kubenswrapper[4922]: I1122 03:13:31.896783 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/12612d75-bb67-40b7-a673-6a013bf15d76-ovsdbserver-nb\") pod \"dnsmasq-dns-6447ccbd8f-98d42\" (UID: \"12612d75-bb67-40b7-a673-6a013bf15d76\") " pod="openstack/dnsmasq-dns-6447ccbd8f-98d42" Nov 22 03:13:31 crc kubenswrapper[4922]: I1122 03:13:31.896881 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/12612d75-bb67-40b7-a673-6a013bf15d76-ovsdbserver-sb\") pod \"dnsmasq-dns-6447ccbd8f-98d42\" (UID: \"12612d75-bb67-40b7-a673-6a013bf15d76\") " pod="openstack/dnsmasq-dns-6447ccbd8f-98d42" Nov 22 03:13:31 crc kubenswrapper[4922]: I1122 03:13:31.897199 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12612d75-bb67-40b7-a673-6a013bf15d76-config\") pod \"dnsmasq-dns-6447ccbd8f-98d42\" (UID: \"12612d75-bb67-40b7-a673-6a013bf15d76\") " pod="openstack/dnsmasq-dns-6447ccbd8f-98d42" Nov 22 03:13:31 crc kubenswrapper[4922]: I1122 03:13:31.897391 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12612d75-bb67-40b7-a673-6a013bf15d76-dns-svc\") pod 
\"dnsmasq-dns-6447ccbd8f-98d42\" (UID: \"12612d75-bb67-40b7-a673-6a013bf15d76\") " pod="openstack/dnsmasq-dns-6447ccbd8f-98d42" Nov 22 03:13:31 crc kubenswrapper[4922]: I1122 03:13:31.897573 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdtzb\" (UniqueName: \"kubernetes.io/projected/12612d75-bb67-40b7-a673-6a013bf15d76-kube-api-access-jdtzb\") pod \"dnsmasq-dns-6447ccbd8f-98d42\" (UID: \"12612d75-bb67-40b7-a673-6a013bf15d76\") " pod="openstack/dnsmasq-dns-6447ccbd8f-98d42" Nov 22 03:13:31 crc kubenswrapper[4922]: I1122 03:13:31.897681 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/12612d75-bb67-40b7-a673-6a013bf15d76-openstack-edpm-ipam\") pod \"dnsmasq-dns-6447ccbd8f-98d42\" (UID: \"12612d75-bb67-40b7-a673-6a013bf15d76\") " pod="openstack/dnsmasq-dns-6447ccbd8f-98d42" Nov 22 03:13:32 crc kubenswrapper[4922]: I1122 03:13:31.999930 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/12612d75-bb67-40b7-a673-6a013bf15d76-ovsdbserver-nb\") pod \"dnsmasq-dns-6447ccbd8f-98d42\" (UID: \"12612d75-bb67-40b7-a673-6a013bf15d76\") " pod="openstack/dnsmasq-dns-6447ccbd8f-98d42" Nov 22 03:13:32 crc kubenswrapper[4922]: I1122 03:13:32.000001 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/12612d75-bb67-40b7-a673-6a013bf15d76-ovsdbserver-sb\") pod \"dnsmasq-dns-6447ccbd8f-98d42\" (UID: \"12612d75-bb67-40b7-a673-6a013bf15d76\") " pod="openstack/dnsmasq-dns-6447ccbd8f-98d42" Nov 22 03:13:32 crc kubenswrapper[4922]: I1122 03:13:32.000099 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12612d75-bb67-40b7-a673-6a013bf15d76-config\") pod \"dnsmasq-dns-6447ccbd8f-98d42\" (UID: \"12612d75-bb67-40b7-a673-6a013bf15d76\") " pod="openstack/dnsmasq-dns-6447ccbd8f-98d42" Nov 22 03:13:32 crc kubenswrapper[4922]: I1122 03:13:32.000118 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12612d75-bb67-40b7-a673-6a013bf15d76-dns-svc\") pod \"dnsmasq-dns-6447ccbd8f-98d42\" (UID: \"12612d75-bb67-40b7-a673-6a013bf15d76\") " pod="openstack/dnsmasq-dns-6447ccbd8f-98d42" Nov 22 03:13:32 crc kubenswrapper[4922]: I1122 03:13:32.000167 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdtzb\" (UniqueName: \"kubernetes.io/projected/12612d75-bb67-40b7-a673-6a013bf15d76-kube-api-access-jdtzb\") pod \"dnsmasq-dns-6447ccbd8f-98d42\" (UID: \"12612d75-bb67-40b7-a673-6a013bf15d76\") " pod="openstack/dnsmasq-dns-6447ccbd8f-98d42" Nov 22 03:13:32 crc kubenswrapper[4922]: I1122 03:13:32.000206 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/12612d75-bb67-40b7-a673-6a013bf15d76-openstack-edpm-ipam\") pod \"dnsmasq-dns-6447ccbd8f-98d42\" (UID: \"12612d75-bb67-40b7-a673-6a013bf15d76\") " pod="openstack/dnsmasq-dns-6447ccbd8f-98d42" Nov 22 03:13:32 crc kubenswrapper[4922]: I1122 03:13:32.001557 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/12612d75-bb67-40b7-a673-6a013bf15d76-openstack-edpm-ipam\") 
pod \"dnsmasq-dns-6447ccbd8f-98d42\" (UID: \"12612d75-bb67-40b7-a673-6a013bf15d76\") " pod="openstack/dnsmasq-dns-6447ccbd8f-98d42" Nov 22 03:13:32 crc kubenswrapper[4922]: I1122 03:13:32.001688 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12612d75-bb67-40b7-a673-6a013bf15d76-dns-svc\") pod \"dnsmasq-dns-6447ccbd8f-98d42\" (UID: \"12612d75-bb67-40b7-a673-6a013bf15d76\") " pod="openstack/dnsmasq-dns-6447ccbd8f-98d42" Nov 22 03:13:32 crc kubenswrapper[4922]: I1122 03:13:32.001927 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12612d75-bb67-40b7-a673-6a013bf15d76-config\") pod \"dnsmasq-dns-6447ccbd8f-98d42\" (UID: \"12612d75-bb67-40b7-a673-6a013bf15d76\") " pod="openstack/dnsmasq-dns-6447ccbd8f-98d42" Nov 22 03:13:32 crc kubenswrapper[4922]: I1122 03:13:32.002095 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/12612d75-bb67-40b7-a673-6a013bf15d76-ovsdbserver-sb\") pod \"dnsmasq-dns-6447ccbd8f-98d42\" (UID: \"12612d75-bb67-40b7-a673-6a013bf15d76\") " pod="openstack/dnsmasq-dns-6447ccbd8f-98d42" Nov 22 03:13:32 crc kubenswrapper[4922]: I1122 03:13:32.002920 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/12612d75-bb67-40b7-a673-6a013bf15d76-ovsdbserver-nb\") pod \"dnsmasq-dns-6447ccbd8f-98d42\" (UID: \"12612d75-bb67-40b7-a673-6a013bf15d76\") " pod="openstack/dnsmasq-dns-6447ccbd8f-98d42" Nov 22 03:13:32 crc kubenswrapper[4922]: I1122 03:13:32.034656 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdtzb\" (UniqueName: \"kubernetes.io/projected/12612d75-bb67-40b7-a673-6a013bf15d76-kube-api-access-jdtzb\") pod \"dnsmasq-dns-6447ccbd8f-98d42\" (UID: \"12612d75-bb67-40b7-a673-6a013bf15d76\") " pod="openstack/dnsmasq-dns-6447ccbd8f-98d42" Nov 22 03:13:32 crc kubenswrapper[4922]: I1122 03:13:32.086361 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-98d42" Nov 22 03:13:32 crc kubenswrapper[4922]: I1122 03:13:32.641918 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-98d42"] Nov 22 03:13:32 crc kubenswrapper[4922]: W1122 03:13:32.646539 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12612d75_bb67_40b7_a673_6a013bf15d76.slice/crio-9d06a351d76220ec3e578d710c29cdf1a892522be494692e9e1f3722cdbfc49a WatchSource:0}: Error finding container 9d06a351d76220ec3e578d710c29cdf1a892522be494692e9e1f3722cdbfc49a: Status 404 returned error can't find the container with id 9d06a351d76220ec3e578d710c29cdf1a892522be494692e9e1f3722cdbfc49a Nov 22 03:13:33 crc kubenswrapper[4922]: I1122 03:13:33.654189 4922 generic.go:334] "Generic (PLEG): container finished" podID="12612d75-bb67-40b7-a673-6a013bf15d76" containerID="8cd4d75099ff55643636c4189142bd6f1db19217dd8a8309ed492302320275b5" exitCode=0 Nov 22 03:13:33 crc kubenswrapper[4922]: I1122 03:13:33.654309 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-98d42" event={"ID":"12612d75-bb67-40b7-a673-6a013bf15d76","Type":"ContainerDied","Data":"8cd4d75099ff55643636c4189142bd6f1db19217dd8a8309ed492302320275b5"} Nov 22 03:13:33 crc kubenswrapper[4922]: I1122 03:13:33.654610 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-98d42" event={"ID":"12612d75-bb67-40b7-a673-6a013bf15d76","Type":"ContainerStarted","Data":"9d06a351d76220ec3e578d710c29cdf1a892522be494692e9e1f3722cdbfc49a"} Nov 22 03:13:34 crc kubenswrapper[4922]: I1122 03:13:34.674511 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-98d42" event={"ID":"12612d75-bb67-40b7-a673-6a013bf15d76","Type":"ContainerStarted","Data":"57c59b65bb8f39a7893e4223dc0637ad10e53bf8615e0dd01818c82e44b5c37b"} Nov 22 03:13:34 crc kubenswrapper[4922]: I1122 03:13:34.676186 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6447ccbd8f-98d42" Nov 22 03:13:34 crc kubenswrapper[4922]: I1122 03:13:34.719243 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6447ccbd8f-98d42" podStartSLOduration=3.719212815 podStartE2EDuration="3.719212815s" podCreationTimestamp="2025-11-22 03:13:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:13:34.711547212 +0000 UTC m=+1250.750069144" watchObservedRunningTime="2025-11-22 03:13:34.719212815 +0000 UTC m=+1250.757734747" Nov 22 03:13:41 crc kubenswrapper[4922]: I1122 03:13:41.110115 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 03:13:41 crc kubenswrapper[4922]: I1122 03:13:41.110875 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 03:13:42 crc kubenswrapper[4922]: I1122 03:13:42.089579 4922 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6447ccbd8f-98d42" Nov 22 03:13:42 crc kubenswrapper[4922]: I1122 03:13:42.189306 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-zfbgm"] Nov 22 03:13:42 crc kubenswrapper[4922]: I1122 03:13:42.190067 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b856c5697-zfbgm" podUID="2a8adb5c-9b6c-4398-96b9-e67328810310" containerName="dnsmasq-dns" containerID="cri-o://a52054879f2c54059a37a7366f13a0f7d4f62a1deeff77c1aa95b53477311def" gracePeriod=10 Nov 22 03:13:42 crc kubenswrapper[4922]: I1122 03:13:42.419369 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-hrkcn"] Nov 22 03:13:42 crc kubenswrapper[4922]: I1122 03:13:42.421176 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-864d5fc68c-hrkcn" Nov 22 03:13:42 crc kubenswrapper[4922]: I1122 03:13:42.519386 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-hrkcn"] Nov 22 03:13:42 crc kubenswrapper[4922]: I1122 03:13:42.526722 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68f450d1-62c7-4e64-b820-d4ee357f7403-ovsdbserver-sb\") pod \"dnsmasq-dns-864d5fc68c-hrkcn\" (UID: \"68f450d1-62c7-4e64-b820-d4ee357f7403\") " pod="openstack/dnsmasq-dns-864d5fc68c-hrkcn" Nov 22 03:13:42 crc kubenswrapper[4922]: I1122 03:13:42.526807 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxpfp\" (UniqueName: \"kubernetes.io/projected/68f450d1-62c7-4e64-b820-d4ee357f7403-kube-api-access-rxpfp\") pod \"dnsmasq-dns-864d5fc68c-hrkcn\" (UID: \"68f450d1-62c7-4e64-b820-d4ee357f7403\") " pod="openstack/dnsmasq-dns-864d5fc68c-hrkcn" Nov 22 03:13:42 crc kubenswrapper[4922]: I1122 03:13:42.527326 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68f450d1-62c7-4e64-b820-d4ee357f7403-ovsdbserver-nb\") pod \"dnsmasq-dns-864d5fc68c-hrkcn\" (UID: \"68f450d1-62c7-4e64-b820-d4ee357f7403\") " pod="openstack/dnsmasq-dns-864d5fc68c-hrkcn" Nov 22 03:13:42 crc kubenswrapper[4922]: I1122 03:13:42.527404 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/68f450d1-62c7-4e64-b820-d4ee357f7403-openstack-edpm-ipam\") pod \"dnsmasq-dns-864d5fc68c-hrkcn\" (UID: \"68f450d1-62c7-4e64-b820-d4ee357f7403\") " pod="openstack/dnsmasq-dns-864d5fc68c-hrkcn" Nov 22 03:13:42 crc kubenswrapper[4922]: I1122 03:13:42.527459 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68f450d1-62c7-4e64-b820-d4ee357f7403-dns-svc\") pod \"dnsmasq-dns-864d5fc68c-hrkcn\" (UID: \"68f450d1-62c7-4e64-b820-d4ee357f7403\") " pod="openstack/dnsmasq-dns-864d5fc68c-hrkcn" Nov 22 03:13:42 crc kubenswrapper[4922]: I1122 03:13:42.527492 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68f450d1-62c7-4e64-b820-d4ee357f7403-config\") pod \"dnsmasq-dns-864d5fc68c-hrkcn\" (UID: \"68f450d1-62c7-4e64-b820-d4ee357f7403\") " 
pod="openstack/dnsmasq-dns-864d5fc68c-hrkcn" Nov 22 03:13:42 crc kubenswrapper[4922]: I1122 03:13:42.631674 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxpfp\" (UniqueName: \"kubernetes.io/projected/68f450d1-62c7-4e64-b820-d4ee357f7403-kube-api-access-rxpfp\") pod \"dnsmasq-dns-864d5fc68c-hrkcn\" (UID: \"68f450d1-62c7-4e64-b820-d4ee357f7403\") " pod="openstack/dnsmasq-dns-864d5fc68c-hrkcn" Nov 22 03:13:42 crc kubenswrapper[4922]: I1122 03:13:42.631805 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68f450d1-62c7-4e64-b820-d4ee357f7403-ovsdbserver-nb\") pod \"dnsmasq-dns-864d5fc68c-hrkcn\" (UID: \"68f450d1-62c7-4e64-b820-d4ee357f7403\") " pod="openstack/dnsmasq-dns-864d5fc68c-hrkcn" Nov 22 03:13:42 crc kubenswrapper[4922]: I1122 03:13:42.631906 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/68f450d1-62c7-4e64-b820-d4ee357f7403-openstack-edpm-ipam\") pod \"dnsmasq-dns-864d5fc68c-hrkcn\" (UID: \"68f450d1-62c7-4e64-b820-d4ee357f7403\") " pod="openstack/dnsmasq-dns-864d5fc68c-hrkcn" Nov 22 03:13:42 crc kubenswrapper[4922]: I1122 03:13:42.631945 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68f450d1-62c7-4e64-b820-d4ee357f7403-dns-svc\") pod \"dnsmasq-dns-864d5fc68c-hrkcn\" (UID: \"68f450d1-62c7-4e64-b820-d4ee357f7403\") " pod="openstack/dnsmasq-dns-864d5fc68c-hrkcn" Nov 22 03:13:42 crc kubenswrapper[4922]: I1122 03:13:42.632006 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68f450d1-62c7-4e64-b820-d4ee357f7403-config\") pod \"dnsmasq-dns-864d5fc68c-hrkcn\" (UID: \"68f450d1-62c7-4e64-b820-d4ee357f7403\") " pod="openstack/dnsmasq-dns-864d5fc68c-hrkcn" Nov 22 03:13:42 crc kubenswrapper[4922]: I1122 03:13:42.632032 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68f450d1-62c7-4e64-b820-d4ee357f7403-ovsdbserver-sb\") pod \"dnsmasq-dns-864d5fc68c-hrkcn\" (UID: \"68f450d1-62c7-4e64-b820-d4ee357f7403\") " pod="openstack/dnsmasq-dns-864d5fc68c-hrkcn" Nov 22 03:13:42 crc kubenswrapper[4922]: I1122 03:13:42.632863 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/68f450d1-62c7-4e64-b820-d4ee357f7403-openstack-edpm-ipam\") pod \"dnsmasq-dns-864d5fc68c-hrkcn\" (UID: \"68f450d1-62c7-4e64-b820-d4ee357f7403\") " pod="openstack/dnsmasq-dns-864d5fc68c-hrkcn" Nov 22 03:13:42 crc kubenswrapper[4922]: I1122 03:13:42.632913 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68f450d1-62c7-4e64-b820-d4ee357f7403-ovsdbserver-sb\") pod \"dnsmasq-dns-864d5fc68c-hrkcn\" (UID: \"68f450d1-62c7-4e64-b820-d4ee357f7403\") " pod="openstack/dnsmasq-dns-864d5fc68c-hrkcn" Nov 22 03:13:42 crc kubenswrapper[4922]: I1122 03:13:42.633658 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68f450d1-62c7-4e64-b820-d4ee357f7403-ovsdbserver-nb\") pod \"dnsmasq-dns-864d5fc68c-hrkcn\" (UID: \"68f450d1-62c7-4e64-b820-d4ee357f7403\") " pod="openstack/dnsmasq-dns-864d5fc68c-hrkcn" Nov 22 03:13:42 crc 
kubenswrapper[4922]: I1122 03:13:42.634361 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68f450d1-62c7-4e64-b820-d4ee357f7403-config\") pod \"dnsmasq-dns-864d5fc68c-hrkcn\" (UID: \"68f450d1-62c7-4e64-b820-d4ee357f7403\") " pod="openstack/dnsmasq-dns-864d5fc68c-hrkcn" Nov 22 03:13:42 crc kubenswrapper[4922]: I1122 03:13:42.634706 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68f450d1-62c7-4e64-b820-d4ee357f7403-dns-svc\") pod \"dnsmasq-dns-864d5fc68c-hrkcn\" (UID: \"68f450d1-62c7-4e64-b820-d4ee357f7403\") " pod="openstack/dnsmasq-dns-864d5fc68c-hrkcn" Nov 22 03:13:42 crc kubenswrapper[4922]: I1122 03:13:42.657951 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxpfp\" (UniqueName: \"kubernetes.io/projected/68f450d1-62c7-4e64-b820-d4ee357f7403-kube-api-access-rxpfp\") pod \"dnsmasq-dns-864d5fc68c-hrkcn\" (UID: \"68f450d1-62c7-4e64-b820-d4ee357f7403\") " pod="openstack/dnsmasq-dns-864d5fc68c-hrkcn" Nov 22 03:13:42 crc kubenswrapper[4922]: I1122 03:13:42.750298 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-zfbgm" Nov 22 03:13:42 crc kubenswrapper[4922]: I1122 03:13:42.778983 4922 generic.go:334] "Generic (PLEG): container finished" podID="2a8adb5c-9b6c-4398-96b9-e67328810310" containerID="a52054879f2c54059a37a7366f13a0f7d4f62a1deeff77c1aa95b53477311def" exitCode=0 Nov 22 03:13:42 crc kubenswrapper[4922]: I1122 03:13:42.779025 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-zfbgm" event={"ID":"2a8adb5c-9b6c-4398-96b9-e67328810310","Type":"ContainerDied","Data":"a52054879f2c54059a37a7366f13a0f7d4f62a1deeff77c1aa95b53477311def"} Nov 22 03:13:42 crc kubenswrapper[4922]: I1122 03:13:42.779052 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-zfbgm" event={"ID":"2a8adb5c-9b6c-4398-96b9-e67328810310","Type":"ContainerDied","Data":"c0f863684c44696882b2eb5cd9bc7e1d7ccb192dcd8e5f741ff9c32d82b3e5b5"} Nov 22 03:13:42 crc kubenswrapper[4922]: I1122 03:13:42.779070 4922 scope.go:117] "RemoveContainer" containerID="a52054879f2c54059a37a7366f13a0f7d4f62a1deeff77c1aa95b53477311def" Nov 22 03:13:42 crc kubenswrapper[4922]: I1122 03:13:42.779187 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-zfbgm" Nov 22 03:13:42 crc kubenswrapper[4922]: I1122 03:13:42.794659 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-864d5fc68c-hrkcn" Nov 22 03:13:42 crc kubenswrapper[4922]: I1122 03:13:42.815102 4922 scope.go:117] "RemoveContainer" containerID="7ec0b901e0a1278c07ec70b9e029d806c1ba89f2338031f47c0752b2e4ee62e6" Nov 22 03:13:42 crc kubenswrapper[4922]: I1122 03:13:42.835180 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2a8adb5c-9b6c-4398-96b9-e67328810310-ovsdbserver-nb\") pod \"2a8adb5c-9b6c-4398-96b9-e67328810310\" (UID: \"2a8adb5c-9b6c-4398-96b9-e67328810310\") " Nov 22 03:13:42 crc kubenswrapper[4922]: I1122 03:13:42.835267 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a8adb5c-9b6c-4398-96b9-e67328810310-config\") pod \"2a8adb5c-9b6c-4398-96b9-e67328810310\" (UID: \"2a8adb5c-9b6c-4398-96b9-e67328810310\") " Nov 22 03:13:42 crc kubenswrapper[4922]: I1122 03:13:42.835373 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s98zl\" (UniqueName: \"kubernetes.io/projected/2a8adb5c-9b6c-4398-96b9-e67328810310-kube-api-access-s98zl\") pod \"2a8adb5c-9b6c-4398-96b9-e67328810310\" (UID: \"2a8adb5c-9b6c-4398-96b9-e67328810310\") " Nov 22 03:13:42 crc kubenswrapper[4922]: I1122 03:13:42.835462 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2a8adb5c-9b6c-4398-96b9-e67328810310-ovsdbserver-sb\") pod \"2a8adb5c-9b6c-4398-96b9-e67328810310\" (UID: \"2a8adb5c-9b6c-4398-96b9-e67328810310\") " Nov 22 03:13:42 crc kubenswrapper[4922]: I1122 03:13:42.835529 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a8adb5c-9b6c-4398-96b9-e67328810310-dns-svc\") pod \"2a8adb5c-9b6c-4398-96b9-e67328810310\" (UID: \"2a8adb5c-9b6c-4398-96b9-e67328810310\") " Nov 22 03:13:42 crc kubenswrapper[4922]: I1122 03:13:42.838734 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a8adb5c-9b6c-4398-96b9-e67328810310-kube-api-access-s98zl" (OuterVolumeSpecName: "kube-api-access-s98zl") pod "2a8adb5c-9b6c-4398-96b9-e67328810310" (UID: "2a8adb5c-9b6c-4398-96b9-e67328810310"). InnerVolumeSpecName "kube-api-access-s98zl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:13:42 crc kubenswrapper[4922]: I1122 03:13:42.843776 4922 scope.go:117] "RemoveContainer" containerID="a52054879f2c54059a37a7366f13a0f7d4f62a1deeff77c1aa95b53477311def" Nov 22 03:13:42 crc kubenswrapper[4922]: E1122 03:13:42.844356 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a52054879f2c54059a37a7366f13a0f7d4f62a1deeff77c1aa95b53477311def\": container with ID starting with a52054879f2c54059a37a7366f13a0f7d4f62a1deeff77c1aa95b53477311def not found: ID does not exist" containerID="a52054879f2c54059a37a7366f13a0f7d4f62a1deeff77c1aa95b53477311def" Nov 22 03:13:42 crc kubenswrapper[4922]: I1122 03:13:42.844403 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a52054879f2c54059a37a7366f13a0f7d4f62a1deeff77c1aa95b53477311def"} err="failed to get container status \"a52054879f2c54059a37a7366f13a0f7d4f62a1deeff77c1aa95b53477311def\": rpc error: code = NotFound desc = could not find container \"a52054879f2c54059a37a7366f13a0f7d4f62a1deeff77c1aa95b53477311def\": container with ID starting with a52054879f2c54059a37a7366f13a0f7d4f62a1deeff77c1aa95b53477311def not found: ID does not exist" Nov 22 03:13:42 crc kubenswrapper[4922]: I1122 03:13:42.844433 4922 scope.go:117] "RemoveContainer" containerID="7ec0b901e0a1278c07ec70b9e029d806c1ba89f2338031f47c0752b2e4ee62e6" Nov 22 03:13:42 crc kubenswrapper[4922]: E1122 03:13:42.844700 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ec0b901e0a1278c07ec70b9e029d806c1ba89f2338031f47c0752b2e4ee62e6\": container with ID starting with 7ec0b901e0a1278c07ec70b9e029d806c1ba89f2338031f47c0752b2e4ee62e6 not found: ID does not exist" containerID="7ec0b901e0a1278c07ec70b9e029d806c1ba89f2338031f47c0752b2e4ee62e6" Nov 22 03:13:42 crc kubenswrapper[4922]: I1122 03:13:42.844726 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ec0b901e0a1278c07ec70b9e029d806c1ba89f2338031f47c0752b2e4ee62e6"} err="failed to get container status \"7ec0b901e0a1278c07ec70b9e029d806c1ba89f2338031f47c0752b2e4ee62e6\": rpc error: code = NotFound desc = could not find container \"7ec0b901e0a1278c07ec70b9e029d806c1ba89f2338031f47c0752b2e4ee62e6\": container with ID starting with 7ec0b901e0a1278c07ec70b9e029d806c1ba89f2338031f47c0752b2e4ee62e6 not found: ID does not exist" Nov 22 03:13:42 crc kubenswrapper[4922]: I1122 03:13:42.886693 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a8adb5c-9b6c-4398-96b9-e67328810310-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2a8adb5c-9b6c-4398-96b9-e67328810310" (UID: "2a8adb5c-9b6c-4398-96b9-e67328810310"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:13:42 crc kubenswrapper[4922]: I1122 03:13:42.889879 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a8adb5c-9b6c-4398-96b9-e67328810310-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2a8adb5c-9b6c-4398-96b9-e67328810310" (UID: "2a8adb5c-9b6c-4398-96b9-e67328810310"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:13:42 crc kubenswrapper[4922]: I1122 03:13:42.890810 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a8adb5c-9b6c-4398-96b9-e67328810310-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2a8adb5c-9b6c-4398-96b9-e67328810310" (UID: "2a8adb5c-9b6c-4398-96b9-e67328810310"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:13:42 crc kubenswrapper[4922]: I1122 03:13:42.894738 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a8adb5c-9b6c-4398-96b9-e67328810310-config" (OuterVolumeSpecName: "config") pod "2a8adb5c-9b6c-4398-96b9-e67328810310" (UID: "2a8adb5c-9b6c-4398-96b9-e67328810310"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:13:42 crc kubenswrapper[4922]: I1122 03:13:42.941811 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s98zl\" (UniqueName: \"kubernetes.io/projected/2a8adb5c-9b6c-4398-96b9-e67328810310-kube-api-access-s98zl\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:42 crc kubenswrapper[4922]: I1122 03:13:42.941869 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2a8adb5c-9b6c-4398-96b9-e67328810310-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:42 crc kubenswrapper[4922]: I1122 03:13:42.941883 4922 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a8adb5c-9b6c-4398-96b9-e67328810310-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:42 crc kubenswrapper[4922]: I1122 03:13:42.941894 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2a8adb5c-9b6c-4398-96b9-e67328810310-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:42 crc kubenswrapper[4922]: I1122 03:13:42.941904 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a8adb5c-9b6c-4398-96b9-e67328810310-config\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:43 crc kubenswrapper[4922]: I1122 03:13:43.115133 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-zfbgm"] Nov 22 03:13:43 crc kubenswrapper[4922]: I1122 03:13:43.126258 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-zfbgm"] Nov 22 03:13:43 crc kubenswrapper[4922]: I1122 03:13:43.250820 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-hrkcn"] Nov 22 03:13:43 crc kubenswrapper[4922]: I1122 03:13:43.312930 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a8adb5c-9b6c-4398-96b9-e67328810310" path="/var/lib/kubelet/pods/2a8adb5c-9b6c-4398-96b9-e67328810310/volumes" Nov 22 03:13:43 crc kubenswrapper[4922]: I1122 03:13:43.606007 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lwrsk"] Nov 22 03:13:43 crc kubenswrapper[4922]: E1122 03:13:43.606513 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a8adb5c-9b6c-4398-96b9-e67328810310" containerName="init" Nov 22 03:13:43 crc kubenswrapper[4922]: I1122 03:13:43.606624 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a8adb5c-9b6c-4398-96b9-e67328810310" containerName="init" Nov 22 03:13:43 crc 
kubenswrapper[4922]: E1122 03:13:43.606741 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a8adb5c-9b6c-4398-96b9-e67328810310" containerName="dnsmasq-dns" Nov 22 03:13:43 crc kubenswrapper[4922]: I1122 03:13:43.606821 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a8adb5c-9b6c-4398-96b9-e67328810310" containerName="dnsmasq-dns" Nov 22 03:13:43 crc kubenswrapper[4922]: I1122 03:13:43.607115 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a8adb5c-9b6c-4398-96b9-e67328810310" containerName="dnsmasq-dns" Nov 22 03:13:43 crc kubenswrapper[4922]: I1122 03:13:43.607933 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lwrsk" Nov 22 03:13:43 crc kubenswrapper[4922]: I1122 03:13:43.610412 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 22 03:13:43 crc kubenswrapper[4922]: I1122 03:13:43.610871 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 03:13:43 crc kubenswrapper[4922]: I1122 03:13:43.611118 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wzhgr" Nov 22 03:13:43 crc kubenswrapper[4922]: I1122 03:13:43.611554 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 22 03:13:43 crc kubenswrapper[4922]: I1122 03:13:43.620719 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lwrsk"] Nov 22 03:13:43 crc kubenswrapper[4922]: I1122 03:13:43.694886 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpr4h\" (UniqueName: \"kubernetes.io/projected/688b25c4-de7f-4adb-bc7e-760847d2a6e2-kube-api-access-rpr4h\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lwrsk\" (UID: \"688b25c4-de7f-4adb-bc7e-760847d2a6e2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lwrsk" Nov 22 03:13:43 crc kubenswrapper[4922]: I1122 03:13:43.695000 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/688b25c4-de7f-4adb-bc7e-760847d2a6e2-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lwrsk\" (UID: \"688b25c4-de7f-4adb-bc7e-760847d2a6e2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lwrsk" Nov 22 03:13:43 crc kubenswrapper[4922]: I1122 03:13:43.695029 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/688b25c4-de7f-4adb-bc7e-760847d2a6e2-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lwrsk\" (UID: \"688b25c4-de7f-4adb-bc7e-760847d2a6e2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lwrsk" Nov 22 03:13:43 crc kubenswrapper[4922]: I1122 03:13:43.695287 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/688b25c4-de7f-4adb-bc7e-760847d2a6e2-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lwrsk\" (UID: \"688b25c4-de7f-4adb-bc7e-760847d2a6e2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lwrsk" Nov 22 03:13:43 crc 
kubenswrapper[4922]: I1122 03:13:43.789461 4922 generic.go:334] "Generic (PLEG): container finished" podID="68f450d1-62c7-4e64-b820-d4ee357f7403" containerID="ab24de2828458b3ffbbb704331a7d54ce33344abd83938dd745d1f6b175fe0dd" exitCode=0 Nov 22 03:13:43 crc kubenswrapper[4922]: I1122 03:13:43.789627 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-hrkcn" event={"ID":"68f450d1-62c7-4e64-b820-d4ee357f7403","Type":"ContainerDied","Data":"ab24de2828458b3ffbbb704331a7d54ce33344abd83938dd745d1f6b175fe0dd"} Nov 22 03:13:43 crc kubenswrapper[4922]: I1122 03:13:43.789836 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-hrkcn" event={"ID":"68f450d1-62c7-4e64-b820-d4ee357f7403","Type":"ContainerStarted","Data":"23f321a40922800280b110f5b0314b14023d5f88b5c73e390e12ce19d690dd5e"} Nov 22 03:13:43 crc kubenswrapper[4922]: I1122 03:13:43.797450 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpr4h\" (UniqueName: \"kubernetes.io/projected/688b25c4-de7f-4adb-bc7e-760847d2a6e2-kube-api-access-rpr4h\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lwrsk\" (UID: \"688b25c4-de7f-4adb-bc7e-760847d2a6e2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lwrsk" Nov 22 03:13:43 crc kubenswrapper[4922]: I1122 03:13:43.797591 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/688b25c4-de7f-4adb-bc7e-760847d2a6e2-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lwrsk\" (UID: \"688b25c4-de7f-4adb-bc7e-760847d2a6e2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lwrsk" Nov 22 03:13:43 crc kubenswrapper[4922]: I1122 03:13:43.797627 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/688b25c4-de7f-4adb-bc7e-760847d2a6e2-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lwrsk\" (UID: \"688b25c4-de7f-4adb-bc7e-760847d2a6e2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lwrsk" Nov 22 03:13:43 crc kubenswrapper[4922]: I1122 03:13:43.797686 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/688b25c4-de7f-4adb-bc7e-760847d2a6e2-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lwrsk\" (UID: \"688b25c4-de7f-4adb-bc7e-760847d2a6e2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lwrsk" Nov 22 03:13:43 crc kubenswrapper[4922]: I1122 03:13:43.801780 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/688b25c4-de7f-4adb-bc7e-760847d2a6e2-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lwrsk\" (UID: \"688b25c4-de7f-4adb-bc7e-760847d2a6e2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lwrsk" Nov 22 03:13:43 crc kubenswrapper[4922]: I1122 03:13:43.804892 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/688b25c4-de7f-4adb-bc7e-760847d2a6e2-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lwrsk\" (UID: \"688b25c4-de7f-4adb-bc7e-760847d2a6e2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lwrsk" Nov 22 03:13:43 crc kubenswrapper[4922]: I1122 03:13:43.804940 
4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/688b25c4-de7f-4adb-bc7e-760847d2a6e2-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lwrsk\" (UID: \"688b25c4-de7f-4adb-bc7e-760847d2a6e2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lwrsk" Nov 22 03:13:43 crc kubenswrapper[4922]: I1122 03:13:43.827319 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpr4h\" (UniqueName: \"kubernetes.io/projected/688b25c4-de7f-4adb-bc7e-760847d2a6e2-kube-api-access-rpr4h\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lwrsk\" (UID: \"688b25c4-de7f-4adb-bc7e-760847d2a6e2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lwrsk" Nov 22 03:13:43 crc kubenswrapper[4922]: I1122 03:13:43.969747 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lwrsk" Nov 22 03:13:44 crc kubenswrapper[4922]: I1122 03:13:44.520744 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lwrsk"] Nov 22 03:13:44 crc kubenswrapper[4922]: I1122 03:13:44.800930 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lwrsk" event={"ID":"688b25c4-de7f-4adb-bc7e-760847d2a6e2","Type":"ContainerStarted","Data":"b87a301c0f854e32b2fd4df74c81a0199f6ae433736c9a8958cea8322063ef42"} Nov 22 03:13:44 crc kubenswrapper[4922]: I1122 03:13:44.803599 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-hrkcn" event={"ID":"68f450d1-62c7-4e64-b820-d4ee357f7403","Type":"ContainerStarted","Data":"80f71273258e6772ff1eec68461ae3e742b9b0d6958dce4c4117699cc6b0d4d1"} Nov 22 03:13:44 crc kubenswrapper[4922]: I1122 03:13:44.803801 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-864d5fc68c-hrkcn" Nov 22 03:13:44 crc kubenswrapper[4922]: I1122 03:13:44.832485 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-864d5fc68c-hrkcn" podStartSLOduration=2.832463089 podStartE2EDuration="2.832463089s" podCreationTimestamp="2025-11-22 03:13:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:13:44.829283613 +0000 UTC m=+1260.867805505" watchObservedRunningTime="2025-11-22 03:13:44.832463089 +0000 UTC m=+1260.870985011" Nov 22 03:13:52 crc kubenswrapper[4922]: I1122 03:13:52.796081 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-864d5fc68c-hrkcn" Nov 22 03:13:52 crc kubenswrapper[4922]: I1122 03:13:52.859796 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-98d42"] Nov 22 03:13:52 crc kubenswrapper[4922]: I1122 03:13:52.860070 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6447ccbd8f-98d42" podUID="12612d75-bb67-40b7-a673-6a013bf15d76" containerName="dnsmasq-dns" containerID="cri-o://57c59b65bb8f39a7893e4223dc0637ad10e53bf8615e0dd01818c82e44b5c37b" gracePeriod=10 Nov 22 03:13:54 crc kubenswrapper[4922]: I1122 03:13:54.928956 4922 generic.go:334] "Generic (PLEG): container finished" podID="12612d75-bb67-40b7-a673-6a013bf15d76" containerID="57c59b65bb8f39a7893e4223dc0637ad10e53bf8615e0dd01818c82e44b5c37b" exitCode=0 
Nov 22 03:13:54 crc kubenswrapper[4922]: I1122 03:13:54.929079 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-98d42" event={"ID":"12612d75-bb67-40b7-a673-6a013bf15d76","Type":"ContainerDied","Data":"57c59b65bb8f39a7893e4223dc0637ad10e53bf8615e0dd01818c82e44b5c37b"} Nov 22 03:13:55 crc kubenswrapper[4922]: I1122 03:13:55.822962 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-98d42" Nov 22 03:13:55 crc kubenswrapper[4922]: I1122 03:13:55.932293 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/12612d75-bb67-40b7-a673-6a013bf15d76-ovsdbserver-nb\") pod \"12612d75-bb67-40b7-a673-6a013bf15d76\" (UID: \"12612d75-bb67-40b7-a673-6a013bf15d76\") " Nov 22 03:13:55 crc kubenswrapper[4922]: I1122 03:13:55.932425 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12612d75-bb67-40b7-a673-6a013bf15d76-dns-svc\") pod \"12612d75-bb67-40b7-a673-6a013bf15d76\" (UID: \"12612d75-bb67-40b7-a673-6a013bf15d76\") " Nov 22 03:13:55 crc kubenswrapper[4922]: I1122 03:13:55.932533 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/12612d75-bb67-40b7-a673-6a013bf15d76-openstack-edpm-ipam\") pod \"12612d75-bb67-40b7-a673-6a013bf15d76\" (UID: \"12612d75-bb67-40b7-a673-6a013bf15d76\") " Nov 22 03:13:55 crc kubenswrapper[4922]: I1122 03:13:55.932613 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12612d75-bb67-40b7-a673-6a013bf15d76-config\") pod \"12612d75-bb67-40b7-a673-6a013bf15d76\" (UID: \"12612d75-bb67-40b7-a673-6a013bf15d76\") " Nov 22 03:13:55 crc kubenswrapper[4922]: I1122 03:13:55.932773 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdtzb\" (UniqueName: \"kubernetes.io/projected/12612d75-bb67-40b7-a673-6a013bf15d76-kube-api-access-jdtzb\") pod \"12612d75-bb67-40b7-a673-6a013bf15d76\" (UID: \"12612d75-bb67-40b7-a673-6a013bf15d76\") " Nov 22 03:13:55 crc kubenswrapper[4922]: I1122 03:13:55.932828 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/12612d75-bb67-40b7-a673-6a013bf15d76-ovsdbserver-sb\") pod \"12612d75-bb67-40b7-a673-6a013bf15d76\" (UID: \"12612d75-bb67-40b7-a673-6a013bf15d76\") " Nov 22 03:13:55 crc kubenswrapper[4922]: I1122 03:13:55.940330 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-98d42" event={"ID":"12612d75-bb67-40b7-a673-6a013bf15d76","Type":"ContainerDied","Data":"9d06a351d76220ec3e578d710c29cdf1a892522be494692e9e1f3722cdbfc49a"} Nov 22 03:13:55 crc kubenswrapper[4922]: I1122 03:13:55.940433 4922 scope.go:117] "RemoveContainer" containerID="57c59b65bb8f39a7893e4223dc0637ad10e53bf8615e0dd01818c82e44b5c37b" Nov 22 03:13:55 crc kubenswrapper[4922]: I1122 03:13:55.940631 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-98d42" Nov 22 03:13:55 crc kubenswrapper[4922]: I1122 03:13:55.942109 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12612d75-bb67-40b7-a673-6a013bf15d76-kube-api-access-jdtzb" (OuterVolumeSpecName: "kube-api-access-jdtzb") pod "12612d75-bb67-40b7-a673-6a013bf15d76" (UID: "12612d75-bb67-40b7-a673-6a013bf15d76"). InnerVolumeSpecName "kube-api-access-jdtzb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:13:55 crc kubenswrapper[4922]: I1122 03:13:55.999196 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12612d75-bb67-40b7-a673-6a013bf15d76-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "12612d75-bb67-40b7-a673-6a013bf15d76" (UID: "12612d75-bb67-40b7-a673-6a013bf15d76"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:13:56 crc kubenswrapper[4922]: I1122 03:13:56.000560 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12612d75-bb67-40b7-a673-6a013bf15d76-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "12612d75-bb67-40b7-a673-6a013bf15d76" (UID: "12612d75-bb67-40b7-a673-6a013bf15d76"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:13:56 crc kubenswrapper[4922]: I1122 03:13:56.016924 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12612d75-bb67-40b7-a673-6a013bf15d76-config" (OuterVolumeSpecName: "config") pod "12612d75-bb67-40b7-a673-6a013bf15d76" (UID: "12612d75-bb67-40b7-a673-6a013bf15d76"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:13:56 crc kubenswrapper[4922]: I1122 03:13:56.019221 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12612d75-bb67-40b7-a673-6a013bf15d76-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "12612d75-bb67-40b7-a673-6a013bf15d76" (UID: "12612d75-bb67-40b7-a673-6a013bf15d76"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:13:56 crc kubenswrapper[4922]: I1122 03:13:56.024505 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12612d75-bb67-40b7-a673-6a013bf15d76-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "12612d75-bb67-40b7-a673-6a013bf15d76" (UID: "12612d75-bb67-40b7-a673-6a013bf15d76"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:13:56 crc kubenswrapper[4922]: I1122 03:13:56.028658 4922 scope.go:117] "RemoveContainer" containerID="8cd4d75099ff55643636c4189142bd6f1db19217dd8a8309ed492302320275b5" Nov 22 03:13:56 crc kubenswrapper[4922]: I1122 03:13:56.035624 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdtzb\" (UniqueName: \"kubernetes.io/projected/12612d75-bb67-40b7-a673-6a013bf15d76-kube-api-access-jdtzb\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:56 crc kubenswrapper[4922]: I1122 03:13:56.035656 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/12612d75-bb67-40b7-a673-6a013bf15d76-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:56 crc kubenswrapper[4922]: I1122 03:13:56.035668 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/12612d75-bb67-40b7-a673-6a013bf15d76-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:56 crc kubenswrapper[4922]: I1122 03:13:56.035680 4922 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12612d75-bb67-40b7-a673-6a013bf15d76-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:56 crc kubenswrapper[4922]: I1122 03:13:56.035693 4922 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/12612d75-bb67-40b7-a673-6a013bf15d76-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:56 crc kubenswrapper[4922]: I1122 03:13:56.035706 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12612d75-bb67-40b7-a673-6a013bf15d76-config\") on node \"crc\" DevicePath \"\"" Nov 22 03:13:56 crc kubenswrapper[4922]: I1122 03:13:56.304420 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-98d42"] Nov 22 03:13:56 crc kubenswrapper[4922]: I1122 03:13:56.312587 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-98d42"] Nov 22 03:13:57 crc kubenswrapper[4922]: I1122 03:13:57.322381 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12612d75-bb67-40b7-a673-6a013bf15d76" path="/var/lib/kubelet/pods/12612d75-bb67-40b7-a673-6a013bf15d76/volumes" Nov 22 03:13:57 crc kubenswrapper[4922]: I1122 03:13:57.675814 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 03:13:57 crc kubenswrapper[4922]: I1122 03:13:57.968452 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lwrsk" event={"ID":"688b25c4-de7f-4adb-bc7e-760847d2a6e2","Type":"ContainerStarted","Data":"99a44317093e19b689586028026fa9423290de32beb4f25617229d9042831666"} Nov 22 03:13:58 crc kubenswrapper[4922]: I1122 03:13:58.003000 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lwrsk" podStartSLOduration=1.846198029 podStartE2EDuration="15.00293876s" podCreationTimestamp="2025-11-22 03:13:43 +0000 UTC" firstStartedPulling="2025-11-22 03:13:44.515018446 +0000 UTC m=+1260.553540338" lastFinishedPulling="2025-11-22 03:13:57.671759187 +0000 UTC m=+1273.710281069" observedRunningTime="2025-11-22 03:13:57.992182402 +0000 UTC m=+1274.030704324" watchObservedRunningTime="2025-11-22 03:13:58.00293876 +0000 
UTC m=+1274.041460662" Nov 22 03:14:03 crc kubenswrapper[4922]: I1122 03:14:03.037385 4922 generic.go:334] "Generic (PLEG): container finished" podID="22048c2c-fb84-4f52-9868-c6f6074fab42" containerID="16482de708a19472c616c8b2f85c6924bd85948fb2c21b4faf7ae9fe39902401" exitCode=0 Nov 22 03:14:03 crc kubenswrapper[4922]: I1122 03:14:03.037505 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"22048c2c-fb84-4f52-9868-c6f6074fab42","Type":"ContainerDied","Data":"16482de708a19472c616c8b2f85c6924bd85948fb2c21b4faf7ae9fe39902401"} Nov 22 03:14:04 crc kubenswrapper[4922]: I1122 03:14:04.050105 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"22048c2c-fb84-4f52-9868-c6f6074fab42","Type":"ContainerStarted","Data":"ea7dea4b2c8d18eef61a83b754e73c3efd38328d1bde4758bb65d75c70fabe56"} Nov 22 03:14:04 crc kubenswrapper[4922]: I1122 03:14:04.051039 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Nov 22 03:14:04 crc kubenswrapper[4922]: I1122 03:14:04.078380 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.078362979 podStartE2EDuration="38.078362979s" podCreationTimestamp="2025-11-22 03:13:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:14:04.073079622 +0000 UTC m=+1280.111601524" watchObservedRunningTime="2025-11-22 03:14:04.078362979 +0000 UTC m=+1280.116884871" Nov 22 03:14:05 crc kubenswrapper[4922]: I1122 03:14:05.062739 4922 generic.go:334] "Generic (PLEG): container finished" podID="57a51c5d-616f-49ef-b320-e3ad9238cf44" containerID="ca69ebfa3af28c109835979389d08dd50e6bfcfa018a4fd7e40dabd75f7f2ace" exitCode=0 Nov 22 03:14:05 crc kubenswrapper[4922]: I1122 03:14:05.062827 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"57a51c5d-616f-49ef-b320-e3ad9238cf44","Type":"ContainerDied","Data":"ca69ebfa3af28c109835979389d08dd50e6bfcfa018a4fd7e40dabd75f7f2ace"} Nov 22 03:14:06 crc kubenswrapper[4922]: I1122 03:14:06.078303 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"57a51c5d-616f-49ef-b320-e3ad9238cf44","Type":"ContainerStarted","Data":"e4a9319551af24ea5d4295c7d2c775fe0b13bf47f902f3004b79c6085860d86f"} Nov 22 03:14:06 crc kubenswrapper[4922]: I1122 03:14:06.078994 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:14:06 crc kubenswrapper[4922]: I1122 03:14:06.112644 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=39.112626973 podStartE2EDuration="39.112626973s" podCreationTimestamp="2025-11-22 03:13:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:14:06.11039632 +0000 UTC m=+1282.148918222" watchObservedRunningTime="2025-11-22 03:14:06.112626973 +0000 UTC m=+1282.151148865" Nov 22 03:14:10 crc kubenswrapper[4922]: I1122 03:14:10.121424 4922 generic.go:334] "Generic (PLEG): container finished" podID="688b25c4-de7f-4adb-bc7e-760847d2a6e2" containerID="99a44317093e19b689586028026fa9423290de32beb4f25617229d9042831666" exitCode=0 Nov 22 03:14:10 crc kubenswrapper[4922]: I1122 03:14:10.121563 4922 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lwrsk" event={"ID":"688b25c4-de7f-4adb-bc7e-760847d2a6e2","Type":"ContainerDied","Data":"99a44317093e19b689586028026fa9423290de32beb4f25617229d9042831666"} Nov 22 03:14:11 crc kubenswrapper[4922]: I1122 03:14:11.110242 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 03:14:11 crc kubenswrapper[4922]: I1122 03:14:11.110641 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 03:14:11 crc kubenswrapper[4922]: I1122 03:14:11.579313 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lwrsk" Nov 22 03:14:11 crc kubenswrapper[4922]: I1122 03:14:11.672211 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/688b25c4-de7f-4adb-bc7e-760847d2a6e2-ssh-key\") pod \"688b25c4-de7f-4adb-bc7e-760847d2a6e2\" (UID: \"688b25c4-de7f-4adb-bc7e-760847d2a6e2\") " Nov 22 03:14:11 crc kubenswrapper[4922]: I1122 03:14:11.672295 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/688b25c4-de7f-4adb-bc7e-760847d2a6e2-repo-setup-combined-ca-bundle\") pod \"688b25c4-de7f-4adb-bc7e-760847d2a6e2\" (UID: \"688b25c4-de7f-4adb-bc7e-760847d2a6e2\") " Nov 22 03:14:11 crc kubenswrapper[4922]: I1122 03:14:11.672340 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/688b25c4-de7f-4adb-bc7e-760847d2a6e2-inventory\") pod \"688b25c4-de7f-4adb-bc7e-760847d2a6e2\" (UID: \"688b25c4-de7f-4adb-bc7e-760847d2a6e2\") " Nov 22 03:14:11 crc kubenswrapper[4922]: I1122 03:14:11.672462 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpr4h\" (UniqueName: \"kubernetes.io/projected/688b25c4-de7f-4adb-bc7e-760847d2a6e2-kube-api-access-rpr4h\") pod \"688b25c4-de7f-4adb-bc7e-760847d2a6e2\" (UID: \"688b25c4-de7f-4adb-bc7e-760847d2a6e2\") " Nov 22 03:14:11 crc kubenswrapper[4922]: I1122 03:14:11.684687 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/688b25c4-de7f-4adb-bc7e-760847d2a6e2-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "688b25c4-de7f-4adb-bc7e-760847d2a6e2" (UID: "688b25c4-de7f-4adb-bc7e-760847d2a6e2"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:14:11 crc kubenswrapper[4922]: I1122 03:14:11.684783 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/688b25c4-de7f-4adb-bc7e-760847d2a6e2-kube-api-access-rpr4h" (OuterVolumeSpecName: "kube-api-access-rpr4h") pod "688b25c4-de7f-4adb-bc7e-760847d2a6e2" (UID: "688b25c4-de7f-4adb-bc7e-760847d2a6e2"). InnerVolumeSpecName "kube-api-access-rpr4h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:14:11 crc kubenswrapper[4922]: I1122 03:14:11.704867 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/688b25c4-de7f-4adb-bc7e-760847d2a6e2-inventory" (OuterVolumeSpecName: "inventory") pod "688b25c4-de7f-4adb-bc7e-760847d2a6e2" (UID: "688b25c4-de7f-4adb-bc7e-760847d2a6e2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:14:11 crc kubenswrapper[4922]: I1122 03:14:11.712012 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/688b25c4-de7f-4adb-bc7e-760847d2a6e2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "688b25c4-de7f-4adb-bc7e-760847d2a6e2" (UID: "688b25c4-de7f-4adb-bc7e-760847d2a6e2"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:14:11 crc kubenswrapper[4922]: I1122 03:14:11.774829 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/688b25c4-de7f-4adb-bc7e-760847d2a6e2-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 03:14:11 crc kubenswrapper[4922]: I1122 03:14:11.774900 4922 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/688b25c4-de7f-4adb-bc7e-760847d2a6e2-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:14:11 crc kubenswrapper[4922]: I1122 03:14:11.774918 4922 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/688b25c4-de7f-4adb-bc7e-760847d2a6e2-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 03:14:11 crc kubenswrapper[4922]: I1122 03:14:11.774931 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rpr4h\" (UniqueName: \"kubernetes.io/projected/688b25c4-de7f-4adb-bc7e-760847d2a6e2-kube-api-access-rpr4h\") on node \"crc\" DevicePath \"\"" Nov 22 03:14:12 crc kubenswrapper[4922]: I1122 03:14:12.145993 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lwrsk" event={"ID":"688b25c4-de7f-4adb-bc7e-760847d2a6e2","Type":"ContainerDied","Data":"b87a301c0f854e32b2fd4df74c81a0199f6ae433736c9a8958cea8322063ef42"} Nov 22 03:14:12 crc kubenswrapper[4922]: I1122 03:14:12.146042 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b87a301c0f854e32b2fd4df74c81a0199f6ae433736c9a8958cea8322063ef42" Nov 22 03:14:12 crc kubenswrapper[4922]: I1122 03:14:12.146072 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lwrsk" Nov 22 03:14:12 crc kubenswrapper[4922]: I1122 03:14:12.254230 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbmp5"] Nov 22 03:14:12 crc kubenswrapper[4922]: E1122 03:14:12.254739 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12612d75-bb67-40b7-a673-6a013bf15d76" containerName="init" Nov 22 03:14:12 crc kubenswrapper[4922]: I1122 03:14:12.254760 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="12612d75-bb67-40b7-a673-6a013bf15d76" containerName="init" Nov 22 03:14:12 crc kubenswrapper[4922]: E1122 03:14:12.254780 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12612d75-bb67-40b7-a673-6a013bf15d76" containerName="dnsmasq-dns" Nov 22 03:14:12 crc kubenswrapper[4922]: I1122 03:14:12.254789 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="12612d75-bb67-40b7-a673-6a013bf15d76" containerName="dnsmasq-dns" Nov 22 03:14:12 crc kubenswrapper[4922]: E1122 03:14:12.254819 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="688b25c4-de7f-4adb-bc7e-760847d2a6e2" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Nov 22 03:14:12 crc kubenswrapper[4922]: I1122 03:14:12.254828 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="688b25c4-de7f-4adb-bc7e-760847d2a6e2" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Nov 22 03:14:12 crc kubenswrapper[4922]: I1122 03:14:12.255086 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="688b25c4-de7f-4adb-bc7e-760847d2a6e2" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Nov 22 03:14:12 crc kubenswrapper[4922]: I1122 03:14:12.255112 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="12612d75-bb67-40b7-a673-6a013bf15d76" containerName="dnsmasq-dns" Nov 22 03:14:12 crc kubenswrapper[4922]: I1122 03:14:12.255860 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbmp5" Nov 22 03:14:12 crc kubenswrapper[4922]: I1122 03:14:12.265314 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbmp5"] Nov 22 03:14:12 crc kubenswrapper[4922]: I1122 03:14:12.266300 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 22 03:14:12 crc kubenswrapper[4922]: I1122 03:14:12.266739 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 22 03:14:12 crc kubenswrapper[4922]: I1122 03:14:12.267721 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 03:14:12 crc kubenswrapper[4922]: I1122 03:14:12.267940 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wzhgr" Nov 22 03:14:12 crc kubenswrapper[4922]: I1122 03:14:12.385600 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqc2j\" (UniqueName: \"kubernetes.io/projected/cd426f86-ec43-4a7a-bde1-0430f01503f6-kube-api-access-zqc2j\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sbmp5\" (UID: \"cd426f86-ec43-4a7a-bde1-0430f01503f6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbmp5" Nov 22 03:14:12 crc kubenswrapper[4922]: I1122 03:14:12.385697 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd426f86-ec43-4a7a-bde1-0430f01503f6-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sbmp5\" (UID: \"cd426f86-ec43-4a7a-bde1-0430f01503f6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbmp5" Nov 22 03:14:12 crc kubenswrapper[4922]: I1122 03:14:12.385806 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd426f86-ec43-4a7a-bde1-0430f01503f6-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sbmp5\" (UID: \"cd426f86-ec43-4a7a-bde1-0430f01503f6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbmp5" Nov 22 03:14:12 crc kubenswrapper[4922]: I1122 03:14:12.386811 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cd426f86-ec43-4a7a-bde1-0430f01503f6-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sbmp5\" (UID: \"cd426f86-ec43-4a7a-bde1-0430f01503f6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbmp5" Nov 22 03:14:12 crc kubenswrapper[4922]: I1122 03:14:12.488891 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cd426f86-ec43-4a7a-bde1-0430f01503f6-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sbmp5\" (UID: \"cd426f86-ec43-4a7a-bde1-0430f01503f6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbmp5" Nov 22 03:14:12 crc kubenswrapper[4922]: I1122 03:14:12.489665 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqc2j\" (UniqueName: \"kubernetes.io/projected/cd426f86-ec43-4a7a-bde1-0430f01503f6-kube-api-access-zqc2j\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-sbmp5\" (UID: \"cd426f86-ec43-4a7a-bde1-0430f01503f6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbmp5" Nov 22 03:14:12 crc kubenswrapper[4922]: I1122 03:14:12.489705 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd426f86-ec43-4a7a-bde1-0430f01503f6-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sbmp5\" (UID: \"cd426f86-ec43-4a7a-bde1-0430f01503f6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbmp5" Nov 22 03:14:12 crc kubenswrapper[4922]: I1122 03:14:12.489747 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd426f86-ec43-4a7a-bde1-0430f01503f6-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sbmp5\" (UID: \"cd426f86-ec43-4a7a-bde1-0430f01503f6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbmp5" Nov 22 03:14:12 crc kubenswrapper[4922]: I1122 03:14:12.494287 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cd426f86-ec43-4a7a-bde1-0430f01503f6-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sbmp5\" (UID: \"cd426f86-ec43-4a7a-bde1-0430f01503f6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbmp5" Nov 22 03:14:12 crc kubenswrapper[4922]: I1122 03:14:12.494578 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd426f86-ec43-4a7a-bde1-0430f01503f6-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sbmp5\" (UID: \"cd426f86-ec43-4a7a-bde1-0430f01503f6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbmp5" Nov 22 03:14:12 crc kubenswrapper[4922]: I1122 03:14:12.494757 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd426f86-ec43-4a7a-bde1-0430f01503f6-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sbmp5\" (UID: \"cd426f86-ec43-4a7a-bde1-0430f01503f6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbmp5" Nov 22 03:14:12 crc kubenswrapper[4922]: I1122 03:14:12.525507 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqc2j\" (UniqueName: \"kubernetes.io/projected/cd426f86-ec43-4a7a-bde1-0430f01503f6-kube-api-access-zqc2j\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sbmp5\" (UID: \"cd426f86-ec43-4a7a-bde1-0430f01503f6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbmp5" Nov 22 03:14:12 crc kubenswrapper[4922]: I1122 03:14:12.582358 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbmp5" Nov 22 03:14:13 crc kubenswrapper[4922]: I1122 03:14:13.208895 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbmp5"] Nov 22 03:14:14 crc kubenswrapper[4922]: I1122 03:14:14.168475 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbmp5" event={"ID":"cd426f86-ec43-4a7a-bde1-0430f01503f6","Type":"ContainerStarted","Data":"c26720a596a2ae976d134fbbc1aec3552ffeaa26bc38f1a607940d6c41abb28f"} Nov 22 03:14:14 crc kubenswrapper[4922]: I1122 03:14:14.168897 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbmp5" event={"ID":"cd426f86-ec43-4a7a-bde1-0430f01503f6","Type":"ContainerStarted","Data":"78d27f8823d5761bba9e2e878b8d09c5162e8d5bb63844a14c7f95391a4b2adb"} Nov 22 03:14:14 crc kubenswrapper[4922]: I1122 03:14:14.196736 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbmp5" podStartSLOduration=1.681329554 podStartE2EDuration="2.196719645s" podCreationTimestamp="2025-11-22 03:14:12 +0000 UTC" firstStartedPulling="2025-11-22 03:14:13.221859813 +0000 UTC m=+1289.260381715" lastFinishedPulling="2025-11-22 03:14:13.737249894 +0000 UTC m=+1289.775771806" observedRunningTime="2025-11-22 03:14:14.193775295 +0000 UTC m=+1290.232297207" watchObservedRunningTime="2025-11-22 03:14:14.196719645 +0000 UTC m=+1290.235241557" Nov 22 03:14:17 crc kubenswrapper[4922]: I1122 03:14:17.015202 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Nov 22 03:14:17 crc kubenswrapper[4922]: I1122 03:14:17.981126 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Nov 22 03:14:41 crc kubenswrapper[4922]: I1122 03:14:41.109581 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 03:14:41 crc kubenswrapper[4922]: I1122 03:14:41.110269 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 03:14:41 crc kubenswrapper[4922]: I1122 03:14:41.110339 4922 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" Nov 22 03:14:41 crc kubenswrapper[4922]: I1122 03:14:41.111201 4922 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"40d76ae6e06f9784b7400ca69b8cc35e187346fc7004ab985de1150eb16fab3d"} pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 03:14:41 crc kubenswrapper[4922]: I1122 03:14:41.111300 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" 
podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" containerID="cri-o://40d76ae6e06f9784b7400ca69b8cc35e187346fc7004ab985de1150eb16fab3d" gracePeriod=600 Nov 22 03:14:41 crc kubenswrapper[4922]: I1122 03:14:41.501470 4922 generic.go:334] "Generic (PLEG): container finished" podID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerID="40d76ae6e06f9784b7400ca69b8cc35e187346fc7004ab985de1150eb16fab3d" exitCode=0 Nov 22 03:14:41 crc kubenswrapper[4922]: I1122 03:14:41.501533 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" event={"ID":"402683b1-a29f-4a79-a36c-daf6e8068d0d","Type":"ContainerDied","Data":"40d76ae6e06f9784b7400ca69b8cc35e187346fc7004ab985de1150eb16fab3d"} Nov 22 03:14:41 crc kubenswrapper[4922]: I1122 03:14:41.502081 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" event={"ID":"402683b1-a29f-4a79-a36c-daf6e8068d0d","Type":"ContainerStarted","Data":"80e63ed196d529fe9e5c2ffb57fde09ab72342822511a120b7e3fb0cefdec1eb"} Nov 22 03:14:41 crc kubenswrapper[4922]: I1122 03:14:41.502113 4922 scope.go:117] "RemoveContainer" containerID="a719d1a6509057cf085f1c2768ec28cb47fdbdd817caffc2f3d5d452e6b5e16a" Nov 22 03:14:55 crc kubenswrapper[4922]: I1122 03:14:55.205979 4922 scope.go:117] "RemoveContainer" containerID="74c791f0c27f0f6c251f54a96fef1c7a90cf8c8d566e575f4f5c5deeedd77935" Nov 22 03:14:55 crc kubenswrapper[4922]: I1122 03:14:55.264440 4922 scope.go:117] "RemoveContainer" containerID="f2132549d2f271f23dcd0651695be5dc9f87a2c1059a8e1a4eef0a8e739eef51" Nov 22 03:15:00 crc kubenswrapper[4922]: I1122 03:15:00.173407 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396355-956ws"] Nov 22 03:15:00 crc kubenswrapper[4922]: I1122 03:15:00.176363 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396355-956ws" Nov 22 03:15:00 crc kubenswrapper[4922]: I1122 03:15:00.179249 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 22 03:15:00 crc kubenswrapper[4922]: I1122 03:15:00.179582 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 22 03:15:00 crc kubenswrapper[4922]: I1122 03:15:00.201231 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396355-956ws"] Nov 22 03:15:00 crc kubenswrapper[4922]: I1122 03:15:00.246821 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l42vf\" (UniqueName: \"kubernetes.io/projected/47eee278-a04a-4e50-98f2-98db1b7caa21-kube-api-access-l42vf\") pod \"collect-profiles-29396355-956ws\" (UID: \"47eee278-a04a-4e50-98f2-98db1b7caa21\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396355-956ws" Nov 22 03:15:00 crc kubenswrapper[4922]: I1122 03:15:00.247241 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/47eee278-a04a-4e50-98f2-98db1b7caa21-secret-volume\") pod \"collect-profiles-29396355-956ws\" (UID: \"47eee278-a04a-4e50-98f2-98db1b7caa21\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396355-956ws" Nov 22 03:15:00 crc kubenswrapper[4922]: I1122 03:15:00.247351 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/47eee278-a04a-4e50-98f2-98db1b7caa21-config-volume\") pod \"collect-profiles-29396355-956ws\" (UID: \"47eee278-a04a-4e50-98f2-98db1b7caa21\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396355-956ws" Nov 22 03:15:00 crc kubenswrapper[4922]: I1122 03:15:00.349898 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l42vf\" (UniqueName: \"kubernetes.io/projected/47eee278-a04a-4e50-98f2-98db1b7caa21-kube-api-access-l42vf\") pod \"collect-profiles-29396355-956ws\" (UID: \"47eee278-a04a-4e50-98f2-98db1b7caa21\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396355-956ws" Nov 22 03:15:00 crc kubenswrapper[4922]: I1122 03:15:00.349997 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/47eee278-a04a-4e50-98f2-98db1b7caa21-secret-volume\") pod \"collect-profiles-29396355-956ws\" (UID: \"47eee278-a04a-4e50-98f2-98db1b7caa21\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396355-956ws" Nov 22 03:15:00 crc kubenswrapper[4922]: I1122 03:15:00.350183 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/47eee278-a04a-4e50-98f2-98db1b7caa21-config-volume\") pod \"collect-profiles-29396355-956ws\" (UID: \"47eee278-a04a-4e50-98f2-98db1b7caa21\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396355-956ws" Nov 22 03:15:00 crc kubenswrapper[4922]: I1122 03:15:00.353491 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/47eee278-a04a-4e50-98f2-98db1b7caa21-config-volume\") pod 
\"collect-profiles-29396355-956ws\" (UID: \"47eee278-a04a-4e50-98f2-98db1b7caa21\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396355-956ws" Nov 22 03:15:00 crc kubenswrapper[4922]: I1122 03:15:00.357426 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/47eee278-a04a-4e50-98f2-98db1b7caa21-secret-volume\") pod \"collect-profiles-29396355-956ws\" (UID: \"47eee278-a04a-4e50-98f2-98db1b7caa21\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396355-956ws" Nov 22 03:15:00 crc kubenswrapper[4922]: I1122 03:15:00.386325 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l42vf\" (UniqueName: \"kubernetes.io/projected/47eee278-a04a-4e50-98f2-98db1b7caa21-kube-api-access-l42vf\") pod \"collect-profiles-29396355-956ws\" (UID: \"47eee278-a04a-4e50-98f2-98db1b7caa21\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396355-956ws" Nov 22 03:15:00 crc kubenswrapper[4922]: I1122 03:15:00.506899 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396355-956ws" Nov 22 03:15:00 crc kubenswrapper[4922]: I1122 03:15:00.901056 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396355-956ws"] Nov 22 03:15:01 crc kubenswrapper[4922]: I1122 03:15:01.795966 4922 generic.go:334] "Generic (PLEG): container finished" podID="47eee278-a04a-4e50-98f2-98db1b7caa21" containerID="e3523d1601cfec04fa711b7fb5b039fd644edf4780a65d22606723f2d1f94df2" exitCode=0 Nov 22 03:15:01 crc kubenswrapper[4922]: I1122 03:15:01.796028 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396355-956ws" event={"ID":"47eee278-a04a-4e50-98f2-98db1b7caa21","Type":"ContainerDied","Data":"e3523d1601cfec04fa711b7fb5b039fd644edf4780a65d22606723f2d1f94df2"} Nov 22 03:15:01 crc kubenswrapper[4922]: I1122 03:15:01.796122 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396355-956ws" event={"ID":"47eee278-a04a-4e50-98f2-98db1b7caa21","Type":"ContainerStarted","Data":"6734d36a3c7cb797247bb78b495f17b243b7f87302333ef6fdc0046986cce44a"} Nov 22 03:15:03 crc kubenswrapper[4922]: I1122 03:15:03.151866 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396355-956ws" Nov 22 03:15:03 crc kubenswrapper[4922]: I1122 03:15:03.308898 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/47eee278-a04a-4e50-98f2-98db1b7caa21-secret-volume\") pod \"47eee278-a04a-4e50-98f2-98db1b7caa21\" (UID: \"47eee278-a04a-4e50-98f2-98db1b7caa21\") " Nov 22 03:15:03 crc kubenswrapper[4922]: I1122 03:15:03.308982 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l42vf\" (UniqueName: \"kubernetes.io/projected/47eee278-a04a-4e50-98f2-98db1b7caa21-kube-api-access-l42vf\") pod \"47eee278-a04a-4e50-98f2-98db1b7caa21\" (UID: \"47eee278-a04a-4e50-98f2-98db1b7caa21\") " Nov 22 03:15:03 crc kubenswrapper[4922]: I1122 03:15:03.309335 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/47eee278-a04a-4e50-98f2-98db1b7caa21-config-volume\") pod \"47eee278-a04a-4e50-98f2-98db1b7caa21\" (UID: \"47eee278-a04a-4e50-98f2-98db1b7caa21\") " Nov 22 03:15:03 crc kubenswrapper[4922]: I1122 03:15:03.310381 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47eee278-a04a-4e50-98f2-98db1b7caa21-config-volume" (OuterVolumeSpecName: "config-volume") pod "47eee278-a04a-4e50-98f2-98db1b7caa21" (UID: "47eee278-a04a-4e50-98f2-98db1b7caa21"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:15:03 crc kubenswrapper[4922]: I1122 03:15:03.311159 4922 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/47eee278-a04a-4e50-98f2-98db1b7caa21-config-volume\") on node \"crc\" DevicePath \"\"" Nov 22 03:15:03 crc kubenswrapper[4922]: I1122 03:15:03.318000 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47eee278-a04a-4e50-98f2-98db1b7caa21-kube-api-access-l42vf" (OuterVolumeSpecName: "kube-api-access-l42vf") pod "47eee278-a04a-4e50-98f2-98db1b7caa21" (UID: "47eee278-a04a-4e50-98f2-98db1b7caa21"). InnerVolumeSpecName "kube-api-access-l42vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:15:03 crc kubenswrapper[4922]: I1122 03:15:03.320172 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47eee278-a04a-4e50-98f2-98db1b7caa21-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "47eee278-a04a-4e50-98f2-98db1b7caa21" (UID: "47eee278-a04a-4e50-98f2-98db1b7caa21"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:15:03 crc kubenswrapper[4922]: I1122 03:15:03.413381 4922 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/47eee278-a04a-4e50-98f2-98db1b7caa21-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 22 03:15:03 crc kubenswrapper[4922]: I1122 03:15:03.413434 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l42vf\" (UniqueName: \"kubernetes.io/projected/47eee278-a04a-4e50-98f2-98db1b7caa21-kube-api-access-l42vf\") on node \"crc\" DevicePath \"\"" Nov 22 03:15:03 crc kubenswrapper[4922]: I1122 03:15:03.819594 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396355-956ws" event={"ID":"47eee278-a04a-4e50-98f2-98db1b7caa21","Type":"ContainerDied","Data":"6734d36a3c7cb797247bb78b495f17b243b7f87302333ef6fdc0046986cce44a"} Nov 22 03:15:03 crc kubenswrapper[4922]: I1122 03:15:03.820060 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6734d36a3c7cb797247bb78b495f17b243b7f87302333ef6fdc0046986cce44a" Nov 22 03:15:03 crc kubenswrapper[4922]: I1122 03:15:03.819682 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396355-956ws" Nov 22 03:15:55 crc kubenswrapper[4922]: I1122 03:15:55.385824 4922 scope.go:117] "RemoveContainer" containerID="4923f77f1af12b4b9198a0936369ff3d3a997d7c6abfc51d014cac2563974f0a" Nov 22 03:15:55 crc kubenswrapper[4922]: I1122 03:15:55.447902 4922 scope.go:117] "RemoveContainer" containerID="f6ed80e0dd0701393dc9edbf59d065ba69eed83068bd4025f0a2bddc44448c09" Nov 22 03:16:41 crc kubenswrapper[4922]: I1122 03:16:41.110308 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 03:16:41 crc kubenswrapper[4922]: I1122 03:16:41.111016 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 03:17:11 crc kubenswrapper[4922]: I1122 03:17:11.110445 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 03:17:11 crc kubenswrapper[4922]: I1122 03:17:11.111402 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 03:17:25 crc kubenswrapper[4922]: I1122 03:17:25.504135 4922 generic.go:334] "Generic (PLEG): container finished" podID="cd426f86-ec43-4a7a-bde1-0430f01503f6" containerID="c26720a596a2ae976d134fbbc1aec3552ffeaa26bc38f1a607940d6c41abb28f" exitCode=0 Nov 22 03:17:25 crc kubenswrapper[4922]: I1122 03:17:25.504254 4922 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbmp5" event={"ID":"cd426f86-ec43-4a7a-bde1-0430f01503f6","Type":"ContainerDied","Data":"c26720a596a2ae976d134fbbc1aec3552ffeaa26bc38f1a607940d6c41abb28f"} Nov 22 03:17:27 crc kubenswrapper[4922]: I1122 03:17:27.003196 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbmp5" Nov 22 03:17:27 crc kubenswrapper[4922]: I1122 03:17:27.128038 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd426f86-ec43-4a7a-bde1-0430f01503f6-inventory\") pod \"cd426f86-ec43-4a7a-bde1-0430f01503f6\" (UID: \"cd426f86-ec43-4a7a-bde1-0430f01503f6\") " Nov 22 03:17:27 crc kubenswrapper[4922]: I1122 03:17:27.128086 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqc2j\" (UniqueName: \"kubernetes.io/projected/cd426f86-ec43-4a7a-bde1-0430f01503f6-kube-api-access-zqc2j\") pod \"cd426f86-ec43-4a7a-bde1-0430f01503f6\" (UID: \"cd426f86-ec43-4a7a-bde1-0430f01503f6\") " Nov 22 03:17:27 crc kubenswrapper[4922]: I1122 03:17:27.128135 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd426f86-ec43-4a7a-bde1-0430f01503f6-bootstrap-combined-ca-bundle\") pod \"cd426f86-ec43-4a7a-bde1-0430f01503f6\" (UID: \"cd426f86-ec43-4a7a-bde1-0430f01503f6\") " Nov 22 03:17:27 crc kubenswrapper[4922]: I1122 03:17:27.128186 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cd426f86-ec43-4a7a-bde1-0430f01503f6-ssh-key\") pod \"cd426f86-ec43-4a7a-bde1-0430f01503f6\" (UID: \"cd426f86-ec43-4a7a-bde1-0430f01503f6\") " Nov 22 03:17:27 crc kubenswrapper[4922]: I1122 03:17:27.136733 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd426f86-ec43-4a7a-bde1-0430f01503f6-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "cd426f86-ec43-4a7a-bde1-0430f01503f6" (UID: "cd426f86-ec43-4a7a-bde1-0430f01503f6"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:17:27 crc kubenswrapper[4922]: I1122 03:17:27.137246 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd426f86-ec43-4a7a-bde1-0430f01503f6-kube-api-access-zqc2j" (OuterVolumeSpecName: "kube-api-access-zqc2j") pod "cd426f86-ec43-4a7a-bde1-0430f01503f6" (UID: "cd426f86-ec43-4a7a-bde1-0430f01503f6"). InnerVolumeSpecName "kube-api-access-zqc2j". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:17:27 crc kubenswrapper[4922]: I1122 03:17:27.168986 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd426f86-ec43-4a7a-bde1-0430f01503f6-inventory" (OuterVolumeSpecName: "inventory") pod "cd426f86-ec43-4a7a-bde1-0430f01503f6" (UID: "cd426f86-ec43-4a7a-bde1-0430f01503f6"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:17:27 crc kubenswrapper[4922]: I1122 03:17:27.176175 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd426f86-ec43-4a7a-bde1-0430f01503f6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "cd426f86-ec43-4a7a-bde1-0430f01503f6" (UID: "cd426f86-ec43-4a7a-bde1-0430f01503f6"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:17:27 crc kubenswrapper[4922]: I1122 03:17:27.230886 4922 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd426f86-ec43-4a7a-bde1-0430f01503f6-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 03:17:27 crc kubenswrapper[4922]: I1122 03:17:27.230924 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqc2j\" (UniqueName: \"kubernetes.io/projected/cd426f86-ec43-4a7a-bde1-0430f01503f6-kube-api-access-zqc2j\") on node \"crc\" DevicePath \"\"" Nov 22 03:17:27 crc kubenswrapper[4922]: I1122 03:17:27.230939 4922 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd426f86-ec43-4a7a-bde1-0430f01503f6-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:17:27 crc kubenswrapper[4922]: I1122 03:17:27.230954 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cd426f86-ec43-4a7a-bde1-0430f01503f6-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 03:17:27 crc kubenswrapper[4922]: I1122 03:17:27.526175 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbmp5" event={"ID":"cd426f86-ec43-4a7a-bde1-0430f01503f6","Type":"ContainerDied","Data":"78d27f8823d5761bba9e2e878b8d09c5162e8d5bb63844a14c7f95391a4b2adb"} Nov 22 03:17:27 crc kubenswrapper[4922]: I1122 03:17:27.526238 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78d27f8823d5761bba9e2e878b8d09c5162e8d5bb63844a14c7f95391a4b2adb" Nov 22 03:17:27 crc kubenswrapper[4922]: I1122 03:17:27.526244 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbmp5" Nov 22 03:17:27 crc kubenswrapper[4922]: I1122 03:17:27.650245 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lzs9m"] Nov 22 03:17:27 crc kubenswrapper[4922]: E1122 03:17:27.650785 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47eee278-a04a-4e50-98f2-98db1b7caa21" containerName="collect-profiles" Nov 22 03:17:27 crc kubenswrapper[4922]: I1122 03:17:27.650813 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="47eee278-a04a-4e50-98f2-98db1b7caa21" containerName="collect-profiles" Nov 22 03:17:27 crc kubenswrapper[4922]: E1122 03:17:27.650893 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd426f86-ec43-4a7a-bde1-0430f01503f6" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Nov 22 03:17:27 crc kubenswrapper[4922]: I1122 03:17:27.650908 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd426f86-ec43-4a7a-bde1-0430f01503f6" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Nov 22 03:17:27 crc kubenswrapper[4922]: I1122 03:17:27.651222 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="47eee278-a04a-4e50-98f2-98db1b7caa21" containerName="collect-profiles" Nov 22 03:17:27 crc kubenswrapper[4922]: I1122 03:17:27.651251 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd426f86-ec43-4a7a-bde1-0430f01503f6" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Nov 22 03:17:27 crc kubenswrapper[4922]: I1122 03:17:27.652237 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lzs9m" Nov 22 03:17:27 crc kubenswrapper[4922]: I1122 03:17:27.663515 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lzs9m"] Nov 22 03:17:27 crc kubenswrapper[4922]: I1122 03:17:27.683307 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 22 03:17:27 crc kubenswrapper[4922]: I1122 03:17:27.684757 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 03:17:27 crc kubenswrapper[4922]: I1122 03:17:27.686011 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wzhgr" Nov 22 03:17:27 crc kubenswrapper[4922]: I1122 03:17:27.686169 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 22 03:17:27 crc kubenswrapper[4922]: I1122 03:17:27.741733 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/203fb5ca-ff37-4321-a532-1fe2103cc82d-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lzs9m\" (UID: \"203fb5ca-ff37-4321-a532-1fe2103cc82d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lzs9m" Nov 22 03:17:27 crc kubenswrapper[4922]: I1122 03:17:27.742164 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5q7t\" (UniqueName: \"kubernetes.io/projected/203fb5ca-ff37-4321-a532-1fe2103cc82d-kube-api-access-w5q7t\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lzs9m\" (UID: 
\"203fb5ca-ff37-4321-a532-1fe2103cc82d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lzs9m" Nov 22 03:17:27 crc kubenswrapper[4922]: I1122 03:17:27.742227 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/203fb5ca-ff37-4321-a532-1fe2103cc82d-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lzs9m\" (UID: \"203fb5ca-ff37-4321-a532-1fe2103cc82d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lzs9m" Nov 22 03:17:27 crc kubenswrapper[4922]: I1122 03:17:27.843609 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/203fb5ca-ff37-4321-a532-1fe2103cc82d-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lzs9m\" (UID: \"203fb5ca-ff37-4321-a532-1fe2103cc82d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lzs9m" Nov 22 03:17:27 crc kubenswrapper[4922]: I1122 03:17:27.843724 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5q7t\" (UniqueName: \"kubernetes.io/projected/203fb5ca-ff37-4321-a532-1fe2103cc82d-kube-api-access-w5q7t\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lzs9m\" (UID: \"203fb5ca-ff37-4321-a532-1fe2103cc82d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lzs9m" Nov 22 03:17:27 crc kubenswrapper[4922]: I1122 03:17:27.843752 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/203fb5ca-ff37-4321-a532-1fe2103cc82d-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lzs9m\" (UID: \"203fb5ca-ff37-4321-a532-1fe2103cc82d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lzs9m" Nov 22 03:17:27 crc kubenswrapper[4922]: I1122 03:17:27.849950 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/203fb5ca-ff37-4321-a532-1fe2103cc82d-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lzs9m\" (UID: \"203fb5ca-ff37-4321-a532-1fe2103cc82d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lzs9m" Nov 22 03:17:27 crc kubenswrapper[4922]: I1122 03:17:27.854646 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/203fb5ca-ff37-4321-a532-1fe2103cc82d-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lzs9m\" (UID: \"203fb5ca-ff37-4321-a532-1fe2103cc82d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lzs9m" Nov 22 03:17:27 crc kubenswrapper[4922]: I1122 03:17:27.862068 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5q7t\" (UniqueName: \"kubernetes.io/projected/203fb5ca-ff37-4321-a532-1fe2103cc82d-kube-api-access-w5q7t\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-lzs9m\" (UID: \"203fb5ca-ff37-4321-a532-1fe2103cc82d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lzs9m" Nov 22 03:17:28 crc kubenswrapper[4922]: I1122 03:17:28.002712 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lzs9m" Nov 22 03:17:28 crc kubenswrapper[4922]: I1122 03:17:28.595063 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lzs9m"] Nov 22 03:17:28 crc kubenswrapper[4922]: W1122 03:17:28.618696 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod203fb5ca_ff37_4321_a532_1fe2103cc82d.slice/crio-72ec1cd1eaa368c7c057896b03c7440f5792a4bb864fea4e8a928ddc6b8b46f6 WatchSource:0}: Error finding container 72ec1cd1eaa368c7c057896b03c7440f5792a4bb864fea4e8a928ddc6b8b46f6: Status 404 returned error can't find the container with id 72ec1cd1eaa368c7c057896b03c7440f5792a4bb864fea4e8a928ddc6b8b46f6 Nov 22 03:17:29 crc kubenswrapper[4922]: I1122 03:17:29.551829 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lzs9m" event={"ID":"203fb5ca-ff37-4321-a532-1fe2103cc82d","Type":"ContainerStarted","Data":"6a8371ea7583bc06b848889b966ef4a937fce6d6d52a83154c6cb9d1e5594b38"} Nov 22 03:17:29 crc kubenswrapper[4922]: I1122 03:17:29.552366 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lzs9m" event={"ID":"203fb5ca-ff37-4321-a532-1fe2103cc82d","Type":"ContainerStarted","Data":"72ec1cd1eaa368c7c057896b03c7440f5792a4bb864fea4e8a928ddc6b8b46f6"} Nov 22 03:17:29 crc kubenswrapper[4922]: I1122 03:17:29.584687 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lzs9m" podStartSLOduration=2.049347897 podStartE2EDuration="2.584653131s" podCreationTimestamp="2025-11-22 03:17:27 +0000 UTC" firstStartedPulling="2025-11-22 03:17:28.622064782 +0000 UTC m=+1484.660586674" lastFinishedPulling="2025-11-22 03:17:29.157370006 +0000 UTC m=+1485.195891908" observedRunningTime="2025-11-22 03:17:29.575828699 +0000 UTC m=+1485.614350681" watchObservedRunningTime="2025-11-22 03:17:29.584653131 +0000 UTC m=+1485.623175073" Nov 22 03:17:32 crc kubenswrapper[4922]: I1122 03:17:32.210603 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jcc87"] Nov 22 03:17:32 crc kubenswrapper[4922]: I1122 03:17:32.214105 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jcc87" Nov 22 03:17:32 crc kubenswrapper[4922]: I1122 03:17:32.221421 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jcc87"] Nov 22 03:17:32 crc kubenswrapper[4922]: I1122 03:17:32.343804 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ddf9ce5e-ba02-46af-94be-a4793c072412-utilities\") pod \"certified-operators-jcc87\" (UID: \"ddf9ce5e-ba02-46af-94be-a4793c072412\") " pod="openshift-marketplace/certified-operators-jcc87" Nov 22 03:17:32 crc kubenswrapper[4922]: I1122 03:17:32.343869 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddf9ce5e-ba02-46af-94be-a4793c072412-catalog-content\") pod \"certified-operators-jcc87\" (UID: \"ddf9ce5e-ba02-46af-94be-a4793c072412\") " pod="openshift-marketplace/certified-operators-jcc87" Nov 22 03:17:32 crc kubenswrapper[4922]: I1122 03:17:32.343950 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jl9v\" (UniqueName: \"kubernetes.io/projected/ddf9ce5e-ba02-46af-94be-a4793c072412-kube-api-access-5jl9v\") pod \"certified-operators-jcc87\" (UID: \"ddf9ce5e-ba02-46af-94be-a4793c072412\") " pod="openshift-marketplace/certified-operators-jcc87" Nov 22 03:17:32 crc kubenswrapper[4922]: I1122 03:17:32.445566 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ddf9ce5e-ba02-46af-94be-a4793c072412-utilities\") pod \"certified-operators-jcc87\" (UID: \"ddf9ce5e-ba02-46af-94be-a4793c072412\") " pod="openshift-marketplace/certified-operators-jcc87" Nov 22 03:17:32 crc kubenswrapper[4922]: I1122 03:17:32.445640 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddf9ce5e-ba02-46af-94be-a4793c072412-catalog-content\") pod \"certified-operators-jcc87\" (UID: \"ddf9ce5e-ba02-46af-94be-a4793c072412\") " pod="openshift-marketplace/certified-operators-jcc87" Nov 22 03:17:32 crc kubenswrapper[4922]: I1122 03:17:32.445775 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jl9v\" (UniqueName: \"kubernetes.io/projected/ddf9ce5e-ba02-46af-94be-a4793c072412-kube-api-access-5jl9v\") pod \"certified-operators-jcc87\" (UID: \"ddf9ce5e-ba02-46af-94be-a4793c072412\") " pod="openshift-marketplace/certified-operators-jcc87" Nov 22 03:17:32 crc kubenswrapper[4922]: I1122 03:17:32.448225 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ddf9ce5e-ba02-46af-94be-a4793c072412-utilities\") pod \"certified-operators-jcc87\" (UID: \"ddf9ce5e-ba02-46af-94be-a4793c072412\") " pod="openshift-marketplace/certified-operators-jcc87" Nov 22 03:17:32 crc kubenswrapper[4922]: I1122 03:17:32.448476 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddf9ce5e-ba02-46af-94be-a4793c072412-catalog-content\") pod \"certified-operators-jcc87\" (UID: \"ddf9ce5e-ba02-46af-94be-a4793c072412\") " pod="openshift-marketplace/certified-operators-jcc87" Nov 22 03:17:32 crc kubenswrapper[4922]: I1122 03:17:32.466902 4922 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-5jl9v\" (UniqueName: \"kubernetes.io/projected/ddf9ce5e-ba02-46af-94be-a4793c072412-kube-api-access-5jl9v\") pod \"certified-operators-jcc87\" (UID: \"ddf9ce5e-ba02-46af-94be-a4793c072412\") " pod="openshift-marketplace/certified-operators-jcc87" Nov 22 03:17:32 crc kubenswrapper[4922]: I1122 03:17:32.547641 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jcc87" Nov 22 03:17:33 crc kubenswrapper[4922]: I1122 03:17:33.049479 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jcc87"] Nov 22 03:17:33 crc kubenswrapper[4922]: W1122 03:17:33.057714 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podddf9ce5e_ba02_46af_94be_a4793c072412.slice/crio-aabe01ad97a1b2575b1af70b64b5e28cb25060dd907f071443b3f7fe6b9f5853 WatchSource:0}: Error finding container aabe01ad97a1b2575b1af70b64b5e28cb25060dd907f071443b3f7fe6b9f5853: Status 404 returned error can't find the container with id aabe01ad97a1b2575b1af70b64b5e28cb25060dd907f071443b3f7fe6b9f5853 Nov 22 03:17:33 crc kubenswrapper[4922]: E1122 03:17:33.485443 4922 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podddf9ce5e_ba02_46af_94be_a4793c072412.slice/crio-991f71ff283ca9801298c347f0a9d505df6b33d1df8abdcdaa2169fb316ab867.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podddf9ce5e_ba02_46af_94be_a4793c072412.slice/crio-conmon-991f71ff283ca9801298c347f0a9d505df6b33d1df8abdcdaa2169fb316ab867.scope\": RecentStats: unable to find data in memory cache]" Nov 22 03:17:33 crc kubenswrapper[4922]: I1122 03:17:33.601663 4922 generic.go:334] "Generic (PLEG): container finished" podID="ddf9ce5e-ba02-46af-94be-a4793c072412" containerID="991f71ff283ca9801298c347f0a9d505df6b33d1df8abdcdaa2169fb316ab867" exitCode=0 Nov 22 03:17:33 crc kubenswrapper[4922]: I1122 03:17:33.601737 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jcc87" event={"ID":"ddf9ce5e-ba02-46af-94be-a4793c072412","Type":"ContainerDied","Data":"991f71ff283ca9801298c347f0a9d505df6b33d1df8abdcdaa2169fb316ab867"} Nov 22 03:17:33 crc kubenswrapper[4922]: I1122 03:17:33.601781 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jcc87" event={"ID":"ddf9ce5e-ba02-46af-94be-a4793c072412","Type":"ContainerStarted","Data":"aabe01ad97a1b2575b1af70b64b5e28cb25060dd907f071443b3f7fe6b9f5853"} Nov 22 03:17:33 crc kubenswrapper[4922]: I1122 03:17:33.607438 4922 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 03:17:35 crc kubenswrapper[4922]: I1122 03:17:35.621701 4922 generic.go:334] "Generic (PLEG): container finished" podID="ddf9ce5e-ba02-46af-94be-a4793c072412" containerID="653dff6e542444a46566bcb511a0520d02969763376b058a757f2fed64f2e713" exitCode=0 Nov 22 03:17:35 crc kubenswrapper[4922]: I1122 03:17:35.621743 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jcc87" event={"ID":"ddf9ce5e-ba02-46af-94be-a4793c072412","Type":"ContainerDied","Data":"653dff6e542444a46566bcb511a0520d02969763376b058a757f2fed64f2e713"} Nov 22 03:17:36 crc kubenswrapper[4922]: 
I1122 03:17:36.634925 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jcc87" event={"ID":"ddf9ce5e-ba02-46af-94be-a4793c072412","Type":"ContainerStarted","Data":"4a9ca5d6d6e780a41503b24c2cc8262b22b3e812d4d72d8ea948ddec2bcf6f19"} Nov 22 03:17:36 crc kubenswrapper[4922]: I1122 03:17:36.653060 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jcc87" podStartSLOduration=2.04633157 podStartE2EDuration="4.653041323s" podCreationTimestamp="2025-11-22 03:17:32 +0000 UTC" firstStartedPulling="2025-11-22 03:17:33.607012426 +0000 UTC m=+1489.645534358" lastFinishedPulling="2025-11-22 03:17:36.213722219 +0000 UTC m=+1492.252244111" observedRunningTime="2025-11-22 03:17:36.652416098 +0000 UTC m=+1492.690938000" watchObservedRunningTime="2025-11-22 03:17:36.653041323 +0000 UTC m=+1492.691563215" Nov 22 03:17:41 crc kubenswrapper[4922]: I1122 03:17:41.109934 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 03:17:41 crc kubenswrapper[4922]: I1122 03:17:41.111162 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 03:17:41 crc kubenswrapper[4922]: I1122 03:17:41.111490 4922 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" Nov 22 03:17:41 crc kubenswrapper[4922]: I1122 03:17:41.112640 4922 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"80e63ed196d529fe9e5c2ffb57fde09ab72342822511a120b7e3fb0cefdec1eb"} pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 03:17:41 crc kubenswrapper[4922]: I1122 03:17:41.112949 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" containerID="cri-o://80e63ed196d529fe9e5c2ffb57fde09ab72342822511a120b7e3fb0cefdec1eb" gracePeriod=600 Nov 22 03:17:41 crc kubenswrapper[4922]: E1122 03:17:41.325109 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:17:41 crc kubenswrapper[4922]: I1122 03:17:41.685968 4922 generic.go:334] "Generic (PLEG): container finished" podID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerID="80e63ed196d529fe9e5c2ffb57fde09ab72342822511a120b7e3fb0cefdec1eb" exitCode=0 Nov 22 03:17:41 crc kubenswrapper[4922]: I1122 03:17:41.686022 4922 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" event={"ID":"402683b1-a29f-4a79-a36c-daf6e8068d0d","Type":"ContainerDied","Data":"80e63ed196d529fe9e5c2ffb57fde09ab72342822511a120b7e3fb0cefdec1eb"} Nov 22 03:17:41 crc kubenswrapper[4922]: I1122 03:17:41.686057 4922 scope.go:117] "RemoveContainer" containerID="40d76ae6e06f9784b7400ca69b8cc35e187346fc7004ab985de1150eb16fab3d" Nov 22 03:17:41 crc kubenswrapper[4922]: I1122 03:17:41.686833 4922 scope.go:117] "RemoveContainer" containerID="80e63ed196d529fe9e5c2ffb57fde09ab72342822511a120b7e3fb0cefdec1eb" Nov 22 03:17:41 crc kubenswrapper[4922]: E1122 03:17:41.687323 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:17:42 crc kubenswrapper[4922]: I1122 03:17:42.548708 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jcc87" Nov 22 03:17:42 crc kubenswrapper[4922]: I1122 03:17:42.548756 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jcc87" Nov 22 03:17:42 crc kubenswrapper[4922]: I1122 03:17:42.606647 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jcc87" Nov 22 03:17:42 crc kubenswrapper[4922]: I1122 03:17:42.758742 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jcc87" Nov 22 03:17:42 crc kubenswrapper[4922]: I1122 03:17:42.845531 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jcc87"] Nov 22 03:17:44 crc kubenswrapper[4922]: I1122 03:17:44.712209 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jcc87" podUID="ddf9ce5e-ba02-46af-94be-a4793c072412" containerName="registry-server" containerID="cri-o://4a9ca5d6d6e780a41503b24c2cc8262b22b3e812d4d72d8ea948ddec2bcf6f19" gracePeriod=2 Nov 22 03:17:46 crc kubenswrapper[4922]: I1122 03:17:45.290909 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jcc87" Nov 22 03:17:46 crc kubenswrapper[4922]: I1122 03:17:45.466715 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ddf9ce5e-ba02-46af-94be-a4793c072412-utilities\") pod \"ddf9ce5e-ba02-46af-94be-a4793c072412\" (UID: \"ddf9ce5e-ba02-46af-94be-a4793c072412\") " Nov 22 03:17:46 crc kubenswrapper[4922]: I1122 03:17:45.467835 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddf9ce5e-ba02-46af-94be-a4793c072412-utilities" (OuterVolumeSpecName: "utilities") pod "ddf9ce5e-ba02-46af-94be-a4793c072412" (UID: "ddf9ce5e-ba02-46af-94be-a4793c072412"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:17:46 crc kubenswrapper[4922]: I1122 03:17:45.468078 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddf9ce5e-ba02-46af-94be-a4793c072412-catalog-content\") pod \"ddf9ce5e-ba02-46af-94be-a4793c072412\" (UID: \"ddf9ce5e-ba02-46af-94be-a4793c072412\") " Nov 22 03:17:46 crc kubenswrapper[4922]: I1122 03:17:45.473452 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jl9v\" (UniqueName: \"kubernetes.io/projected/ddf9ce5e-ba02-46af-94be-a4793c072412-kube-api-access-5jl9v\") pod \"ddf9ce5e-ba02-46af-94be-a4793c072412\" (UID: \"ddf9ce5e-ba02-46af-94be-a4793c072412\") " Nov 22 03:17:46 crc kubenswrapper[4922]: I1122 03:17:45.474970 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ddf9ce5e-ba02-46af-94be-a4793c072412-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 03:17:46 crc kubenswrapper[4922]: I1122 03:17:45.478473 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddf9ce5e-ba02-46af-94be-a4793c072412-kube-api-access-5jl9v" (OuterVolumeSpecName: "kube-api-access-5jl9v") pod "ddf9ce5e-ba02-46af-94be-a4793c072412" (UID: "ddf9ce5e-ba02-46af-94be-a4793c072412"). InnerVolumeSpecName "kube-api-access-5jl9v". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:17:46 crc kubenswrapper[4922]: I1122 03:17:45.522490 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddf9ce5e-ba02-46af-94be-a4793c072412-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ddf9ce5e-ba02-46af-94be-a4793c072412" (UID: "ddf9ce5e-ba02-46af-94be-a4793c072412"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:17:46 crc kubenswrapper[4922]: I1122 03:17:45.576626 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddf9ce5e-ba02-46af-94be-a4793c072412-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 03:17:46 crc kubenswrapper[4922]: I1122 03:17:45.576655 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jl9v\" (UniqueName: \"kubernetes.io/projected/ddf9ce5e-ba02-46af-94be-a4793c072412-kube-api-access-5jl9v\") on node \"crc\" DevicePath \"\"" Nov 22 03:17:46 crc kubenswrapper[4922]: I1122 03:17:45.723076 4922 generic.go:334] "Generic (PLEG): container finished" podID="ddf9ce5e-ba02-46af-94be-a4793c072412" containerID="4a9ca5d6d6e780a41503b24c2cc8262b22b3e812d4d72d8ea948ddec2bcf6f19" exitCode=0 Nov 22 03:17:46 crc kubenswrapper[4922]: I1122 03:17:45.723121 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jcc87" event={"ID":"ddf9ce5e-ba02-46af-94be-a4793c072412","Type":"ContainerDied","Data":"4a9ca5d6d6e780a41503b24c2cc8262b22b3e812d4d72d8ea948ddec2bcf6f19"} Nov 22 03:17:46 crc kubenswrapper[4922]: I1122 03:17:45.723133 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jcc87" Nov 22 03:17:46 crc kubenswrapper[4922]: I1122 03:17:45.723150 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jcc87" event={"ID":"ddf9ce5e-ba02-46af-94be-a4793c072412","Type":"ContainerDied","Data":"aabe01ad97a1b2575b1af70b64b5e28cb25060dd907f071443b3f7fe6b9f5853"} Nov 22 03:17:46 crc kubenswrapper[4922]: I1122 03:17:45.723169 4922 scope.go:117] "RemoveContainer" containerID="4a9ca5d6d6e780a41503b24c2cc8262b22b3e812d4d72d8ea948ddec2bcf6f19" Nov 22 03:17:46 crc kubenswrapper[4922]: I1122 03:17:45.747054 4922 scope.go:117] "RemoveContainer" containerID="653dff6e542444a46566bcb511a0520d02969763376b058a757f2fed64f2e713" Nov 22 03:17:46 crc kubenswrapper[4922]: I1122 03:17:45.775373 4922 scope.go:117] "RemoveContainer" containerID="991f71ff283ca9801298c347f0a9d505df6b33d1df8abdcdaa2169fb316ab867" Nov 22 03:17:46 crc kubenswrapper[4922]: I1122 03:17:45.778078 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jcc87"] Nov 22 03:17:46 crc kubenswrapper[4922]: I1122 03:17:45.810380 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jcc87"] Nov 22 03:17:46 crc kubenswrapper[4922]: I1122 03:17:45.842426 4922 scope.go:117] "RemoveContainer" containerID="4a9ca5d6d6e780a41503b24c2cc8262b22b3e812d4d72d8ea948ddec2bcf6f19" Nov 22 03:17:46 crc kubenswrapper[4922]: E1122 03:17:45.843241 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a9ca5d6d6e780a41503b24c2cc8262b22b3e812d4d72d8ea948ddec2bcf6f19\": container with ID starting with 4a9ca5d6d6e780a41503b24c2cc8262b22b3e812d4d72d8ea948ddec2bcf6f19 not found: ID does not exist" containerID="4a9ca5d6d6e780a41503b24c2cc8262b22b3e812d4d72d8ea948ddec2bcf6f19" Nov 22 03:17:46 crc kubenswrapper[4922]: I1122 03:17:45.843276 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a9ca5d6d6e780a41503b24c2cc8262b22b3e812d4d72d8ea948ddec2bcf6f19"} err="failed to get container status \"4a9ca5d6d6e780a41503b24c2cc8262b22b3e812d4d72d8ea948ddec2bcf6f19\": rpc error: code = NotFound desc = could not find container \"4a9ca5d6d6e780a41503b24c2cc8262b22b3e812d4d72d8ea948ddec2bcf6f19\": container with ID starting with 4a9ca5d6d6e780a41503b24c2cc8262b22b3e812d4d72d8ea948ddec2bcf6f19 not found: ID does not exist" Nov 22 03:17:46 crc kubenswrapper[4922]: I1122 03:17:45.843297 4922 scope.go:117] "RemoveContainer" containerID="653dff6e542444a46566bcb511a0520d02969763376b058a757f2fed64f2e713" Nov 22 03:17:46 crc kubenswrapper[4922]: E1122 03:17:45.843782 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"653dff6e542444a46566bcb511a0520d02969763376b058a757f2fed64f2e713\": container with ID starting with 653dff6e542444a46566bcb511a0520d02969763376b058a757f2fed64f2e713 not found: ID does not exist" containerID="653dff6e542444a46566bcb511a0520d02969763376b058a757f2fed64f2e713" Nov 22 03:17:46 crc kubenswrapper[4922]: I1122 03:17:45.843807 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"653dff6e542444a46566bcb511a0520d02969763376b058a757f2fed64f2e713"} err="failed to get container status \"653dff6e542444a46566bcb511a0520d02969763376b058a757f2fed64f2e713\": rpc error: code = NotFound desc = could not find 
container \"653dff6e542444a46566bcb511a0520d02969763376b058a757f2fed64f2e713\": container with ID starting with 653dff6e542444a46566bcb511a0520d02969763376b058a757f2fed64f2e713 not found: ID does not exist" Nov 22 03:17:46 crc kubenswrapper[4922]: I1122 03:17:45.843827 4922 scope.go:117] "RemoveContainer" containerID="991f71ff283ca9801298c347f0a9d505df6b33d1df8abdcdaa2169fb316ab867" Nov 22 03:17:46 crc kubenswrapper[4922]: E1122 03:17:45.844190 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"991f71ff283ca9801298c347f0a9d505df6b33d1df8abdcdaa2169fb316ab867\": container with ID starting with 991f71ff283ca9801298c347f0a9d505df6b33d1df8abdcdaa2169fb316ab867 not found: ID does not exist" containerID="991f71ff283ca9801298c347f0a9d505df6b33d1df8abdcdaa2169fb316ab867" Nov 22 03:17:46 crc kubenswrapper[4922]: I1122 03:17:45.844224 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"991f71ff283ca9801298c347f0a9d505df6b33d1df8abdcdaa2169fb316ab867"} err="failed to get container status \"991f71ff283ca9801298c347f0a9d505df6b33d1df8abdcdaa2169fb316ab867\": rpc error: code = NotFound desc = could not find container \"991f71ff283ca9801298c347f0a9d505df6b33d1df8abdcdaa2169fb316ab867\": container with ID starting with 991f71ff283ca9801298c347f0a9d505df6b33d1df8abdcdaa2169fb316ab867 not found: ID does not exist" Nov 22 03:17:47 crc kubenswrapper[4922]: I1122 03:17:47.314444 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddf9ce5e-ba02-46af-94be-a4793c072412" path="/var/lib/kubelet/pods/ddf9ce5e-ba02-46af-94be-a4793c072412/volumes" Nov 22 03:17:55 crc kubenswrapper[4922]: I1122 03:17:55.307228 4922 scope.go:117] "RemoveContainer" containerID="80e63ed196d529fe9e5c2ffb57fde09ab72342822511a120b7e3fb0cefdec1eb" Nov 22 03:17:55 crc kubenswrapper[4922]: E1122 03:17:55.308252 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:17:55 crc kubenswrapper[4922]: I1122 03:17:55.642789 4922 scope.go:117] "RemoveContainer" containerID="5e3fe0173c097deab6955e184404673175c9385dacd5472b97e131f0d2081f0b" Nov 22 03:17:55 crc kubenswrapper[4922]: I1122 03:17:55.670480 4922 scope.go:117] "RemoveContainer" containerID="2407c19e18d8987997a9fb965adc8f6148e8e8b87da71df2007cc8bea77bdef9" Nov 22 03:17:55 crc kubenswrapper[4922]: I1122 03:17:55.688194 4922 scope.go:117] "RemoveContainer" containerID="8a11c92c72f011f0b87b1f41c4fc0313bbdf529cd1fc438ed6cea3c7c37f5fce" Nov 22 03:18:08 crc kubenswrapper[4922]: I1122 03:18:08.301003 4922 scope.go:117] "RemoveContainer" containerID="80e63ed196d529fe9e5c2ffb57fde09ab72342822511a120b7e3fb0cefdec1eb" Nov 22 03:18:08 crc kubenswrapper[4922]: E1122 03:18:08.303415 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" 
podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:18:23 crc kubenswrapper[4922]: I1122 03:18:23.302717 4922 scope.go:117] "RemoveContainer" containerID="80e63ed196d529fe9e5c2ffb57fde09ab72342822511a120b7e3fb0cefdec1eb" Nov 22 03:18:23 crc kubenswrapper[4922]: E1122 03:18:23.303656 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:18:27 crc kubenswrapper[4922]: I1122 03:18:27.572617 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nffrb"] Nov 22 03:18:27 crc kubenswrapper[4922]: E1122 03:18:27.573953 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddf9ce5e-ba02-46af-94be-a4793c072412" containerName="registry-server" Nov 22 03:18:27 crc kubenswrapper[4922]: I1122 03:18:27.573981 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddf9ce5e-ba02-46af-94be-a4793c072412" containerName="registry-server" Nov 22 03:18:27 crc kubenswrapper[4922]: E1122 03:18:27.574011 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddf9ce5e-ba02-46af-94be-a4793c072412" containerName="extract-content" Nov 22 03:18:27 crc kubenswrapper[4922]: I1122 03:18:27.574023 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddf9ce5e-ba02-46af-94be-a4793c072412" containerName="extract-content" Nov 22 03:18:27 crc kubenswrapper[4922]: E1122 03:18:27.574068 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddf9ce5e-ba02-46af-94be-a4793c072412" containerName="extract-utilities" Nov 22 03:18:27 crc kubenswrapper[4922]: I1122 03:18:27.574081 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddf9ce5e-ba02-46af-94be-a4793c072412" containerName="extract-utilities" Nov 22 03:18:27 crc kubenswrapper[4922]: I1122 03:18:27.574448 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddf9ce5e-ba02-46af-94be-a4793c072412" containerName="registry-server" Nov 22 03:18:27 crc kubenswrapper[4922]: I1122 03:18:27.576807 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nffrb" Nov 22 03:18:27 crc kubenswrapper[4922]: I1122 03:18:27.593109 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nffrb"] Nov 22 03:18:27 crc kubenswrapper[4922]: I1122 03:18:27.629435 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a42e71c-6ddd-4653-8f7f-d87a02996c5b-catalog-content\") pod \"redhat-operators-nffrb\" (UID: \"3a42e71c-6ddd-4653-8f7f-d87a02996c5b\") " pod="openshift-marketplace/redhat-operators-nffrb" Nov 22 03:18:27 crc kubenswrapper[4922]: I1122 03:18:27.629500 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a42e71c-6ddd-4653-8f7f-d87a02996c5b-utilities\") pod \"redhat-operators-nffrb\" (UID: \"3a42e71c-6ddd-4653-8f7f-d87a02996c5b\") " pod="openshift-marketplace/redhat-operators-nffrb" Nov 22 03:18:27 crc kubenswrapper[4922]: I1122 03:18:27.629671 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kmgc\" (UniqueName: \"kubernetes.io/projected/3a42e71c-6ddd-4653-8f7f-d87a02996c5b-kube-api-access-2kmgc\") pod \"redhat-operators-nffrb\" (UID: \"3a42e71c-6ddd-4653-8f7f-d87a02996c5b\") " pod="openshift-marketplace/redhat-operators-nffrb" Nov 22 03:18:27 crc kubenswrapper[4922]: I1122 03:18:27.731441 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a42e71c-6ddd-4653-8f7f-d87a02996c5b-catalog-content\") pod \"redhat-operators-nffrb\" (UID: \"3a42e71c-6ddd-4653-8f7f-d87a02996c5b\") " pod="openshift-marketplace/redhat-operators-nffrb" Nov 22 03:18:27 crc kubenswrapper[4922]: I1122 03:18:27.731673 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a42e71c-6ddd-4653-8f7f-d87a02996c5b-utilities\") pod \"redhat-operators-nffrb\" (UID: \"3a42e71c-6ddd-4653-8f7f-d87a02996c5b\") " pod="openshift-marketplace/redhat-operators-nffrb" Nov 22 03:18:27 crc kubenswrapper[4922]: I1122 03:18:27.731808 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kmgc\" (UniqueName: \"kubernetes.io/projected/3a42e71c-6ddd-4653-8f7f-d87a02996c5b-kube-api-access-2kmgc\") pod \"redhat-operators-nffrb\" (UID: \"3a42e71c-6ddd-4653-8f7f-d87a02996c5b\") " pod="openshift-marketplace/redhat-operators-nffrb" Nov 22 03:18:27 crc kubenswrapper[4922]: I1122 03:18:27.732279 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a42e71c-6ddd-4653-8f7f-d87a02996c5b-utilities\") pod \"redhat-operators-nffrb\" (UID: \"3a42e71c-6ddd-4653-8f7f-d87a02996c5b\") " pod="openshift-marketplace/redhat-operators-nffrb" Nov 22 03:18:27 crc kubenswrapper[4922]: I1122 03:18:27.732338 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a42e71c-6ddd-4653-8f7f-d87a02996c5b-catalog-content\") pod \"redhat-operators-nffrb\" (UID: \"3a42e71c-6ddd-4653-8f7f-d87a02996c5b\") " pod="openshift-marketplace/redhat-operators-nffrb" Nov 22 03:18:27 crc kubenswrapper[4922]: I1122 03:18:27.760400 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-2kmgc\" (UniqueName: \"kubernetes.io/projected/3a42e71c-6ddd-4653-8f7f-d87a02996c5b-kube-api-access-2kmgc\") pod \"redhat-operators-nffrb\" (UID: \"3a42e71c-6ddd-4653-8f7f-d87a02996c5b\") " pod="openshift-marketplace/redhat-operators-nffrb" Nov 22 03:18:27 crc kubenswrapper[4922]: I1122 03:18:27.949630 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nffrb" Nov 22 03:18:28 crc kubenswrapper[4922]: I1122 03:18:28.432886 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nffrb"] Nov 22 03:18:29 crc kubenswrapper[4922]: I1122 03:18:29.220943 4922 generic.go:334] "Generic (PLEG): container finished" podID="3a42e71c-6ddd-4653-8f7f-d87a02996c5b" containerID="22bd19ec6f7ad897a91f32989ce762d8e2a030f2435e667b6f880594b643f242" exitCode=0 Nov 22 03:18:29 crc kubenswrapper[4922]: I1122 03:18:29.221027 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nffrb" event={"ID":"3a42e71c-6ddd-4653-8f7f-d87a02996c5b","Type":"ContainerDied","Data":"22bd19ec6f7ad897a91f32989ce762d8e2a030f2435e667b6f880594b643f242"} Nov 22 03:18:29 crc kubenswrapper[4922]: I1122 03:18:29.221398 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nffrb" event={"ID":"3a42e71c-6ddd-4653-8f7f-d87a02996c5b","Type":"ContainerStarted","Data":"6a499e2c9a38b8d11f3fc3023636a2c0583a54caceb266347dcb1fa8b19c1e02"} Nov 22 03:18:31 crc kubenswrapper[4922]: I1122 03:18:31.253593 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nffrb" event={"ID":"3a42e71c-6ddd-4653-8f7f-d87a02996c5b","Type":"ContainerStarted","Data":"28a4e4540f5ad82b21b6f659879cd7083172dff0685fe3144792889c446284cf"} Nov 22 03:18:32 crc kubenswrapper[4922]: I1122 03:18:32.272108 4922 generic.go:334] "Generic (PLEG): container finished" podID="3a42e71c-6ddd-4653-8f7f-d87a02996c5b" containerID="28a4e4540f5ad82b21b6f659879cd7083172dff0685fe3144792889c446284cf" exitCode=0 Nov 22 03:18:32 crc kubenswrapper[4922]: I1122 03:18:32.272332 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nffrb" event={"ID":"3a42e71c-6ddd-4653-8f7f-d87a02996c5b","Type":"ContainerDied","Data":"28a4e4540f5ad82b21b6f659879cd7083172dff0685fe3144792889c446284cf"} Nov 22 03:18:34 crc kubenswrapper[4922]: I1122 03:18:34.297064 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nffrb" event={"ID":"3a42e71c-6ddd-4653-8f7f-d87a02996c5b","Type":"ContainerStarted","Data":"be3c3b01dc6c2c3fa0dda735fce7ae235a41e9decfa19e6b7f2b579d41ceac2e"} Nov 22 03:18:34 crc kubenswrapper[4922]: I1122 03:18:34.317906 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nffrb" podStartSLOduration=2.59558823 podStartE2EDuration="7.317879052s" podCreationTimestamp="2025-11-22 03:18:27 +0000 UTC" firstStartedPulling="2025-11-22 03:18:29.224387378 +0000 UTC m=+1545.262909310" lastFinishedPulling="2025-11-22 03:18:33.9466782 +0000 UTC m=+1549.985200132" observedRunningTime="2025-11-22 03:18:34.314918781 +0000 UTC m=+1550.353440693" watchObservedRunningTime="2025-11-22 03:18:34.317879052 +0000 UTC m=+1550.356400964" Nov 22 03:18:35 crc kubenswrapper[4922]: I1122 03:18:35.307666 4922 scope.go:117] "RemoveContainer" containerID="80e63ed196d529fe9e5c2ffb57fde09ab72342822511a120b7e3fb0cefdec1eb" Nov 22 03:18:35 
crc kubenswrapper[4922]: E1122 03:18:35.308204 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:18:37 crc kubenswrapper[4922]: I1122 03:18:37.949908 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nffrb" Nov 22 03:18:37 crc kubenswrapper[4922]: I1122 03:18:37.950398 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nffrb" Nov 22 03:18:39 crc kubenswrapper[4922]: I1122 03:18:39.014838 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nffrb" podUID="3a42e71c-6ddd-4653-8f7f-d87a02996c5b" containerName="registry-server" probeResult="failure" output=< Nov 22 03:18:39 crc kubenswrapper[4922]: timeout: failed to connect service ":50051" within 1s Nov 22 03:18:39 crc kubenswrapper[4922]: > Nov 22 03:18:42 crc kubenswrapper[4922]: I1122 03:18:42.399748 4922 generic.go:334] "Generic (PLEG): container finished" podID="203fb5ca-ff37-4321-a532-1fe2103cc82d" containerID="6a8371ea7583bc06b848889b966ef4a937fce6d6d52a83154c6cb9d1e5594b38" exitCode=0 Nov 22 03:18:42 crc kubenswrapper[4922]: I1122 03:18:42.399836 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lzs9m" event={"ID":"203fb5ca-ff37-4321-a532-1fe2103cc82d","Type":"ContainerDied","Data":"6a8371ea7583bc06b848889b966ef4a937fce6d6d52a83154c6cb9d1e5594b38"} Nov 22 03:18:43 crc kubenswrapper[4922]: I1122 03:18:43.908674 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lzs9m" Nov 22 03:18:44 crc kubenswrapper[4922]: I1122 03:18:44.078534 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/203fb5ca-ff37-4321-a532-1fe2103cc82d-inventory\") pod \"203fb5ca-ff37-4321-a532-1fe2103cc82d\" (UID: \"203fb5ca-ff37-4321-a532-1fe2103cc82d\") " Nov 22 03:18:44 crc kubenswrapper[4922]: I1122 03:18:44.078643 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/203fb5ca-ff37-4321-a532-1fe2103cc82d-ssh-key\") pod \"203fb5ca-ff37-4321-a532-1fe2103cc82d\" (UID: \"203fb5ca-ff37-4321-a532-1fe2103cc82d\") " Nov 22 03:18:44 crc kubenswrapper[4922]: I1122 03:18:44.078681 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5q7t\" (UniqueName: \"kubernetes.io/projected/203fb5ca-ff37-4321-a532-1fe2103cc82d-kube-api-access-w5q7t\") pod \"203fb5ca-ff37-4321-a532-1fe2103cc82d\" (UID: \"203fb5ca-ff37-4321-a532-1fe2103cc82d\") " Nov 22 03:18:44 crc kubenswrapper[4922]: I1122 03:18:44.086906 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/203fb5ca-ff37-4321-a532-1fe2103cc82d-kube-api-access-w5q7t" (OuterVolumeSpecName: "kube-api-access-w5q7t") pod "203fb5ca-ff37-4321-a532-1fe2103cc82d" (UID: "203fb5ca-ff37-4321-a532-1fe2103cc82d"). InnerVolumeSpecName "kube-api-access-w5q7t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:18:44 crc kubenswrapper[4922]: I1122 03:18:44.106455 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/203fb5ca-ff37-4321-a532-1fe2103cc82d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "203fb5ca-ff37-4321-a532-1fe2103cc82d" (UID: "203fb5ca-ff37-4321-a532-1fe2103cc82d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:18:44 crc kubenswrapper[4922]: I1122 03:18:44.115807 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/203fb5ca-ff37-4321-a532-1fe2103cc82d-inventory" (OuterVolumeSpecName: "inventory") pod "203fb5ca-ff37-4321-a532-1fe2103cc82d" (UID: "203fb5ca-ff37-4321-a532-1fe2103cc82d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:18:44 crc kubenswrapper[4922]: I1122 03:18:44.181251 4922 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/203fb5ca-ff37-4321-a532-1fe2103cc82d-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 03:18:44 crc kubenswrapper[4922]: I1122 03:18:44.181298 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/203fb5ca-ff37-4321-a532-1fe2103cc82d-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 03:18:44 crc kubenswrapper[4922]: I1122 03:18:44.181317 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5q7t\" (UniqueName: \"kubernetes.io/projected/203fb5ca-ff37-4321-a532-1fe2103cc82d-kube-api-access-w5q7t\") on node \"crc\" DevicePath \"\"" Nov 22 03:18:44 crc kubenswrapper[4922]: I1122 03:18:44.427417 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lzs9m" event={"ID":"203fb5ca-ff37-4321-a532-1fe2103cc82d","Type":"ContainerDied","Data":"72ec1cd1eaa368c7c057896b03c7440f5792a4bb864fea4e8a928ddc6b8b46f6"} Nov 22 03:18:44 crc kubenswrapper[4922]: I1122 03:18:44.427485 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72ec1cd1eaa368c7c057896b03c7440f5792a4bb864fea4e8a928ddc6b8b46f6" Nov 22 03:18:44 crc kubenswrapper[4922]: I1122 03:18:44.427554 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lzs9m" Nov 22 03:18:44 crc kubenswrapper[4922]: I1122 03:18:44.545103 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pv6f7"] Nov 22 03:18:44 crc kubenswrapper[4922]: E1122 03:18:44.545756 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="203fb5ca-ff37-4321-a532-1fe2103cc82d" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Nov 22 03:18:44 crc kubenswrapper[4922]: I1122 03:18:44.545787 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="203fb5ca-ff37-4321-a532-1fe2103cc82d" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Nov 22 03:18:44 crc kubenswrapper[4922]: I1122 03:18:44.546160 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="203fb5ca-ff37-4321-a532-1fe2103cc82d" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Nov 22 03:18:44 crc kubenswrapper[4922]: I1122 03:18:44.548349 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pv6f7" Nov 22 03:18:44 crc kubenswrapper[4922]: I1122 03:18:44.551754 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wzhgr" Nov 22 03:18:44 crc kubenswrapper[4922]: I1122 03:18:44.552319 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 22 03:18:44 crc kubenswrapper[4922]: I1122 03:18:44.552584 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 03:18:44 crc kubenswrapper[4922]: I1122 03:18:44.554308 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 22 03:18:44 crc kubenswrapper[4922]: I1122 03:18:44.555824 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pv6f7"] Nov 22 03:18:44 crc kubenswrapper[4922]: I1122 03:18:44.701623 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/01c85480-509f-45fa-b81f-2e29ba749afb-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pv6f7\" (UID: \"01c85480-509f-45fa-b81f-2e29ba749afb\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pv6f7" Nov 22 03:18:44 crc kubenswrapper[4922]: I1122 03:18:44.702277 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/01c85480-509f-45fa-b81f-2e29ba749afb-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pv6f7\" (UID: \"01c85480-509f-45fa-b81f-2e29ba749afb\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pv6f7" Nov 22 03:18:44 crc kubenswrapper[4922]: I1122 03:18:44.703131 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4567b\" (UniqueName: \"kubernetes.io/projected/01c85480-509f-45fa-b81f-2e29ba749afb-kube-api-access-4567b\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pv6f7\" (UID: \"01c85480-509f-45fa-b81f-2e29ba749afb\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pv6f7" Nov 22 03:18:44 crc kubenswrapper[4922]: I1122 03:18:44.805158 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4567b\" (UniqueName: \"kubernetes.io/projected/01c85480-509f-45fa-b81f-2e29ba749afb-kube-api-access-4567b\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pv6f7\" (UID: \"01c85480-509f-45fa-b81f-2e29ba749afb\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pv6f7" Nov 22 03:18:44 crc kubenswrapper[4922]: I1122 03:18:44.805328 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/01c85480-509f-45fa-b81f-2e29ba749afb-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pv6f7\" (UID: \"01c85480-509f-45fa-b81f-2e29ba749afb\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pv6f7" Nov 22 03:18:44 crc kubenswrapper[4922]: I1122 03:18:44.805613 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/01c85480-509f-45fa-b81f-2e29ba749afb-inventory\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-pv6f7\" (UID: \"01c85480-509f-45fa-b81f-2e29ba749afb\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pv6f7" Nov 22 03:18:44 crc kubenswrapper[4922]: I1122 03:18:44.810480 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/01c85480-509f-45fa-b81f-2e29ba749afb-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pv6f7\" (UID: \"01c85480-509f-45fa-b81f-2e29ba749afb\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pv6f7" Nov 22 03:18:44 crc kubenswrapper[4922]: I1122 03:18:44.812790 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/01c85480-509f-45fa-b81f-2e29ba749afb-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pv6f7\" (UID: \"01c85480-509f-45fa-b81f-2e29ba749afb\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pv6f7" Nov 22 03:18:44 crc kubenswrapper[4922]: I1122 03:18:44.828873 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4567b\" (UniqueName: \"kubernetes.io/projected/01c85480-509f-45fa-b81f-2e29ba749afb-kube-api-access-4567b\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pv6f7\" (UID: \"01c85480-509f-45fa-b81f-2e29ba749afb\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pv6f7" Nov 22 03:18:44 crc kubenswrapper[4922]: I1122 03:18:44.884463 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pv6f7" Nov 22 03:18:45 crc kubenswrapper[4922]: I1122 03:18:45.287012 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pv6f7"] Nov 22 03:18:45 crc kubenswrapper[4922]: W1122 03:18:45.289063 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01c85480_509f_45fa_b81f_2e29ba749afb.slice/crio-ca08a55e4e4c36c365a3fdfd52d3e70b24416f392c6c8b8738a36372d8734912 WatchSource:0}: Error finding container ca08a55e4e4c36c365a3fdfd52d3e70b24416f392c6c8b8738a36372d8734912: Status 404 returned error can't find the container with id ca08a55e4e4c36c365a3fdfd52d3e70b24416f392c6c8b8738a36372d8734912 Nov 22 03:18:45 crc kubenswrapper[4922]: I1122 03:18:45.436502 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pv6f7" event={"ID":"01c85480-509f-45fa-b81f-2e29ba749afb","Type":"ContainerStarted","Data":"ca08a55e4e4c36c365a3fdfd52d3e70b24416f392c6c8b8738a36372d8734912"} Nov 22 03:18:45 crc kubenswrapper[4922]: I1122 03:18:45.788816 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 03:18:46 crc kubenswrapper[4922]: I1122 03:18:46.300118 4922 scope.go:117] "RemoveContainer" containerID="80e63ed196d529fe9e5c2ffb57fde09ab72342822511a120b7e3fb0cefdec1eb" Nov 22 03:18:46 crc kubenswrapper[4922]: E1122 03:18:46.300571 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:18:46 crc kubenswrapper[4922]: I1122 03:18:46.448474 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pv6f7" event={"ID":"01c85480-509f-45fa-b81f-2e29ba749afb","Type":"ContainerStarted","Data":"e3523a0b0c3b2ae38bbd18cd6ffd4b235fb64fadcc0d23ad6c95b92f02cd7a00"} Nov 22 03:18:46 crc kubenswrapper[4922]: I1122 03:18:46.483573 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pv6f7" podStartSLOduration=1.9886039229999999 podStartE2EDuration="2.483556449s" podCreationTimestamp="2025-11-22 03:18:44 +0000 UTC" firstStartedPulling="2025-11-22 03:18:45.290610972 +0000 UTC m=+1561.329132864" lastFinishedPulling="2025-11-22 03:18:45.785563458 +0000 UTC m=+1561.824085390" observedRunningTime="2025-11-22 03:18:46.467531665 +0000 UTC m=+1562.506053597" watchObservedRunningTime="2025-11-22 03:18:46.483556449 +0000 UTC m=+1562.522078351" Nov 22 03:18:48 crc kubenswrapper[4922]: I1122 03:18:48.011892 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nffrb" Nov 22 03:18:48 crc kubenswrapper[4922]: I1122 03:18:48.067041 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nffrb" Nov 22 03:18:48 crc kubenswrapper[4922]: I1122 03:18:48.258671 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nffrb"] Nov 22 03:18:49 crc kubenswrapper[4922]: I1122 03:18:49.484805 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nffrb" podUID="3a42e71c-6ddd-4653-8f7f-d87a02996c5b" containerName="registry-server" containerID="cri-o://be3c3b01dc6c2c3fa0dda735fce7ae235a41e9decfa19e6b7f2b579d41ceac2e" gracePeriod=2 Nov 22 03:18:49 crc kubenswrapper[4922]: I1122 03:18:49.944668 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nffrb" Nov 22 03:18:50 crc kubenswrapper[4922]: I1122 03:18:50.012830 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a42e71c-6ddd-4653-8f7f-d87a02996c5b-utilities\") pod \"3a42e71c-6ddd-4653-8f7f-d87a02996c5b\" (UID: \"3a42e71c-6ddd-4653-8f7f-d87a02996c5b\") " Nov 22 03:18:50 crc kubenswrapper[4922]: I1122 03:18:50.012928 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kmgc\" (UniqueName: \"kubernetes.io/projected/3a42e71c-6ddd-4653-8f7f-d87a02996c5b-kube-api-access-2kmgc\") pod \"3a42e71c-6ddd-4653-8f7f-d87a02996c5b\" (UID: \"3a42e71c-6ddd-4653-8f7f-d87a02996c5b\") " Nov 22 03:18:50 crc kubenswrapper[4922]: I1122 03:18:50.012999 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a42e71c-6ddd-4653-8f7f-d87a02996c5b-catalog-content\") pod \"3a42e71c-6ddd-4653-8f7f-d87a02996c5b\" (UID: \"3a42e71c-6ddd-4653-8f7f-d87a02996c5b\") " Nov 22 03:18:50 crc kubenswrapper[4922]: I1122 03:18:50.014057 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a42e71c-6ddd-4653-8f7f-d87a02996c5b-utilities" (OuterVolumeSpecName: "utilities") pod "3a42e71c-6ddd-4653-8f7f-d87a02996c5b" (UID: "3a42e71c-6ddd-4653-8f7f-d87a02996c5b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:18:50 crc kubenswrapper[4922]: I1122 03:18:50.017905 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a42e71c-6ddd-4653-8f7f-d87a02996c5b-kube-api-access-2kmgc" (OuterVolumeSpecName: "kube-api-access-2kmgc") pod "3a42e71c-6ddd-4653-8f7f-d87a02996c5b" (UID: "3a42e71c-6ddd-4653-8f7f-d87a02996c5b"). InnerVolumeSpecName "kube-api-access-2kmgc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:18:50 crc kubenswrapper[4922]: I1122 03:18:50.108758 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a42e71c-6ddd-4653-8f7f-d87a02996c5b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3a42e71c-6ddd-4653-8f7f-d87a02996c5b" (UID: "3a42e71c-6ddd-4653-8f7f-d87a02996c5b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:18:50 crc kubenswrapper[4922]: I1122 03:18:50.114594 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kmgc\" (UniqueName: \"kubernetes.io/projected/3a42e71c-6ddd-4653-8f7f-d87a02996c5b-kube-api-access-2kmgc\") on node \"crc\" DevicePath \"\"" Nov 22 03:18:50 crc kubenswrapper[4922]: I1122 03:18:50.114639 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a42e71c-6ddd-4653-8f7f-d87a02996c5b-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 03:18:50 crc kubenswrapper[4922]: I1122 03:18:50.114658 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a42e71c-6ddd-4653-8f7f-d87a02996c5b-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 03:18:50 crc kubenswrapper[4922]: I1122 03:18:50.500191 4922 generic.go:334] "Generic (PLEG): container finished" podID="3a42e71c-6ddd-4653-8f7f-d87a02996c5b" containerID="be3c3b01dc6c2c3fa0dda735fce7ae235a41e9decfa19e6b7f2b579d41ceac2e" exitCode=0 Nov 22 03:18:50 crc kubenswrapper[4922]: I1122 03:18:50.500250 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nffrb" event={"ID":"3a42e71c-6ddd-4653-8f7f-d87a02996c5b","Type":"ContainerDied","Data":"be3c3b01dc6c2c3fa0dda735fce7ae235a41e9decfa19e6b7f2b579d41ceac2e"} Nov 22 03:18:50 crc kubenswrapper[4922]: I1122 03:18:50.500292 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nffrb" event={"ID":"3a42e71c-6ddd-4653-8f7f-d87a02996c5b","Type":"ContainerDied","Data":"6a499e2c9a38b8d11f3fc3023636a2c0583a54caceb266347dcb1fa8b19c1e02"} Nov 22 03:18:50 crc kubenswrapper[4922]: I1122 03:18:50.500318 4922 scope.go:117] "RemoveContainer" containerID="be3c3b01dc6c2c3fa0dda735fce7ae235a41e9decfa19e6b7f2b579d41ceac2e" Nov 22 03:18:50 crc kubenswrapper[4922]: I1122 03:18:50.500386 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nffrb" Nov 22 03:18:50 crc kubenswrapper[4922]: I1122 03:18:50.552095 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nffrb"] Nov 22 03:18:50 crc kubenswrapper[4922]: I1122 03:18:50.554653 4922 scope.go:117] "RemoveContainer" containerID="28a4e4540f5ad82b21b6f659879cd7083172dff0685fe3144792889c446284cf" Nov 22 03:18:50 crc kubenswrapper[4922]: I1122 03:18:50.562057 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nffrb"] Nov 22 03:18:50 crc kubenswrapper[4922]: I1122 03:18:50.610533 4922 scope.go:117] "RemoveContainer" containerID="22bd19ec6f7ad897a91f32989ce762d8e2a030f2435e667b6f880594b643f242" Nov 22 03:18:50 crc kubenswrapper[4922]: I1122 03:18:50.658803 4922 scope.go:117] "RemoveContainer" containerID="be3c3b01dc6c2c3fa0dda735fce7ae235a41e9decfa19e6b7f2b579d41ceac2e" Nov 22 03:18:50 crc kubenswrapper[4922]: E1122 03:18:50.659765 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be3c3b01dc6c2c3fa0dda735fce7ae235a41e9decfa19e6b7f2b579d41ceac2e\": container with ID starting with be3c3b01dc6c2c3fa0dda735fce7ae235a41e9decfa19e6b7f2b579d41ceac2e not found: ID does not exist" containerID="be3c3b01dc6c2c3fa0dda735fce7ae235a41e9decfa19e6b7f2b579d41ceac2e" Nov 22 03:18:50 crc kubenswrapper[4922]: I1122 03:18:50.659819 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be3c3b01dc6c2c3fa0dda735fce7ae235a41e9decfa19e6b7f2b579d41ceac2e"} err="failed to get container status \"be3c3b01dc6c2c3fa0dda735fce7ae235a41e9decfa19e6b7f2b579d41ceac2e\": rpc error: code = NotFound desc = could not find container \"be3c3b01dc6c2c3fa0dda735fce7ae235a41e9decfa19e6b7f2b579d41ceac2e\": container with ID starting with be3c3b01dc6c2c3fa0dda735fce7ae235a41e9decfa19e6b7f2b579d41ceac2e not found: ID does not exist" Nov 22 03:18:50 crc kubenswrapper[4922]: I1122 03:18:50.659891 4922 scope.go:117] "RemoveContainer" containerID="28a4e4540f5ad82b21b6f659879cd7083172dff0685fe3144792889c446284cf" Nov 22 03:18:50 crc kubenswrapper[4922]: E1122 03:18:50.660780 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28a4e4540f5ad82b21b6f659879cd7083172dff0685fe3144792889c446284cf\": container with ID starting with 28a4e4540f5ad82b21b6f659879cd7083172dff0685fe3144792889c446284cf not found: ID does not exist" containerID="28a4e4540f5ad82b21b6f659879cd7083172dff0685fe3144792889c446284cf" Nov 22 03:18:50 crc kubenswrapper[4922]: I1122 03:18:50.660825 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28a4e4540f5ad82b21b6f659879cd7083172dff0685fe3144792889c446284cf"} err="failed to get container status \"28a4e4540f5ad82b21b6f659879cd7083172dff0685fe3144792889c446284cf\": rpc error: code = NotFound desc = could not find container \"28a4e4540f5ad82b21b6f659879cd7083172dff0685fe3144792889c446284cf\": container with ID starting with 28a4e4540f5ad82b21b6f659879cd7083172dff0685fe3144792889c446284cf not found: ID does not exist" Nov 22 03:18:50 crc kubenswrapper[4922]: I1122 03:18:50.660875 4922 scope.go:117] "RemoveContainer" containerID="22bd19ec6f7ad897a91f32989ce762d8e2a030f2435e667b6f880594b643f242" Nov 22 03:18:50 crc kubenswrapper[4922]: E1122 03:18:50.661304 4922 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"22bd19ec6f7ad897a91f32989ce762d8e2a030f2435e667b6f880594b643f242\": container with ID starting with 22bd19ec6f7ad897a91f32989ce762d8e2a030f2435e667b6f880594b643f242 not found: ID does not exist" containerID="22bd19ec6f7ad897a91f32989ce762d8e2a030f2435e667b6f880594b643f242" Nov 22 03:18:50 crc kubenswrapper[4922]: I1122 03:18:50.661374 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22bd19ec6f7ad897a91f32989ce762d8e2a030f2435e667b6f880594b643f242"} err="failed to get container status \"22bd19ec6f7ad897a91f32989ce762d8e2a030f2435e667b6f880594b643f242\": rpc error: code = NotFound desc = could not find container \"22bd19ec6f7ad897a91f32989ce762d8e2a030f2435e667b6f880594b643f242\": container with ID starting with 22bd19ec6f7ad897a91f32989ce762d8e2a030f2435e667b6f880594b643f242 not found: ID does not exist" Nov 22 03:18:51 crc kubenswrapper[4922]: I1122 03:18:51.320091 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a42e71c-6ddd-4653-8f7f-d87a02996c5b" path="/var/lib/kubelet/pods/3a42e71c-6ddd-4653-8f7f-d87a02996c5b/volumes" Nov 22 03:18:51 crc kubenswrapper[4922]: I1122 03:18:51.531520 4922 generic.go:334] "Generic (PLEG): container finished" podID="01c85480-509f-45fa-b81f-2e29ba749afb" containerID="e3523a0b0c3b2ae38bbd18cd6ffd4b235fb64fadcc0d23ad6c95b92f02cd7a00" exitCode=0 Nov 22 03:18:51 crc kubenswrapper[4922]: I1122 03:18:51.532339 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pv6f7" event={"ID":"01c85480-509f-45fa-b81f-2e29ba749afb","Type":"ContainerDied","Data":"e3523a0b0c3b2ae38bbd18cd6ffd4b235fb64fadcc0d23ad6c95b92f02cd7a00"} Nov 22 03:18:53 crc kubenswrapper[4922]: I1122 03:18:53.290824 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pv6f7" Nov 22 03:18:53 crc kubenswrapper[4922]: I1122 03:18:53.395607 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/01c85480-509f-45fa-b81f-2e29ba749afb-inventory\") pod \"01c85480-509f-45fa-b81f-2e29ba749afb\" (UID: \"01c85480-509f-45fa-b81f-2e29ba749afb\") " Nov 22 03:18:53 crc kubenswrapper[4922]: I1122 03:18:53.395665 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/01c85480-509f-45fa-b81f-2e29ba749afb-ssh-key\") pod \"01c85480-509f-45fa-b81f-2e29ba749afb\" (UID: \"01c85480-509f-45fa-b81f-2e29ba749afb\") " Nov 22 03:18:53 crc kubenswrapper[4922]: I1122 03:18:53.395810 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4567b\" (UniqueName: \"kubernetes.io/projected/01c85480-509f-45fa-b81f-2e29ba749afb-kube-api-access-4567b\") pod \"01c85480-509f-45fa-b81f-2e29ba749afb\" (UID: \"01c85480-509f-45fa-b81f-2e29ba749afb\") " Nov 22 03:18:53 crc kubenswrapper[4922]: I1122 03:18:53.406993 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01c85480-509f-45fa-b81f-2e29ba749afb-kube-api-access-4567b" (OuterVolumeSpecName: "kube-api-access-4567b") pod "01c85480-509f-45fa-b81f-2e29ba749afb" (UID: "01c85480-509f-45fa-b81f-2e29ba749afb"). InnerVolumeSpecName "kube-api-access-4567b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:18:53 crc kubenswrapper[4922]: I1122 03:18:53.430787 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01c85480-509f-45fa-b81f-2e29ba749afb-inventory" (OuterVolumeSpecName: "inventory") pod "01c85480-509f-45fa-b81f-2e29ba749afb" (UID: "01c85480-509f-45fa-b81f-2e29ba749afb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:18:53 crc kubenswrapper[4922]: I1122 03:18:53.436078 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01c85480-509f-45fa-b81f-2e29ba749afb-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "01c85480-509f-45fa-b81f-2e29ba749afb" (UID: "01c85480-509f-45fa-b81f-2e29ba749afb"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:18:53 crc kubenswrapper[4922]: I1122 03:18:53.500702 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4567b\" (UniqueName: \"kubernetes.io/projected/01c85480-509f-45fa-b81f-2e29ba749afb-kube-api-access-4567b\") on node \"crc\" DevicePath \"\"" Nov 22 03:18:53 crc kubenswrapper[4922]: I1122 03:18:53.500735 4922 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/01c85480-509f-45fa-b81f-2e29ba749afb-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 03:18:53 crc kubenswrapper[4922]: I1122 03:18:53.500744 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/01c85480-509f-45fa-b81f-2e29ba749afb-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 03:18:53 crc kubenswrapper[4922]: I1122 03:18:53.557029 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pv6f7" event={"ID":"01c85480-509f-45fa-b81f-2e29ba749afb","Type":"ContainerDied","Data":"ca08a55e4e4c36c365a3fdfd52d3e70b24416f392c6c8b8738a36372d8734912"} Nov 22 03:18:53 crc kubenswrapper[4922]: I1122 03:18:53.557076 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca08a55e4e4c36c365a3fdfd52d3e70b24416f392c6c8b8738a36372d8734912" Nov 22 03:18:53 crc kubenswrapper[4922]: I1122 03:18:53.557127 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pv6f7" Nov 22 03:18:53 crc kubenswrapper[4922]: I1122 03:18:53.648051 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-4nd6w"] Nov 22 03:18:53 crc kubenswrapper[4922]: E1122 03:18:53.648513 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01c85480-509f-45fa-b81f-2e29ba749afb" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Nov 22 03:18:53 crc kubenswrapper[4922]: I1122 03:18:53.648536 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="01c85480-509f-45fa-b81f-2e29ba749afb" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Nov 22 03:18:53 crc kubenswrapper[4922]: E1122 03:18:53.648549 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a42e71c-6ddd-4653-8f7f-d87a02996c5b" containerName="registry-server" Nov 22 03:18:53 crc kubenswrapper[4922]: I1122 03:18:53.648557 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a42e71c-6ddd-4653-8f7f-d87a02996c5b" containerName="registry-server" Nov 22 03:18:53 crc kubenswrapper[4922]: E1122 03:18:53.648595 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a42e71c-6ddd-4653-8f7f-d87a02996c5b" containerName="extract-utilities" Nov 22 03:18:53 crc kubenswrapper[4922]: I1122 03:18:53.648606 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a42e71c-6ddd-4653-8f7f-d87a02996c5b" containerName="extract-utilities" Nov 22 03:18:53 crc kubenswrapper[4922]: E1122 03:18:53.648618 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a42e71c-6ddd-4653-8f7f-d87a02996c5b" containerName="extract-content" Nov 22 03:18:53 crc kubenswrapper[4922]: I1122 03:18:53.648626 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a42e71c-6ddd-4653-8f7f-d87a02996c5b" containerName="extract-content" Nov 22 03:18:53 crc kubenswrapper[4922]: I1122 03:18:53.648832 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a42e71c-6ddd-4653-8f7f-d87a02996c5b" containerName="registry-server" Nov 22 03:18:53 crc kubenswrapper[4922]: I1122 03:18:53.648875 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="01c85480-509f-45fa-b81f-2e29ba749afb" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Nov 22 03:18:53 crc kubenswrapper[4922]: I1122 03:18:53.649622 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4nd6w" Nov 22 03:18:53 crc kubenswrapper[4922]: I1122 03:18:53.652357 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 03:18:53 crc kubenswrapper[4922]: I1122 03:18:53.652889 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 22 03:18:53 crc kubenswrapper[4922]: I1122 03:18:53.652913 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wzhgr" Nov 22 03:18:53 crc kubenswrapper[4922]: I1122 03:18:53.653950 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 22 03:18:53 crc kubenswrapper[4922]: I1122 03:18:53.672651 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-4nd6w"] Nov 22 03:18:53 crc kubenswrapper[4922]: I1122 03:18:53.704424 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2021c745-dc35-460d-abfd-c15cab66eea7-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4nd6w\" (UID: \"2021c745-dc35-460d-abfd-c15cab66eea7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4nd6w" Nov 22 03:18:53 crc kubenswrapper[4922]: I1122 03:18:53.704474 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljhqd\" (UniqueName: \"kubernetes.io/projected/2021c745-dc35-460d-abfd-c15cab66eea7-kube-api-access-ljhqd\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4nd6w\" (UID: \"2021c745-dc35-460d-abfd-c15cab66eea7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4nd6w" Nov 22 03:18:53 crc kubenswrapper[4922]: I1122 03:18:53.704541 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2021c745-dc35-460d-abfd-c15cab66eea7-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4nd6w\" (UID: \"2021c745-dc35-460d-abfd-c15cab66eea7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4nd6w" Nov 22 03:18:53 crc kubenswrapper[4922]: I1122 03:18:53.805881 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2021c745-dc35-460d-abfd-c15cab66eea7-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4nd6w\" (UID: \"2021c745-dc35-460d-abfd-c15cab66eea7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4nd6w" Nov 22 03:18:53 crc kubenswrapper[4922]: I1122 03:18:53.806231 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljhqd\" (UniqueName: \"kubernetes.io/projected/2021c745-dc35-460d-abfd-c15cab66eea7-kube-api-access-ljhqd\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4nd6w\" (UID: \"2021c745-dc35-460d-abfd-c15cab66eea7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4nd6w" Nov 22 03:18:53 crc kubenswrapper[4922]: I1122 03:18:53.806297 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2021c745-dc35-460d-abfd-c15cab66eea7-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4nd6w\" (UID: 
\"2021c745-dc35-460d-abfd-c15cab66eea7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4nd6w" Nov 22 03:18:53 crc kubenswrapper[4922]: I1122 03:18:53.811504 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2021c745-dc35-460d-abfd-c15cab66eea7-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4nd6w\" (UID: \"2021c745-dc35-460d-abfd-c15cab66eea7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4nd6w" Nov 22 03:18:53 crc kubenswrapper[4922]: I1122 03:18:53.813616 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2021c745-dc35-460d-abfd-c15cab66eea7-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4nd6w\" (UID: \"2021c745-dc35-460d-abfd-c15cab66eea7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4nd6w" Nov 22 03:18:53 crc kubenswrapper[4922]: I1122 03:18:53.830905 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljhqd\" (UniqueName: \"kubernetes.io/projected/2021c745-dc35-460d-abfd-c15cab66eea7-kube-api-access-ljhqd\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4nd6w\" (UID: \"2021c745-dc35-460d-abfd-c15cab66eea7\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4nd6w" Nov 22 03:18:53 crc kubenswrapper[4922]: I1122 03:18:53.972428 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4nd6w" Nov 22 03:18:54 crc kubenswrapper[4922]: I1122 03:18:54.567790 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-4nd6w"] Nov 22 03:18:55 crc kubenswrapper[4922]: I1122 03:18:55.594041 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4nd6w" event={"ID":"2021c745-dc35-460d-abfd-c15cab66eea7","Type":"ContainerStarted","Data":"e44d0ff87c1b74ab73186adab7ccca698f545f1c78e098ef17eec194437174bb"} Nov 22 03:18:56 crc kubenswrapper[4922]: I1122 03:18:56.038985 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-7pzg4"] Nov 22 03:18:56 crc kubenswrapper[4922]: I1122 03:18:56.053605 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-vd5np"] Nov 22 03:18:56 crc kubenswrapper[4922]: I1122 03:18:56.065808 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-vd5np"] Nov 22 03:18:56 crc kubenswrapper[4922]: I1122 03:18:56.077081 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-7pzg4"] Nov 22 03:18:56 crc kubenswrapper[4922]: I1122 03:18:56.606137 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4nd6w" event={"ID":"2021c745-dc35-460d-abfd-c15cab66eea7","Type":"ContainerStarted","Data":"3a85ad8b51f31332fd831641266240e37c3359dd9a4b13805b1f702ef439c4aa"} Nov 22 03:18:56 crc kubenswrapper[4922]: I1122 03:18:56.627761 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4nd6w" podStartSLOduration=2.475423607 podStartE2EDuration="3.6277333s" podCreationTimestamp="2025-11-22 03:18:53 +0000 UTC" firstStartedPulling="2025-11-22 03:18:54.582229111 +0000 UTC m=+1570.620751013" lastFinishedPulling="2025-11-22 03:18:55.734538774 
+0000 UTC m=+1571.773060706" observedRunningTime="2025-11-22 03:18:56.626732166 +0000 UTC m=+1572.665254078" watchObservedRunningTime="2025-11-22 03:18:56.6277333 +0000 UTC m=+1572.666255222" Nov 22 03:18:57 crc kubenswrapper[4922]: I1122 03:18:57.033585 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-56hzv"] Nov 22 03:18:57 crc kubenswrapper[4922]: I1122 03:18:57.047195 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-56hzv"] Nov 22 03:18:57 crc kubenswrapper[4922]: I1122 03:18:57.314564 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ca3bda2-7048-4e36-99af-fa68021dffee" path="/var/lib/kubelet/pods/7ca3bda2-7048-4e36-99af-fa68021dffee/volumes" Nov 22 03:18:57 crc kubenswrapper[4922]: I1122 03:18:57.315368 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95e38312-9328-45d0-80de-9d4a3e94b309" path="/var/lib/kubelet/pods/95e38312-9328-45d0-80de-9d4a3e94b309/volumes" Nov 22 03:18:57 crc kubenswrapper[4922]: I1122 03:18:57.315937 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f21024fc-9d0a-42e7-932f-ecd2d648d975" path="/var/lib/kubelet/pods/f21024fc-9d0a-42e7-932f-ecd2d648d975/volumes" Nov 22 03:19:01 crc kubenswrapper[4922]: I1122 03:19:01.301144 4922 scope.go:117] "RemoveContainer" containerID="80e63ed196d529fe9e5c2ffb57fde09ab72342822511a120b7e3fb0cefdec1eb" Nov 22 03:19:01 crc kubenswrapper[4922]: E1122 03:19:01.303232 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:19:09 crc kubenswrapper[4922]: I1122 03:19:09.057406 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-4209-account-create-kvqs9"] Nov 22 03:19:09 crc kubenswrapper[4922]: I1122 03:19:09.074292 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-5adc-account-create-l8gt8"] Nov 22 03:19:09 crc kubenswrapper[4922]: I1122 03:19:09.085906 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-5adc-account-create-l8gt8"] Nov 22 03:19:09 crc kubenswrapper[4922]: I1122 03:19:09.094011 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-4209-account-create-kvqs9"] Nov 22 03:19:09 crc kubenswrapper[4922]: I1122 03:19:09.321681 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="406317f3-f5ff-45bb-bc58-847129dd5652" path="/var/lib/kubelet/pods/406317f3-f5ff-45bb-bc58-847129dd5652/volumes" Nov 22 03:19:09 crc kubenswrapper[4922]: I1122 03:19:09.322645 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="968a0f99-7d65-431f-a242-11ac5c861a27" path="/var/lib/kubelet/pods/968a0f99-7d65-431f-a242-11ac5c861a27/volumes" Nov 22 03:19:10 crc kubenswrapper[4922]: I1122 03:19:10.049020 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-be68-account-create-jx25h"] Nov 22 03:19:10 crc kubenswrapper[4922]: I1122 03:19:10.060100 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-be68-account-create-jx25h"] Nov 22 03:19:11 crc kubenswrapper[4922]: I1122 03:19:11.320606 4922 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="f4af2488-ac01-43c3-9c7f-672a2d20456b" path="/var/lib/kubelet/pods/f4af2488-ac01-43c3-9c7f-672a2d20456b/volumes" Nov 22 03:19:14 crc kubenswrapper[4922]: I1122 03:19:14.301555 4922 scope.go:117] "RemoveContainer" containerID="80e63ed196d529fe9e5c2ffb57fde09ab72342822511a120b7e3fb0cefdec1eb" Nov 22 03:19:14 crc kubenswrapper[4922]: E1122 03:19:14.302127 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:19:29 crc kubenswrapper[4922]: I1122 03:19:29.301364 4922 scope.go:117] "RemoveContainer" containerID="80e63ed196d529fe9e5c2ffb57fde09ab72342822511a120b7e3fb0cefdec1eb" Nov 22 03:19:29 crc kubenswrapper[4922]: E1122 03:19:29.302457 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:19:31 crc kubenswrapper[4922]: I1122 03:19:31.041437 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-qhvqh"] Nov 22 03:19:31 crc kubenswrapper[4922]: I1122 03:19:31.056585 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-qhvqh"] Nov 22 03:19:31 crc kubenswrapper[4922]: I1122 03:19:31.315959 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30790286-43e6-435e-9d57-a69b795cc1b5" path="/var/lib/kubelet/pods/30790286-43e6-435e-9d57-a69b795cc1b5/volumes" Nov 22 03:19:34 crc kubenswrapper[4922]: I1122 03:19:34.042500 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-7lrkg"] Nov 22 03:19:34 crc kubenswrapper[4922]: I1122 03:19:34.063065 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-5pcsk"] Nov 22 03:19:34 crc kubenswrapper[4922]: I1122 03:19:34.073274 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-7lrkg"] Nov 22 03:19:34 crc kubenswrapper[4922]: I1122 03:19:34.082065 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-5pcsk"] Nov 22 03:19:35 crc kubenswrapper[4922]: I1122 03:19:35.311983 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb78fbe1-7a49-4c07-88cc-eb13d06d3723" path="/var/lib/kubelet/pods/eb78fbe1-7a49-4c07-88cc-eb13d06d3723/volumes" Nov 22 03:19:35 crc kubenswrapper[4922]: I1122 03:19:35.313020 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9cb77a8-897d-4f9a-9cb0-05d1a81e903a" path="/var/lib/kubelet/pods/f9cb77a8-897d-4f9a-9cb0-05d1a81e903a/volumes" Nov 22 03:19:37 crc kubenswrapper[4922]: I1122 03:19:37.054633 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-wbr6m"] Nov 22 03:19:37 crc kubenswrapper[4922]: I1122 03:19:37.065574 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-wbr6m"] Nov 22 03:19:37 crc kubenswrapper[4922]: I1122 
03:19:37.318284 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b763fe0e-98d2-4e23-8629-a14f68e3e8b8" path="/var/lib/kubelet/pods/b763fe0e-98d2-4e23-8629-a14f68e3e8b8/volumes" Nov 22 03:19:38 crc kubenswrapper[4922]: I1122 03:19:38.037488 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-bp5gq"] Nov 22 03:19:38 crc kubenswrapper[4922]: I1122 03:19:38.052927 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-bp5gq"] Nov 22 03:19:39 crc kubenswrapper[4922]: I1122 03:19:39.320517 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56cc5718-880b-43d3-9f3a-2a418797cf1f" path="/var/lib/kubelet/pods/56cc5718-880b-43d3-9f3a-2a418797cf1f/volumes" Nov 22 03:19:40 crc kubenswrapper[4922]: I1122 03:19:40.139714 4922 generic.go:334] "Generic (PLEG): container finished" podID="2021c745-dc35-460d-abfd-c15cab66eea7" containerID="3a85ad8b51f31332fd831641266240e37c3359dd9a4b13805b1f702ef439c4aa" exitCode=0 Nov 22 03:19:40 crc kubenswrapper[4922]: I1122 03:19:40.139827 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4nd6w" event={"ID":"2021c745-dc35-460d-abfd-c15cab66eea7","Type":"ContainerDied","Data":"3a85ad8b51f31332fd831641266240e37c3359dd9a4b13805b1f702ef439c4aa"} Nov 22 03:19:41 crc kubenswrapper[4922]: I1122 03:19:41.302128 4922 scope.go:117] "RemoveContainer" containerID="80e63ed196d529fe9e5c2ffb57fde09ab72342822511a120b7e3fb0cefdec1eb" Nov 22 03:19:41 crc kubenswrapper[4922]: E1122 03:19:41.302405 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:19:41 crc kubenswrapper[4922]: I1122 03:19:41.598514 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4nd6w" Nov 22 03:19:41 crc kubenswrapper[4922]: I1122 03:19:41.720214 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2021c745-dc35-460d-abfd-c15cab66eea7-inventory\") pod \"2021c745-dc35-460d-abfd-c15cab66eea7\" (UID: \"2021c745-dc35-460d-abfd-c15cab66eea7\") " Nov 22 03:19:41 crc kubenswrapper[4922]: I1122 03:19:41.720296 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljhqd\" (UniqueName: \"kubernetes.io/projected/2021c745-dc35-460d-abfd-c15cab66eea7-kube-api-access-ljhqd\") pod \"2021c745-dc35-460d-abfd-c15cab66eea7\" (UID: \"2021c745-dc35-460d-abfd-c15cab66eea7\") " Nov 22 03:19:41 crc kubenswrapper[4922]: I1122 03:19:41.720419 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2021c745-dc35-460d-abfd-c15cab66eea7-ssh-key\") pod \"2021c745-dc35-460d-abfd-c15cab66eea7\" (UID: \"2021c745-dc35-460d-abfd-c15cab66eea7\") " Nov 22 03:19:41 crc kubenswrapper[4922]: I1122 03:19:41.727902 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2021c745-dc35-460d-abfd-c15cab66eea7-kube-api-access-ljhqd" (OuterVolumeSpecName: "kube-api-access-ljhqd") pod "2021c745-dc35-460d-abfd-c15cab66eea7" (UID: "2021c745-dc35-460d-abfd-c15cab66eea7"). InnerVolumeSpecName "kube-api-access-ljhqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:19:41 crc kubenswrapper[4922]: I1122 03:19:41.767181 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2021c745-dc35-460d-abfd-c15cab66eea7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2021c745-dc35-460d-abfd-c15cab66eea7" (UID: "2021c745-dc35-460d-abfd-c15cab66eea7"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:19:41 crc kubenswrapper[4922]: I1122 03:19:41.768098 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2021c745-dc35-460d-abfd-c15cab66eea7-inventory" (OuterVolumeSpecName: "inventory") pod "2021c745-dc35-460d-abfd-c15cab66eea7" (UID: "2021c745-dc35-460d-abfd-c15cab66eea7"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:19:41 crc kubenswrapper[4922]: I1122 03:19:41.822496 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljhqd\" (UniqueName: \"kubernetes.io/projected/2021c745-dc35-460d-abfd-c15cab66eea7-kube-api-access-ljhqd\") on node \"crc\" DevicePath \"\"" Nov 22 03:19:41 crc kubenswrapper[4922]: I1122 03:19:41.822536 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2021c745-dc35-460d-abfd-c15cab66eea7-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 03:19:41 crc kubenswrapper[4922]: I1122 03:19:41.822554 4922 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2021c745-dc35-460d-abfd-c15cab66eea7-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 03:19:42 crc kubenswrapper[4922]: I1122 03:19:42.176004 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4nd6w" event={"ID":"2021c745-dc35-460d-abfd-c15cab66eea7","Type":"ContainerDied","Data":"e44d0ff87c1b74ab73186adab7ccca698f545f1c78e098ef17eec194437174bb"} Nov 22 03:19:42 crc kubenswrapper[4922]: I1122 03:19:42.176055 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4nd6w" Nov 22 03:19:42 crc kubenswrapper[4922]: I1122 03:19:42.176072 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e44d0ff87c1b74ab73186adab7ccca698f545f1c78e098ef17eec194437174bb" Nov 22 03:19:42 crc kubenswrapper[4922]: I1122 03:19:42.269602 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sklsz"] Nov 22 03:19:42 crc kubenswrapper[4922]: E1122 03:19:42.269979 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2021c745-dc35-460d-abfd-c15cab66eea7" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Nov 22 03:19:42 crc kubenswrapper[4922]: I1122 03:19:42.270000 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="2021c745-dc35-460d-abfd-c15cab66eea7" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Nov 22 03:19:42 crc kubenswrapper[4922]: I1122 03:19:42.270212 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="2021c745-dc35-460d-abfd-c15cab66eea7" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Nov 22 03:19:42 crc kubenswrapper[4922]: I1122 03:19:42.270778 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sklsz"
Nov 22 03:19:42 crc kubenswrapper[4922]: I1122 03:19:42.273137 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Nov 22 03:19:42 crc kubenswrapper[4922]: I1122 03:19:42.276125 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Nov 22 03:19:42 crc kubenswrapper[4922]: I1122 03:19:42.276189 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wzhgr"
Nov 22 03:19:42 crc kubenswrapper[4922]: I1122 03:19:42.276513 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Nov 22 03:19:42 crc kubenswrapper[4922]: I1122 03:19:42.287794 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sklsz"]
Nov 22 03:19:42 crc kubenswrapper[4922]: I1122 03:19:42.331084 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/84c9b4a5-79e5-45a7-9de2-2dead3caddaa-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sklsz\" (UID: \"84c9b4a5-79e5-45a7-9de2-2dead3caddaa\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sklsz"
Nov 22 03:19:42 crc kubenswrapper[4922]: I1122 03:19:42.332179 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2wrp\" (UniqueName: \"kubernetes.io/projected/84c9b4a5-79e5-45a7-9de2-2dead3caddaa-kube-api-access-g2wrp\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sklsz\" (UID: \"84c9b4a5-79e5-45a7-9de2-2dead3caddaa\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sklsz"
Nov 22 03:19:42 crc kubenswrapper[4922]: I1122 03:19:42.332303 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/84c9b4a5-79e5-45a7-9de2-2dead3caddaa-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sklsz\" (UID: \"84c9b4a5-79e5-45a7-9de2-2dead3caddaa\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sklsz"
Nov 22 03:19:42 crc kubenswrapper[4922]: I1122 03:19:42.433644 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/84c9b4a5-79e5-45a7-9de2-2dead3caddaa-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sklsz\" (UID: \"84c9b4a5-79e5-45a7-9de2-2dead3caddaa\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sklsz"
Nov 22 03:19:42 crc kubenswrapper[4922]: I1122 03:19:42.433767 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2wrp\" (UniqueName: \"kubernetes.io/projected/84c9b4a5-79e5-45a7-9de2-2dead3caddaa-kube-api-access-g2wrp\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sklsz\" (UID: \"84c9b4a5-79e5-45a7-9de2-2dead3caddaa\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sklsz"
Nov 22 03:19:42 crc kubenswrapper[4922]: I1122 03:19:42.433829 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/84c9b4a5-79e5-45a7-9de2-2dead3caddaa-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sklsz\" (UID: \"84c9b4a5-79e5-45a7-9de2-2dead3caddaa\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sklsz"
Nov 22 03:19:42 crc kubenswrapper[4922]: I1122 03:19:42.444397 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/84c9b4a5-79e5-45a7-9de2-2dead3caddaa-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sklsz\" (UID: \"84c9b4a5-79e5-45a7-9de2-2dead3caddaa\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sklsz"
Nov 22 03:19:42 crc kubenswrapper[4922]: I1122 03:19:42.447916 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/84c9b4a5-79e5-45a7-9de2-2dead3caddaa-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sklsz\" (UID: \"84c9b4a5-79e5-45a7-9de2-2dead3caddaa\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sklsz"
Nov 22 03:19:42 crc kubenswrapper[4922]: I1122 03:19:42.458114 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2wrp\" (UniqueName: \"kubernetes.io/projected/84c9b4a5-79e5-45a7-9de2-2dead3caddaa-kube-api-access-g2wrp\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sklsz\" (UID: \"84c9b4a5-79e5-45a7-9de2-2dead3caddaa\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sklsz"
Nov 22 03:19:42 crc kubenswrapper[4922]: I1122 03:19:42.589113 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sklsz"
Nov 22 03:19:43 crc kubenswrapper[4922]: I1122 03:19:43.210672 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sklsz"]
Nov 22 03:19:44 crc kubenswrapper[4922]: I1122 03:19:44.200478 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sklsz" event={"ID":"84c9b4a5-79e5-45a7-9de2-2dead3caddaa","Type":"ContainerStarted","Data":"56fcdcf41c6041a000a2cb5a1d07bc26282abd2b863118552cfe31e5818ffd0b"}
Nov 22 03:19:44 crc kubenswrapper[4922]: I1122 03:19:44.201061 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sklsz" event={"ID":"84c9b4a5-79e5-45a7-9de2-2dead3caddaa","Type":"ContainerStarted","Data":"1c659e038ddb235e0b2fc01f652867c8f9945f8d79e3c24fd1789fa91a0fac05"}
Nov 22 03:19:44 crc kubenswrapper[4922]: I1122 03:19:44.226420 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sklsz" podStartSLOduration=1.659279452 podStartE2EDuration="2.226389287s" podCreationTimestamp="2025-11-22 03:19:42 +0000 UTC" firstStartedPulling="2025-11-22 03:19:43.209646731 +0000 UTC m=+1619.248168623" lastFinishedPulling="2025-11-22 03:19:43.776756556 +0000 UTC m=+1619.815278458" observedRunningTime="2025-11-22 03:19:44.2223538 +0000 UTC m=+1620.260875762" watchObservedRunningTime="2025-11-22 03:19:44.226389287 +0000 UTC m=+1620.264911219"
Nov 22 03:19:49 crc kubenswrapper[4922]: I1122 03:19:49.252967 4922 generic.go:334] "Generic (PLEG): container finished" podID="84c9b4a5-79e5-45a7-9de2-2dead3caddaa" containerID="56fcdcf41c6041a000a2cb5a1d07bc26282abd2b863118552cfe31e5818ffd0b" exitCode=0
Nov 22 03:19:49 crc kubenswrapper[4922]: I1122 03:19:49.253066 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sklsz" event={"ID":"84c9b4a5-79e5-45a7-9de2-2dead3caddaa","Type":"ContainerDied","Data":"56fcdcf41c6041a000a2cb5a1d07bc26282abd2b863118552cfe31e5818ffd0b"}
Nov 22 03:19:50 crc kubenswrapper[4922]: I1122 03:19:50.704368 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sklsz"
Nov 22 03:19:50 crc kubenswrapper[4922]: I1122 03:19:50.800907 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/84c9b4a5-79e5-45a7-9de2-2dead3caddaa-inventory\") pod \"84c9b4a5-79e5-45a7-9de2-2dead3caddaa\" (UID: \"84c9b4a5-79e5-45a7-9de2-2dead3caddaa\") "
Nov 22 03:19:50 crc kubenswrapper[4922]: I1122 03:19:50.801180 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2wrp\" (UniqueName: \"kubernetes.io/projected/84c9b4a5-79e5-45a7-9de2-2dead3caddaa-kube-api-access-g2wrp\") pod \"84c9b4a5-79e5-45a7-9de2-2dead3caddaa\" (UID: \"84c9b4a5-79e5-45a7-9de2-2dead3caddaa\") "
Nov 22 03:19:50 crc kubenswrapper[4922]: I1122 03:19:50.801252 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/84c9b4a5-79e5-45a7-9de2-2dead3caddaa-ssh-key\") pod \"84c9b4a5-79e5-45a7-9de2-2dead3caddaa\" (UID: \"84c9b4a5-79e5-45a7-9de2-2dead3caddaa\") "
Nov 22 03:19:50 crc kubenswrapper[4922]: I1122 03:19:50.806937 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84c9b4a5-79e5-45a7-9de2-2dead3caddaa-kube-api-access-g2wrp" (OuterVolumeSpecName: "kube-api-access-g2wrp") pod "84c9b4a5-79e5-45a7-9de2-2dead3caddaa" (UID: "84c9b4a5-79e5-45a7-9de2-2dead3caddaa"). InnerVolumeSpecName "kube-api-access-g2wrp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 03:19:50 crc kubenswrapper[4922]: I1122 03:19:50.829020 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84c9b4a5-79e5-45a7-9de2-2dead3caddaa-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "84c9b4a5-79e5-45a7-9de2-2dead3caddaa" (UID: "84c9b4a5-79e5-45a7-9de2-2dead3caddaa"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:19:50 crc kubenswrapper[4922]: I1122 03:19:50.853143 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84c9b4a5-79e5-45a7-9de2-2dead3caddaa-inventory" (OuterVolumeSpecName: "inventory") pod "84c9b4a5-79e5-45a7-9de2-2dead3caddaa" (UID: "84c9b4a5-79e5-45a7-9de2-2dead3caddaa"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:19:50 crc kubenswrapper[4922]: I1122 03:19:50.903740 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2wrp\" (UniqueName: \"kubernetes.io/projected/84c9b4a5-79e5-45a7-9de2-2dead3caddaa-kube-api-access-g2wrp\") on node \"crc\" DevicePath \"\""
Nov 22 03:19:50 crc kubenswrapper[4922]: I1122 03:19:50.903774 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/84c9b4a5-79e5-45a7-9de2-2dead3caddaa-ssh-key\") on node \"crc\" DevicePath \"\""
Nov 22 03:19:50 crc kubenswrapper[4922]: I1122 03:19:50.903785 4922 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/84c9b4a5-79e5-45a7-9de2-2dead3caddaa-inventory\") on node \"crc\" DevicePath \"\""
Nov 22 03:19:51 crc kubenswrapper[4922]: I1122 03:19:51.284775 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sklsz" event={"ID":"84c9b4a5-79e5-45a7-9de2-2dead3caddaa","Type":"ContainerDied","Data":"1c659e038ddb235e0b2fc01f652867c8f9945f8d79e3c24fd1789fa91a0fac05"}
Nov 22 03:19:51 crc kubenswrapper[4922]: I1122 03:19:51.285187 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c659e038ddb235e0b2fc01f652867c8f9945f8d79e3c24fd1789fa91a0fac05"
Nov 22 03:19:51 crc kubenswrapper[4922]: I1122 03:19:51.284822 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sklsz"
Nov 22 03:19:51 crc kubenswrapper[4922]: I1122 03:19:51.379616 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rjt59"]
Nov 22 03:19:51 crc kubenswrapper[4922]: E1122 03:19:51.380333 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84c9b4a5-79e5-45a7-9de2-2dead3caddaa" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam"
Nov 22 03:19:51 crc kubenswrapper[4922]: I1122 03:19:51.380362 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="84c9b4a5-79e5-45a7-9de2-2dead3caddaa" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam"
Nov 22 03:19:51 crc kubenswrapper[4922]: I1122 03:19:51.380608 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="84c9b4a5-79e5-45a7-9de2-2dead3caddaa" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam"
Nov 22 03:19:51 crc kubenswrapper[4922]: I1122 03:19:51.381379 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rjt59"
Nov 22 03:19:51 crc kubenswrapper[4922]: I1122 03:19:51.388347 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rjt59"]
Nov 22 03:19:51 crc kubenswrapper[4922]: I1122 03:19:51.390571 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Nov 22 03:19:51 crc kubenswrapper[4922]: I1122 03:19:51.390927 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wzhgr"
Nov 22 03:19:51 crc kubenswrapper[4922]: I1122 03:19:51.391294 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Nov 22 03:19:51 crc kubenswrapper[4922]: I1122 03:19:51.391571 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Nov 22 03:19:51 crc kubenswrapper[4922]: I1122 03:19:51.415114 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/18e1091b-6626-426b-b6a8-235daff0df16-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rjt59\" (UID: \"18e1091b-6626-426b-b6a8-235daff0df16\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rjt59"
Nov 22 03:19:51 crc kubenswrapper[4922]: I1122 03:19:51.415211 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n49p9\" (UniqueName: \"kubernetes.io/projected/18e1091b-6626-426b-b6a8-235daff0df16-kube-api-access-n49p9\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rjt59\" (UID: \"18e1091b-6626-426b-b6a8-235daff0df16\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rjt59"
Nov 22 03:19:51 crc kubenswrapper[4922]: I1122 03:19:51.415235 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18e1091b-6626-426b-b6a8-235daff0df16-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rjt59\" (UID: \"18e1091b-6626-426b-b6a8-235daff0df16\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rjt59"
Nov 22 03:19:51 crc kubenswrapper[4922]: I1122 03:19:51.516534 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/18e1091b-6626-426b-b6a8-235daff0df16-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rjt59\" (UID: \"18e1091b-6626-426b-b6a8-235daff0df16\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rjt59"
Nov 22 03:19:51 crc kubenswrapper[4922]: I1122 03:19:51.517183 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n49p9\" (UniqueName: \"kubernetes.io/projected/18e1091b-6626-426b-b6a8-235daff0df16-kube-api-access-n49p9\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rjt59\" (UID: \"18e1091b-6626-426b-b6a8-235daff0df16\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rjt59"
Nov 22 03:19:51 crc kubenswrapper[4922]: I1122 03:19:51.517211 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18e1091b-6626-426b-b6a8-235daff0df16-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rjt59\" (UID: \"18e1091b-6626-426b-b6a8-235daff0df16\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rjt59"
Nov 22 03:19:51 crc kubenswrapper[4922]: I1122 03:19:51.522027 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/18e1091b-6626-426b-b6a8-235daff0df16-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rjt59\" (UID: \"18e1091b-6626-426b-b6a8-235daff0df16\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rjt59"
Nov 22 03:19:51 crc kubenswrapper[4922]: I1122 03:19:51.524334 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18e1091b-6626-426b-b6a8-235daff0df16-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rjt59\" (UID: \"18e1091b-6626-426b-b6a8-235daff0df16\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rjt59"
Nov 22 03:19:51 crc kubenswrapper[4922]: I1122 03:19:51.548697 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n49p9\" (UniqueName: \"kubernetes.io/projected/18e1091b-6626-426b-b6a8-235daff0df16-kube-api-access-n49p9\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rjt59\" (UID: \"18e1091b-6626-426b-b6a8-235daff0df16\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rjt59"
Nov 22 03:19:51 crc kubenswrapper[4922]: I1122 03:19:51.715501 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rjt59"
Nov 22 03:19:52 crc kubenswrapper[4922]: I1122 03:19:52.326709 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rjt59"]
Nov 22 03:19:53 crc kubenswrapper[4922]: I1122 03:19:53.040772 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-885b-account-create-79txk"]
Nov 22 03:19:53 crc kubenswrapper[4922]: I1122 03:19:53.054071 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-d495-account-create-b2mz9"]
Nov 22 03:19:53 crc kubenswrapper[4922]: I1122 03:19:53.062618 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-d495-account-create-b2mz9"]
Nov 22 03:19:53 crc kubenswrapper[4922]: I1122 03:19:53.076319 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-75a2-account-create-wwhfk"]
Nov 22 03:19:53 crc kubenswrapper[4922]: I1122 03:19:53.084227 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-885b-account-create-79txk"]
Nov 22 03:19:53 crc kubenswrapper[4922]: I1122 03:19:53.092021 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-75a2-account-create-wwhfk"]
Nov 22 03:19:53 crc kubenswrapper[4922]: I1122 03:19:53.306155 4922 scope.go:117] "RemoveContainer" containerID="80e63ed196d529fe9e5c2ffb57fde09ab72342822511a120b7e3fb0cefdec1eb"
Nov 22 03:19:53 crc kubenswrapper[4922]: E1122 03:19:53.306711 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d"
Nov 22 03:19:53 crc kubenswrapper[4922]: I1122 03:19:53.324590 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46d2f699-5508-477b-8f5f-a80feb9a10b3" path="/var/lib/kubelet/pods/46d2f699-5508-477b-8f5f-a80feb9a10b3/volumes"
Nov 22 03:19:53 crc kubenswrapper[4922]: I1122 03:19:53.325102 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58f2e3f8-0b26-4534-a01c-f261d5048821" path="/var/lib/kubelet/pods/58f2e3f8-0b26-4534-a01c-f261d5048821/volumes"
Nov 22 03:19:53 crc kubenswrapper[4922]: I1122 03:19:53.325590 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f79ad5eb-fe16-4595-a535-664d15aba98a" path="/var/lib/kubelet/pods/f79ad5eb-fe16-4595-a535-664d15aba98a/volumes"
Nov 22 03:19:53 crc kubenswrapper[4922]: I1122 03:19:53.326167 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rjt59" event={"ID":"18e1091b-6626-426b-b6a8-235daff0df16","Type":"ContainerStarted","Data":"3da4b2f26ba50c8083a5c2343b06f5f51f98e284fc75e914bcebde6c37a58af8"}
Nov 22 03:19:54 crc kubenswrapper[4922]: I1122 03:19:54.330962 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rjt59" event={"ID":"18e1091b-6626-426b-b6a8-235daff0df16","Type":"ContainerStarted","Data":"740122ed3033efba9715f21854139df85e5fcb7ff4bfe4c852641343b00ee83d"}
Nov 22 03:19:54 crc kubenswrapper[4922]: I1122 03:19:54.366974 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rjt59" podStartSLOduration=2.383822189 podStartE2EDuration="3.366943341s" podCreationTimestamp="2025-11-22 03:19:51 +0000 UTC" firstStartedPulling="2025-11-22 03:19:52.342637509 +0000 UTC m=+1628.381159421" lastFinishedPulling="2025-11-22 03:19:53.325758671 +0000 UTC m=+1629.364280573" observedRunningTime="2025-11-22 03:19:54.357354651 +0000 UTC m=+1630.395876583" watchObservedRunningTime="2025-11-22 03:19:54.366943341 +0000 UTC m=+1630.405465263"
Nov 22 03:19:55 crc kubenswrapper[4922]: I1122 03:19:55.808138 4922 scope.go:117] "RemoveContainer" containerID="92eeeddc27b7ed39fe0b4dc384d0856e9fb8ad2ddcf7e4b7de28128b21716052"
Nov 22 03:19:55 crc kubenswrapper[4922]: I1122 03:19:55.851653 4922 scope.go:117] "RemoveContainer" containerID="4c7f81b5ea191cceb9b242326dd29bc72cc51fde6a7951d5413fd6ca3adf1014"
Nov 22 03:19:55 crc kubenswrapper[4922]: I1122 03:19:55.882934 4922 scope.go:117] "RemoveContainer" containerID="947f4126acb81a2bc855f69d83ba89ea605517b75ecf5e91129ba122f0b2bacc"
Nov 22 03:19:55 crc kubenswrapper[4922]: I1122 03:19:55.964767 4922 scope.go:117] "RemoveContainer" containerID="48cd9d37c55ad31a07001d2ba0b3bd54cc22b5f04ed56109cdd74c1c4fd1ce72"
Nov 22 03:19:56 crc kubenswrapper[4922]: I1122 03:19:56.016929 4922 scope.go:117] "RemoveContainer" containerID="c26cbcb5bd09e0351e31b7719f82cb707ea4eee4a590a12f03b8a2dd21c2a50a"
Nov 22 03:19:56 crc kubenswrapper[4922]: I1122 03:19:56.035311 4922 scope.go:117] "RemoveContainer" containerID="e7d50cbb939cbca00d444abe24b5d0f1baec9dd1c77c8a4efd56beeed4c8fc1a"
Nov 22 03:19:56 crc kubenswrapper[4922]: I1122 03:19:56.075647 4922 scope.go:117] "RemoveContainer" containerID="dd7717deb4676901623e27b92d3b402962399d118d9d436460639dfde979039b"
Nov 22 03:19:56 crc kubenswrapper[4922]: I1122 03:19:56.090732 4922 scope.go:117] "RemoveContainer" containerID="733df6deffe6c0f591959831fef98d9c16b0695ce44cb6c1dab2f05e439ad628"
Nov 22 03:19:56 crc kubenswrapper[4922]: I1122 03:19:56.115021 4922 scope.go:117] "RemoveContainer" containerID="64a9f8862fe435f716cbb7864894e6a62685c6f5fb328a4321795a1d45c5163b"
Nov 22 03:19:56 crc kubenswrapper[4922]: I1122 03:19:56.146482 4922 scope.go:117] "RemoveContainer" containerID="49af4be13e64c37e69ecbd10922c5af17213be191c4fe4282c7036a69fc06404"
Nov 22 03:19:56 crc kubenswrapper[4922]: I1122 03:19:56.163381 4922 scope.go:117] "RemoveContainer" containerID="fd4a3511c40b414c022fc64007fb1bab9ce0ba88acb96abb737da96793fa8c60"
Nov 22 03:19:56 crc kubenswrapper[4922]: I1122 03:19:56.181099 4922 scope.go:117] "RemoveContainer" containerID="28c9cac5320bb3f2a7604588d4cd72ee5f68a9ac3641bfc994efc50207765f38"
Nov 22 03:19:56 crc kubenswrapper[4922]: I1122 03:19:56.197443 4922 scope.go:117] "RemoveContainer" containerID="462eae1408c2e75c5b1f93d877c6d0cdf496380386ebcd7908aecd7b59c5d37b"
Nov 22 03:19:56 crc kubenswrapper[4922]: I1122 03:19:56.224183 4922 scope.go:117] "RemoveContainer" containerID="9bfa856b03cc6b8d7cbb4166791984228491a1e1796b4eb45193c2edd4ac51ea"
Nov 22 03:20:00 crc kubenswrapper[4922]: I1122 03:20:00.045618 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-jg47c"]
Nov 22 03:20:00 crc kubenswrapper[4922]: I1122 03:20:00.063461 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-jg47c"]
Nov 22 03:20:01 crc kubenswrapper[4922]: I1122 03:20:01.032993 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-whfxs"]
Nov 22 03:20:01 crc kubenswrapper[4922]: I1122 03:20:01.041952 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-whfxs"]
Nov 22 03:20:01 crc kubenswrapper[4922]: I1122 03:20:01.324090 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61948bb8-1797-44d7-946a-906a010895b6" path="/var/lib/kubelet/pods/61948bb8-1797-44d7-946a-906a010895b6/volumes"
Nov 22 03:20:01 crc kubenswrapper[4922]: I1122 03:20:01.326426 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aeeab218-54fe-4892-b3f8-60b166ad72e2" path="/var/lib/kubelet/pods/aeeab218-54fe-4892-b3f8-60b166ad72e2/volumes"
Nov 22 03:20:04 crc kubenswrapper[4922]: I1122 03:20:04.301109 4922 scope.go:117] "RemoveContainer" containerID="80e63ed196d529fe9e5c2ffb57fde09ab72342822511a120b7e3fb0cefdec1eb"
Nov 22 03:20:04 crc kubenswrapper[4922]: E1122 03:20:04.301605 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d"
Nov 22 03:20:15 crc kubenswrapper[4922]: I1122 03:20:15.311305 4922 scope.go:117] "RemoveContainer" containerID="80e63ed196d529fe9e5c2ffb57fde09ab72342822511a120b7e3fb0cefdec1eb"
Nov 22 03:20:15 crc kubenswrapper[4922]: E1122 03:20:15.313154 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d"
Nov 22 03:20:21 crc kubenswrapper[4922]: I1122 03:20:21.821435 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2lqj8"]
Nov 22 03:20:21 crc kubenswrapper[4922]: I1122 03:20:21.827350 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2lqj8"
Nov 22 03:20:21 crc kubenswrapper[4922]: I1122 03:20:21.830085 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2lqj8"]
Nov 22 03:20:21 crc kubenswrapper[4922]: I1122 03:20:21.846339 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrwq9\" (UniqueName: \"kubernetes.io/projected/9b94d91f-e0bc-46e0-954b-6e3e521b7aa4-kube-api-access-hrwq9\") pod \"community-operators-2lqj8\" (UID: \"9b94d91f-e0bc-46e0-954b-6e3e521b7aa4\") " pod="openshift-marketplace/community-operators-2lqj8"
Nov 22 03:20:21 crc kubenswrapper[4922]: I1122 03:20:21.846508 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b94d91f-e0bc-46e0-954b-6e3e521b7aa4-utilities\") pod \"community-operators-2lqj8\" (UID: \"9b94d91f-e0bc-46e0-954b-6e3e521b7aa4\") " pod="openshift-marketplace/community-operators-2lqj8"
Nov 22 03:20:21 crc kubenswrapper[4922]: I1122 03:20:21.846573 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b94d91f-e0bc-46e0-954b-6e3e521b7aa4-catalog-content\") pod \"community-operators-2lqj8\" (UID: \"9b94d91f-e0bc-46e0-954b-6e3e521b7aa4\") " pod="openshift-marketplace/community-operators-2lqj8"
Nov 22 03:20:21 crc kubenswrapper[4922]: I1122 03:20:21.948235 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrwq9\" (UniqueName: \"kubernetes.io/projected/9b94d91f-e0bc-46e0-954b-6e3e521b7aa4-kube-api-access-hrwq9\") pod \"community-operators-2lqj8\" (UID: \"9b94d91f-e0bc-46e0-954b-6e3e521b7aa4\") " pod="openshift-marketplace/community-operators-2lqj8"
Nov 22 03:20:21 crc kubenswrapper[4922]: I1122 03:20:21.948356 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b94d91f-e0bc-46e0-954b-6e3e521b7aa4-utilities\") pod \"community-operators-2lqj8\" (UID: \"9b94d91f-e0bc-46e0-954b-6e3e521b7aa4\") " pod="openshift-marketplace/community-operators-2lqj8"
Nov 22 03:20:21 crc kubenswrapper[4922]: I1122 03:20:21.948399 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b94d91f-e0bc-46e0-954b-6e3e521b7aa4-catalog-content\") pod \"community-operators-2lqj8\" (UID: \"9b94d91f-e0bc-46e0-954b-6e3e521b7aa4\") " pod="openshift-marketplace/community-operators-2lqj8"
Nov 22 03:20:21 crc kubenswrapper[4922]: I1122 03:20:21.949028 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b94d91f-e0bc-46e0-954b-6e3e521b7aa4-catalog-content\") pod \"community-operators-2lqj8\" (UID: \"9b94d91f-e0bc-46e0-954b-6e3e521b7aa4\") " pod="openshift-marketplace/community-operators-2lqj8"
Nov 22 03:20:21 crc kubenswrapper[4922]: I1122 03:20:21.949208 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b94d91f-e0bc-46e0-954b-6e3e521b7aa4-utilities\") pod \"community-operators-2lqj8\" (UID: \"9b94d91f-e0bc-46e0-954b-6e3e521b7aa4\") " pod="openshift-marketplace/community-operators-2lqj8"
Nov 22 03:20:21 crc kubenswrapper[4922]: I1122 03:20:21.970674 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrwq9\" (UniqueName: \"kubernetes.io/projected/9b94d91f-e0bc-46e0-954b-6e3e521b7aa4-kube-api-access-hrwq9\") pod \"community-operators-2lqj8\" (UID: \"9b94d91f-e0bc-46e0-954b-6e3e521b7aa4\") " pod="openshift-marketplace/community-operators-2lqj8"
Nov 22 03:20:22 crc kubenswrapper[4922]: I1122 03:20:22.171312 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2lqj8"
Nov 22 03:20:22 crc kubenswrapper[4922]: I1122 03:20:22.683815 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2lqj8"]
Nov 22 03:20:23 crc kubenswrapper[4922]: I1122 03:20:23.653662 4922 generic.go:334] "Generic (PLEG): container finished" podID="9b94d91f-e0bc-46e0-954b-6e3e521b7aa4" containerID="c2452f784878c93b1e2a4690cf46c50737ec358962e4542cc7d889f72ed473bf" exitCode=0
Nov 22 03:20:23 crc kubenswrapper[4922]: I1122 03:20:23.653768 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2lqj8" event={"ID":"9b94d91f-e0bc-46e0-954b-6e3e521b7aa4","Type":"ContainerDied","Data":"c2452f784878c93b1e2a4690cf46c50737ec358962e4542cc7d889f72ed473bf"}
Nov 22 03:20:23 crc kubenswrapper[4922]: I1122 03:20:23.654263 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2lqj8" event={"ID":"9b94d91f-e0bc-46e0-954b-6e3e521b7aa4","Type":"ContainerStarted","Data":"825e1aa671aadc1e125eee08be7417d969c75d7c4cad6c86c84f93d5a689ec49"}
Nov 22 03:20:25 crc kubenswrapper[4922]: I1122 03:20:25.597834 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-76fmx"]
Nov 22 03:20:25 crc kubenswrapper[4922]: I1122 03:20:25.600071 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-76fmx"
Nov 22 03:20:25 crc kubenswrapper[4922]: I1122 03:20:25.615198 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-76fmx"]
Nov 22 03:20:25 crc kubenswrapper[4922]: I1122 03:20:25.657469 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff6f8353-795d-4cc5-912e-99a8d2c11ace-utilities\") pod \"redhat-marketplace-76fmx\" (UID: \"ff6f8353-795d-4cc5-912e-99a8d2c11ace\") " pod="openshift-marketplace/redhat-marketplace-76fmx"
Nov 22 03:20:25 crc kubenswrapper[4922]: I1122 03:20:25.657612 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff6f8353-795d-4cc5-912e-99a8d2c11ace-catalog-content\") pod \"redhat-marketplace-76fmx\" (UID: \"ff6f8353-795d-4cc5-912e-99a8d2c11ace\") " pod="openshift-marketplace/redhat-marketplace-76fmx"
Nov 22 03:20:25 crc kubenswrapper[4922]: I1122 03:20:25.657667 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj6m7\" (UniqueName: \"kubernetes.io/projected/ff6f8353-795d-4cc5-912e-99a8d2c11ace-kube-api-access-sj6m7\") pod \"redhat-marketplace-76fmx\" (UID: \"ff6f8353-795d-4cc5-912e-99a8d2c11ace\") " pod="openshift-marketplace/redhat-marketplace-76fmx"
Nov 22 03:20:25 crc kubenswrapper[4922]: I1122 03:20:25.676200 4922 generic.go:334] "Generic (PLEG): container finished" podID="9b94d91f-e0bc-46e0-954b-6e3e521b7aa4" containerID="228b3c999a820006b90c38e64c1e4cf1d0feed8e709763f5d5051770d34eb5f6" exitCode=0
Nov 22 03:20:25 crc kubenswrapper[4922]: I1122 03:20:25.676249 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2lqj8" event={"ID":"9b94d91f-e0bc-46e0-954b-6e3e521b7aa4","Type":"ContainerDied","Data":"228b3c999a820006b90c38e64c1e4cf1d0feed8e709763f5d5051770d34eb5f6"}
Nov 22 03:20:25 crc kubenswrapper[4922]: I1122 03:20:25.759539 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff6f8353-795d-4cc5-912e-99a8d2c11ace-catalog-content\") pod \"redhat-marketplace-76fmx\" (UID: \"ff6f8353-795d-4cc5-912e-99a8d2c11ace\") " pod="openshift-marketplace/redhat-marketplace-76fmx"
Nov 22 03:20:25 crc kubenswrapper[4922]: I1122 03:20:25.759613 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj6m7\" (UniqueName: \"kubernetes.io/projected/ff6f8353-795d-4cc5-912e-99a8d2c11ace-kube-api-access-sj6m7\") pod \"redhat-marketplace-76fmx\" (UID: \"ff6f8353-795d-4cc5-912e-99a8d2c11ace\") " pod="openshift-marketplace/redhat-marketplace-76fmx"
Nov 22 03:20:25 crc kubenswrapper[4922]: I1122 03:20:25.759659 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff6f8353-795d-4cc5-912e-99a8d2c11ace-utilities\") pod \"redhat-marketplace-76fmx\" (UID: \"ff6f8353-795d-4cc5-912e-99a8d2c11ace\") " pod="openshift-marketplace/redhat-marketplace-76fmx"
Nov 22 03:20:25 crc kubenswrapper[4922]: I1122 03:20:25.760186 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff6f8353-795d-4cc5-912e-99a8d2c11ace-utilities\") pod \"redhat-marketplace-76fmx\" (UID: \"ff6f8353-795d-4cc5-912e-99a8d2c11ace\") " pod="openshift-marketplace/redhat-marketplace-76fmx"
Nov 22 03:20:25 crc kubenswrapper[4922]: I1122 03:20:25.760381 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff6f8353-795d-4cc5-912e-99a8d2c11ace-catalog-content\") pod \"redhat-marketplace-76fmx\" (UID: \"ff6f8353-795d-4cc5-912e-99a8d2c11ace\") " pod="openshift-marketplace/redhat-marketplace-76fmx"
Nov 22 03:20:25 crc kubenswrapper[4922]: I1122 03:20:25.781099 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj6m7\" (UniqueName: \"kubernetes.io/projected/ff6f8353-795d-4cc5-912e-99a8d2c11ace-kube-api-access-sj6m7\") pod \"redhat-marketplace-76fmx\" (UID: \"ff6f8353-795d-4cc5-912e-99a8d2c11ace\") " pod="openshift-marketplace/redhat-marketplace-76fmx"
Nov 22 03:20:25 crc kubenswrapper[4922]: I1122 03:20:25.919195 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-76fmx"
Nov 22 03:20:26 crc kubenswrapper[4922]: I1122 03:20:26.381403 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-76fmx"]
Nov 22 03:20:26 crc kubenswrapper[4922]: W1122 03:20:26.393370 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff6f8353_795d_4cc5_912e_99a8d2c11ace.slice/crio-8d63b9cf43714838704c0c7b0103ebeafbb60ff92da0b5c0926c0f094d41b3c8 WatchSource:0}: Error finding container 8d63b9cf43714838704c0c7b0103ebeafbb60ff92da0b5c0926c0f094d41b3c8: Status 404 returned error can't find the container with id 8d63b9cf43714838704c0c7b0103ebeafbb60ff92da0b5c0926c0f094d41b3c8
Nov 22 03:20:26 crc kubenswrapper[4922]: I1122 03:20:26.686216 4922 generic.go:334] "Generic (PLEG): container finished" podID="ff6f8353-795d-4cc5-912e-99a8d2c11ace" containerID="70fd55c02b75bebc8d69d75ae33750612398729227bbad5260964c5cdf445aca" exitCode=0
Nov 22 03:20:26 crc kubenswrapper[4922]: I1122 03:20:26.686273 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-76fmx" event={"ID":"ff6f8353-795d-4cc5-912e-99a8d2c11ace","Type":"ContainerDied","Data":"70fd55c02b75bebc8d69d75ae33750612398729227bbad5260964c5cdf445aca"}
Nov 22 03:20:26 crc kubenswrapper[4922]: I1122 03:20:26.686557 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-76fmx" event={"ID":"ff6f8353-795d-4cc5-912e-99a8d2c11ace","Type":"ContainerStarted","Data":"8d63b9cf43714838704c0c7b0103ebeafbb60ff92da0b5c0926c0f094d41b3c8"}
Nov 22 03:20:26 crc kubenswrapper[4922]: I1122 03:20:26.692105 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2lqj8" event={"ID":"9b94d91f-e0bc-46e0-954b-6e3e521b7aa4","Type":"ContainerStarted","Data":"21f258264de6dfb0b3499fc15f916ca043316dcc9486d5ac8795fd4c7a2ec039"}
Nov 22 03:20:26 crc kubenswrapper[4922]: I1122 03:20:26.743489 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2lqj8" podStartSLOduration=3.017579247 podStartE2EDuration="5.743463865s" podCreationTimestamp="2025-11-22 03:20:21 +0000 UTC" firstStartedPulling="2025-11-22 03:20:23.656139179 +0000 UTC m=+1659.694661101" lastFinishedPulling="2025-11-22 03:20:26.382023827 +0000 UTC m=+1662.420545719" observedRunningTime="2025-11-22 03:20:26.737335468 +0000 UTC m=+1662.775857360" watchObservedRunningTime="2025-11-22 03:20:26.743463865 +0000 UTC m=+1662.781985797"
Nov 22 03:20:28 crc kubenswrapper[4922]: I1122 03:20:28.720283 4922 generic.go:334] "Generic (PLEG): container finished" podID="ff6f8353-795d-4cc5-912e-99a8d2c11ace" containerID="2db2010d4b5d47ad0aab81b25f5fd2be45016632a85239d35e14ece7fbf9393c" exitCode=0
Nov 22 03:20:28 crc kubenswrapper[4922]: I1122 03:20:28.720340 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-76fmx" event={"ID":"ff6f8353-795d-4cc5-912e-99a8d2c11ace","Type":"ContainerDied","Data":"2db2010d4b5d47ad0aab81b25f5fd2be45016632a85239d35e14ece7fbf9393c"}
Nov 22 03:20:29 crc kubenswrapper[4922]: I1122 03:20:29.300491 4922 scope.go:117] "RemoveContainer" containerID="80e63ed196d529fe9e5c2ffb57fde09ab72342822511a120b7e3fb0cefdec1eb"
Nov 22 03:20:29 crc kubenswrapper[4922]: E1122 03:20:29.300773 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d"
Nov 22 03:20:29 crc kubenswrapper[4922]: I1122 03:20:29.731508 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-76fmx" event={"ID":"ff6f8353-795d-4cc5-912e-99a8d2c11ace","Type":"ContainerStarted","Data":"ad3251f7e1b56a3da72b66b8644de99f20909a596d729e20e62a517417742025"}
Nov 22 03:20:29 crc kubenswrapper[4922]: I1122 03:20:29.764983 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-76fmx" podStartSLOduration=2.133034776 podStartE2EDuration="4.764959154s" podCreationTimestamp="2025-11-22 03:20:25 +0000 UTC" firstStartedPulling="2025-11-22 03:20:26.688538949 +0000 UTC m=+1662.727060841" lastFinishedPulling="2025-11-22 03:20:29.320463317 +0000 UTC m=+1665.358985219" observedRunningTime="2025-11-22 03:20:29.754058283 +0000 UTC m=+1665.792580205" watchObservedRunningTime="2025-11-22 03:20:29.764959154 +0000 UTC m=+1665.803481066"
Nov 22 03:20:32 crc kubenswrapper[4922]: I1122 03:20:32.171603 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2lqj8"
Nov 22 03:20:32 crc kubenswrapper[4922]: I1122 03:20:32.171934 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2lqj8"
Nov 22 03:20:32 crc kubenswrapper[4922]: I1122 03:20:32.219592 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2lqj8"
Nov 22 03:20:32 crc kubenswrapper[4922]: I1122 03:20:32.816268 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2lqj8"
Nov 22 03:20:33 crc kubenswrapper[4922]: I1122 03:20:33.396370 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2lqj8"]
Nov 22 03:20:34 crc kubenswrapper[4922]: I1122 03:20:34.793905 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2lqj8" podUID="9b94d91f-e0bc-46e0-954b-6e3e521b7aa4" containerName="registry-server" containerID="cri-o://21f258264de6dfb0b3499fc15f916ca043316dcc9486d5ac8795fd4c7a2ec039" gracePeriod=2
Nov 22 03:20:35 crc kubenswrapper[4922]: I1122 03:20:35.802963 4922 generic.go:334] "Generic (PLEG): container finished" podID="9b94d91f-e0bc-46e0-954b-6e3e521b7aa4" containerID="21f258264de6dfb0b3499fc15f916ca043316dcc9486d5ac8795fd4c7a2ec039" exitCode=0
Nov 22 03:20:35 crc kubenswrapper[4922]: I1122 03:20:35.803415 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2lqj8" event={"ID":"9b94d91f-e0bc-46e0-954b-6e3e521b7aa4","Type":"ContainerDied","Data":"21f258264de6dfb0b3499fc15f916ca043316dcc9486d5ac8795fd4c7a2ec039"}
Nov 22 03:20:36 crc kubenswrapper[4922]: I1122 03:20:36.463730 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-76fmx"
Nov 22 03:20:36 crc kubenswrapper[4922]: I1122 03:20:36.464447 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-76fmx"
Nov 22 03:20:36 crc kubenswrapper[4922]: I1122 03:20:36.528394 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-g8fzk"]
Nov 22 03:20:36 crc kubenswrapper[4922]: I1122 03:20:36.545008 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-g8fzk"]
Nov 22 03:20:36 crc kubenswrapper[4922]: I1122 03:20:36.549542 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-76fmx"
Nov 22 03:20:36 crc kubenswrapper[4922]: I1122 03:20:36.666231 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2lqj8"
Nov 22 03:20:36 crc kubenswrapper[4922]: I1122 03:20:36.764966 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b94d91f-e0bc-46e0-954b-6e3e521b7aa4-utilities\") pod \"9b94d91f-e0bc-46e0-954b-6e3e521b7aa4\" (UID: \"9b94d91f-e0bc-46e0-954b-6e3e521b7aa4\") "
Nov 22 03:20:36 crc kubenswrapper[4922]: I1122 03:20:36.765561 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b94d91f-e0bc-46e0-954b-6e3e521b7aa4-catalog-content\") pod \"9b94d91f-e0bc-46e0-954b-6e3e521b7aa4\" (UID: \"9b94d91f-e0bc-46e0-954b-6e3e521b7aa4\") "
Nov 22 03:20:36 crc kubenswrapper[4922]: I1122 03:20:36.765683 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b94d91f-e0bc-46e0-954b-6e3e521b7aa4-utilities" (OuterVolumeSpecName: "utilities") pod "9b94d91f-e0bc-46e0-954b-6e3e521b7aa4" (UID: "9b94d91f-e0bc-46e0-954b-6e3e521b7aa4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 03:20:36 crc kubenswrapper[4922]: I1122 03:20:36.765984 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrwq9\" (UniqueName: \"kubernetes.io/projected/9b94d91f-e0bc-46e0-954b-6e3e521b7aa4-kube-api-access-hrwq9\") pod \"9b94d91f-e0bc-46e0-954b-6e3e521b7aa4\" (UID: \"9b94d91f-e0bc-46e0-954b-6e3e521b7aa4\") "
Nov 22 03:20:36 crc kubenswrapper[4922]: I1122 03:20:36.766993 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b94d91f-e0bc-46e0-954b-6e3e521b7aa4-utilities\") on node \"crc\" DevicePath \"\""
Nov 22 03:20:36 crc kubenswrapper[4922]: I1122 03:20:36.771516 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b94d91f-e0bc-46e0-954b-6e3e521b7aa4-kube-api-access-hrwq9" (OuterVolumeSpecName: "kube-api-access-hrwq9") pod "9b94d91f-e0bc-46e0-954b-6e3e521b7aa4" (UID: "9b94d91f-e0bc-46e0-954b-6e3e521b7aa4"). InnerVolumeSpecName "kube-api-access-hrwq9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 03:20:36 crc kubenswrapper[4922]: I1122 03:20:36.813808 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2lqj8" event={"ID":"9b94d91f-e0bc-46e0-954b-6e3e521b7aa4","Type":"ContainerDied","Data":"825e1aa671aadc1e125eee08be7417d969c75d7c4cad6c86c84f93d5a689ec49"}
Nov 22 03:20:36 crc kubenswrapper[4922]: I1122 03:20:36.813945 4922 scope.go:117] "RemoveContainer" containerID="21f258264de6dfb0b3499fc15f916ca043316dcc9486d5ac8795fd4c7a2ec039"
Nov 22 03:20:36 crc kubenswrapper[4922]: I1122 03:20:36.814945 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2lqj8"
Nov 22 03:20:36 crc kubenswrapper[4922]: I1122 03:20:36.851137 4922 scope.go:117] "RemoveContainer" containerID="228b3c999a820006b90c38e64c1e4cf1d0feed8e709763f5d5051770d34eb5f6"
Nov 22 03:20:36 crc kubenswrapper[4922]: I1122 03:20:36.865663 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-76fmx"
Nov 22 03:20:36 crc kubenswrapper[4922]: I1122 03:20:36.869344 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrwq9\" (UniqueName: \"kubernetes.io/projected/9b94d91f-e0bc-46e0-954b-6e3e521b7aa4-kube-api-access-hrwq9\") on node \"crc\" DevicePath \"\""
Nov 22 03:20:36 crc kubenswrapper[4922]: I1122 03:20:36.878934 4922 scope.go:117] "RemoveContainer" containerID="c2452f784878c93b1e2a4690cf46c50737ec358962e4542cc7d889f72ed473bf"
Nov 22 03:20:37 crc kubenswrapper[4922]: I1122 03:20:37.311370 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccb0a287-d346-47ce-9f23-64d1190b5516" path="/var/lib/kubelet/pods/ccb0a287-d346-47ce-9f23-64d1190b5516/volumes"
Nov 22 03:20:37 crc kubenswrapper[4922]: I1122 03:20:37.697422 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b94d91f-e0bc-46e0-954b-6e3e521b7aa4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9b94d91f-e0bc-46e0-954b-6e3e521b7aa4" (UID: "9b94d91f-e0bc-46e0-954b-6e3e521b7aa4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 03:20:37 crc kubenswrapper[4922]: I1122 03:20:37.766621 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2lqj8"]
Nov 22 03:20:37 crc kubenswrapper[4922]: I1122 03:20:37.777446 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2lqj8"]
Nov 22 03:20:37 crc kubenswrapper[4922]: I1122 03:20:37.789191 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b94d91f-e0bc-46e0-954b-6e3e521b7aa4-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 22 03:20:39 crc kubenswrapper[4922]: I1122 03:20:39.197437 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-76fmx"]
Nov 22 03:20:39 crc kubenswrapper[4922]: I1122 03:20:39.320206 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b94d91f-e0bc-46e0-954b-6e3e521b7aa4" path="/var/lib/kubelet/pods/9b94d91f-e0bc-46e0-954b-6e3e521b7aa4/volumes"
Nov 22 03:20:39 crc kubenswrapper[4922]: I1122 03:20:39.852280 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-76fmx" podUID="ff6f8353-795d-4cc5-912e-99a8d2c11ace" containerName="registry-server" containerID="cri-o://ad3251f7e1b56a3da72b66b8644de99f20909a596d729e20e62a517417742025" gracePeriod=2
Nov 22 03:20:40 crc kubenswrapper[4922]: I1122 03:20:40.301162 4922 scope.go:117] "RemoveContainer" containerID="80e63ed196d529fe9e5c2ffb57fde09ab72342822511a120b7e3fb0cefdec1eb"
Nov 22 03:20:40 crc kubenswrapper[4922]: E1122 03:20:40.302074 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d"
Nov 22 03:20:40 crc kubenswrapper[4922]: I1122 03:20:40.864200 4922 generic.go:334] "Generic (PLEG): container finished" podID="ff6f8353-795d-4cc5-912e-99a8d2c11ace" containerID="ad3251f7e1b56a3da72b66b8644de99f20909a596d729e20e62a517417742025" exitCode=0
Nov 22 03:20:40 crc kubenswrapper[4922]: I1122 03:20:40.864246 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-76fmx" event={"ID":"ff6f8353-795d-4cc5-912e-99a8d2c11ace","Type":"ContainerDied","Data":"ad3251f7e1b56a3da72b66b8644de99f20909a596d729e20e62a517417742025"}
Nov 22 03:20:40 crc kubenswrapper[4922]: I1122 03:20:40.864276 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-76fmx" event={"ID":"ff6f8353-795d-4cc5-912e-99a8d2c11ace","Type":"ContainerDied","Data":"8d63b9cf43714838704c0c7b0103ebeafbb60ff92da0b5c0926c0f094d41b3c8"}
Nov 22 03:20:40 crc kubenswrapper[4922]: I1122 03:20:40.864289 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d63b9cf43714838704c0c7b0103ebeafbb60ff92da0b5c0926c0f094d41b3c8"
Nov 22 03:20:40 crc kubenswrapper[4922]: I1122 03:20:40.890450 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-76fmx"
Nov 22 03:20:41 crc kubenswrapper[4922]: I1122 03:20:41.054976 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff6f8353-795d-4cc5-912e-99a8d2c11ace-utilities\") pod \"ff6f8353-795d-4cc5-912e-99a8d2c11ace\" (UID: \"ff6f8353-795d-4cc5-912e-99a8d2c11ace\") "
Nov 22 03:20:41 crc kubenswrapper[4922]: I1122 03:20:41.055091 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff6f8353-795d-4cc5-912e-99a8d2c11ace-catalog-content\") pod \"ff6f8353-795d-4cc5-912e-99a8d2c11ace\" (UID: \"ff6f8353-795d-4cc5-912e-99a8d2c11ace\") "
Nov 22 03:20:41 crc kubenswrapper[4922]: I1122 03:20:41.055137 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sj6m7\" (UniqueName: \"kubernetes.io/projected/ff6f8353-795d-4cc5-912e-99a8d2c11ace-kube-api-access-sj6m7\") pod \"ff6f8353-795d-4cc5-912e-99a8d2c11ace\" (UID: \"ff6f8353-795d-4cc5-912e-99a8d2c11ace\") "
Nov 22 03:20:41 crc kubenswrapper[4922]: I1122 03:20:41.056289 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff6f8353-795d-4cc5-912e-99a8d2c11ace-utilities" (OuterVolumeSpecName: "utilities") pod "ff6f8353-795d-4cc5-912e-99a8d2c11ace" (UID: "ff6f8353-795d-4cc5-912e-99a8d2c11ace"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 03:20:41 crc kubenswrapper[4922]: I1122 03:20:41.064038 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff6f8353-795d-4cc5-912e-99a8d2c11ace-kube-api-access-sj6m7" (OuterVolumeSpecName: "kube-api-access-sj6m7") pod "ff6f8353-795d-4cc5-912e-99a8d2c11ace" (UID: "ff6f8353-795d-4cc5-912e-99a8d2c11ace"). InnerVolumeSpecName "kube-api-access-sj6m7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 03:20:41 crc kubenswrapper[4922]: I1122 03:20:41.076560 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff6f8353-795d-4cc5-912e-99a8d2c11ace-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ff6f8353-795d-4cc5-912e-99a8d2c11ace" (UID: "ff6f8353-795d-4cc5-912e-99a8d2c11ace"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 03:20:41 crc kubenswrapper[4922]: I1122 03:20:41.157967 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff6f8353-795d-4cc5-912e-99a8d2c11ace-utilities\") on node \"crc\" DevicePath \"\""
Nov 22 03:20:41 crc kubenswrapper[4922]: I1122 03:20:41.158015 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff6f8353-795d-4cc5-912e-99a8d2c11ace-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 22 03:20:41 crc kubenswrapper[4922]: I1122 03:20:41.158029 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sj6m7\" (UniqueName: \"kubernetes.io/projected/ff6f8353-795d-4cc5-912e-99a8d2c11ace-kube-api-access-sj6m7\") on node \"crc\" DevicePath \"\""
Nov 22 03:20:41 crc kubenswrapper[4922]: I1122 03:20:41.875384 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-76fmx"
Nov 22 03:20:41 crc kubenswrapper[4922]: I1122 03:20:41.912316 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-76fmx"]
Nov 22 03:20:41 crc kubenswrapper[4922]: I1122 03:20:41.921261 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-76fmx"]
Nov 22 03:20:43 crc kubenswrapper[4922]: I1122 03:20:43.316752 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff6f8353-795d-4cc5-912e-99a8d2c11ace" path="/var/lib/kubelet/pods/ff6f8353-795d-4cc5-912e-99a8d2c11ace/volumes"
Nov 22 03:20:48 crc kubenswrapper[4922]: I1122 03:20:48.055419 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-n6frp"]
Nov 22 03:20:48 crc kubenswrapper[4922]: I1122 03:20:48.067467 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-n6frp"]
Nov 22 03:20:48 crc kubenswrapper[4922]: I1122 03:20:48.950044 4922 generic.go:334] "Generic (PLEG): container finished" podID="18e1091b-6626-426b-b6a8-235daff0df16" containerID="740122ed3033efba9715f21854139df85e5fcb7ff4bfe4c852641343b00ee83d" exitCode=0
Nov 22 03:20:48 crc kubenswrapper[4922]: I1122 03:20:48.950170 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rjt59" event={"ID":"18e1091b-6626-426b-b6a8-235daff0df16","Type":"ContainerDied","Data":"740122ed3033efba9715f21854139df85e5fcb7ff4bfe4c852641343b00ee83d"}
Nov 22 03:20:49 crc kubenswrapper[4922]: I1122 03:20:49.323408 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06c56a5a-f992-43bb-bcd0-15f23d824242" path="/var/lib/kubelet/pods/06c56a5a-f992-43bb-bcd0-15f23d824242/volumes"
Nov 22 03:20:50 crc kubenswrapper[4922]: I1122 03:20:50.352226 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rjt59"
Nov 22 03:20:50 crc kubenswrapper[4922]: I1122 03:20:50.451810 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18e1091b-6626-426b-b6a8-235daff0df16-inventory\") pod \"18e1091b-6626-426b-b6a8-235daff0df16\" (UID: \"18e1091b-6626-426b-b6a8-235daff0df16\") "
Nov 22 03:20:50 crc kubenswrapper[4922]: I1122 03:20:50.451890 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n49p9\" (UniqueName: \"kubernetes.io/projected/18e1091b-6626-426b-b6a8-235daff0df16-kube-api-access-n49p9\") pod \"18e1091b-6626-426b-b6a8-235daff0df16\" (UID: \"18e1091b-6626-426b-b6a8-235daff0df16\") "
Nov 22 03:20:50 crc kubenswrapper[4922]: I1122 03:20:50.451936 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/18e1091b-6626-426b-b6a8-235daff0df16-ssh-key\") pod \"18e1091b-6626-426b-b6a8-235daff0df16\" (UID: \"18e1091b-6626-426b-b6a8-235daff0df16\") "
Nov 22 03:20:50 crc kubenswrapper[4922]: I1122 03:20:50.460036 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18e1091b-6626-426b-b6a8-235daff0df16-kube-api-access-n49p9" (OuterVolumeSpecName: "kube-api-access-n49p9") pod "18e1091b-6626-426b-b6a8-235daff0df16" (UID: "18e1091b-6626-426b-b6a8-235daff0df16"). InnerVolumeSpecName "kube-api-access-n49p9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 03:20:50 crc kubenswrapper[4922]: I1122 03:20:50.478125 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18e1091b-6626-426b-b6a8-235daff0df16-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "18e1091b-6626-426b-b6a8-235daff0df16" (UID: "18e1091b-6626-426b-b6a8-235daff0df16"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:20:50 crc kubenswrapper[4922]: I1122 03:20:50.479244 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18e1091b-6626-426b-b6a8-235daff0df16-inventory" (OuterVolumeSpecName: "inventory") pod "18e1091b-6626-426b-b6a8-235daff0df16" (UID: "18e1091b-6626-426b-b6a8-235daff0df16"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:20:50 crc kubenswrapper[4922]: I1122 03:20:50.553899 4922 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18e1091b-6626-426b-b6a8-235daff0df16-inventory\") on node \"crc\" DevicePath \"\""
Nov 22 03:20:50 crc kubenswrapper[4922]: I1122 03:20:50.553935 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n49p9\" (UniqueName: \"kubernetes.io/projected/18e1091b-6626-426b-b6a8-235daff0df16-kube-api-access-n49p9\") on node \"crc\" DevicePath \"\""
Nov 22 03:20:50 crc kubenswrapper[4922]: I1122 03:20:50.553949 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/18e1091b-6626-426b-b6a8-235daff0df16-ssh-key\") on node \"crc\" DevicePath \"\""
Nov 22 03:20:50 crc kubenswrapper[4922]: I1122 03:20:50.973497 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rjt59" event={"ID":"18e1091b-6626-426b-b6a8-235daff0df16","Type":"ContainerDied","Data":"3da4b2f26ba50c8083a5c2343b06f5f51f98e284fc75e914bcebde6c37a58af8"}
Nov 22 03:20:50 crc kubenswrapper[4922]: I1122 03:20:50.973550 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3da4b2f26ba50c8083a5c2343b06f5f51f98e284fc75e914bcebde6c37a58af8"
Nov 22 03:20:50 crc kubenswrapper[4922]: I1122 03:20:50.973572 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rjt59"
Nov 22 03:20:51 crc kubenswrapper[4922]: I1122 03:20:51.084432 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-b7pv4"]
Nov 22 03:20:51 crc kubenswrapper[4922]: E1122 03:20:51.084924 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff6f8353-795d-4cc5-912e-99a8d2c11ace" containerName="extract-utilities"
Nov 22 03:20:51 crc kubenswrapper[4922]: I1122 03:20:51.084952 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff6f8353-795d-4cc5-912e-99a8d2c11ace" containerName="extract-utilities"
Nov 22 03:20:51 crc kubenswrapper[4922]: E1122 03:20:51.084980 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b94d91f-e0bc-46e0-954b-6e3e521b7aa4" containerName="extract-utilities"
Nov 22 03:20:51 crc kubenswrapper[4922]: I1122 03:20:51.084991 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b94d91f-e0bc-46e0-954b-6e3e521b7aa4" containerName="extract-utilities"
Nov 22 03:20:51 crc kubenswrapper[4922]: E1122 03:20:51.085010 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b94d91f-e0bc-46e0-954b-6e3e521b7aa4" containerName="extract-content"
Nov 22 03:20:51 crc kubenswrapper[4922]: I1122 03:20:51.085021 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b94d91f-e0bc-46e0-954b-6e3e521b7aa4" containerName="extract-content"
Nov 22 03:20:51 crc kubenswrapper[4922]: E1122 03:20:51.085043 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff6f8353-795d-4cc5-912e-99a8d2c11ace" containerName="registry-server"
Nov 22 03:20:51 crc kubenswrapper[4922]: I1122 03:20:51.085054 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff6f8353-795d-4cc5-912e-99a8d2c11ace" containerName="registry-server"
Nov 22 03:20:51 crc kubenswrapper[4922]: E1122 03:20:51.085076 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b94d91f-e0bc-46e0-954b-6e3e521b7aa4" containerName="registry-server"
Nov 22 03:20:51 crc kubenswrapper[4922]: I1122 03:20:51.085087 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b94d91f-e0bc-46e0-954b-6e3e521b7aa4" containerName="registry-server"
Nov 22 03:20:51 crc kubenswrapper[4922]: E1122 03:20:51.085111 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff6f8353-795d-4cc5-912e-99a8d2c11ace" containerName="extract-content"
Nov 22 03:20:51 crc kubenswrapper[4922]: I1122 03:20:51.085121 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff6f8353-795d-4cc5-912e-99a8d2c11ace" containerName="extract-content"
Nov 22 03:20:51 crc kubenswrapper[4922]: E1122 03:20:51.085143 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18e1091b-6626-426b-b6a8-235daff0df16" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Nov 22 03:20:51 crc kubenswrapper[4922]: I1122 03:20:51.085156 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="18e1091b-6626-426b-b6a8-235daff0df16" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Nov 22 03:20:51 crc kubenswrapper[4922]: I1122 03:20:51.085427 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="18e1091b-6626-426b-b6a8-235daff0df16" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Nov 22 03:20:51 crc kubenswrapper[4922]: I1122 03:20:51.085463 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b94d91f-e0bc-46e0-954b-6e3e521b7aa4" containerName="registry-server"
Nov 22 03:20:51 crc kubenswrapper[4922]: I1122 03:20:51.085505 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff6f8353-795d-4cc5-912e-99a8d2c11ace" containerName="registry-server"
Nov 22 03:20:51 crc kubenswrapper[4922]: I1122 03:20:51.086316 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-b7pv4"
Nov 22 03:20:51 crc kubenswrapper[4922]: I1122 03:20:51.088823 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wzhgr"
Nov 22 03:20:51 crc kubenswrapper[4922]: I1122 03:20:51.088931 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Nov 22 03:20:51 crc kubenswrapper[4922]: I1122 03:20:51.090277 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Nov 22 03:20:51 crc kubenswrapper[4922]: I1122 03:20:51.091708 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Nov 22 03:20:51 crc kubenswrapper[4922]: I1122 03:20:51.106759 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-b7pv4"]
Nov 22 03:20:51 crc kubenswrapper[4922]: I1122 03:20:51.274010 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvvxh\" (UniqueName: \"kubernetes.io/projected/d5be6a9a-d168-44b9-870c-821795a56953-kube-api-access-xvvxh\") pod \"ssh-known-hosts-edpm-deployment-b7pv4\" (UID: \"d5be6a9a-d168-44b9-870c-821795a56953\") " pod="openstack/ssh-known-hosts-edpm-deployment-b7pv4"
Nov 22 03:20:51 crc kubenswrapper[4922]: I1122 03:20:51.274160 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d5be6a9a-d168-44b9-870c-821795a56953-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-b7pv4\" (UID: \"d5be6a9a-d168-44b9-870c-821795a56953\") " pod="openstack/ssh-known-hosts-edpm-deployment-b7pv4"
Nov 22 03:20:51 crc kubenswrapper[4922]: I1122 03:20:51.274220 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d5be6a9a-d168-44b9-870c-821795a56953-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-b7pv4\" (UID: \"d5be6a9a-d168-44b9-870c-821795a56953\") " pod="openstack/ssh-known-hosts-edpm-deployment-b7pv4"
Nov 22 03:20:51 crc kubenswrapper[4922]: I1122 03:20:51.376268 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d5be6a9a-d168-44b9-870c-821795a56953-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-b7pv4\" (UID: \"d5be6a9a-d168-44b9-870c-821795a56953\") " pod="openstack/ssh-known-hosts-edpm-deployment-b7pv4"
Nov 22 03:20:51 crc kubenswrapper[4922]: I1122 03:20:51.376371 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d5be6a9a-d168-44b9-870c-821795a56953-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-b7pv4\" (UID: \"d5be6a9a-d168-44b9-870c-821795a56953\") " pod="openstack/ssh-known-hosts-edpm-deployment-b7pv4"
Nov 22 03:20:51 crc kubenswrapper[4922]: I1122 03:20:51.377468 4922 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"kube-api-access-xvvxh\" (UniqueName: \"kubernetes.io/projected/d5be6a9a-d168-44b9-870c-821795a56953-kube-api-access-xvvxh\") pod \"ssh-known-hosts-edpm-deployment-b7pv4\" (UID: \"d5be6a9a-d168-44b9-870c-821795a56953\") " pod="openstack/ssh-known-hosts-edpm-deployment-b7pv4" Nov 22 03:20:51 crc kubenswrapper[4922]: I1122 03:20:51.387402 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d5be6a9a-d168-44b9-870c-821795a56953-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-b7pv4\" (UID: \"d5be6a9a-d168-44b9-870c-821795a56953\") " pod="openstack/ssh-known-hosts-edpm-deployment-b7pv4" Nov 22 03:20:51 crc kubenswrapper[4922]: I1122 03:20:51.392535 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d5be6a9a-d168-44b9-870c-821795a56953-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-b7pv4\" (UID: \"d5be6a9a-d168-44b9-870c-821795a56953\") " pod="openstack/ssh-known-hosts-edpm-deployment-b7pv4" Nov 22 03:20:51 crc kubenswrapper[4922]: I1122 03:20:51.409481 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvvxh\" (UniqueName: \"kubernetes.io/projected/d5be6a9a-d168-44b9-870c-821795a56953-kube-api-access-xvvxh\") pod \"ssh-known-hosts-edpm-deployment-b7pv4\" (UID: \"d5be6a9a-d168-44b9-870c-821795a56953\") " pod="openstack/ssh-known-hosts-edpm-deployment-b7pv4" Nov 22 03:20:51 crc kubenswrapper[4922]: I1122 03:20:51.426617 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-b7pv4" Nov 22 03:20:52 crc kubenswrapper[4922]: I1122 03:20:52.029602 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-b7pv4"] Nov 22 03:20:52 crc kubenswrapper[4922]: W1122 03:20:52.040828 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5be6a9a_d168_44b9_870c_821795a56953.slice/crio-19e2f34972add24f80643570393987cd4a7200a99332585a3f275140dabed917 WatchSource:0}: Error finding container 19e2f34972add24f80643570393987cd4a7200a99332585a3f275140dabed917: Status 404 returned error can't find the container with id 19e2f34972add24f80643570393987cd4a7200a99332585a3f275140dabed917 Nov 22 03:20:53 crc kubenswrapper[4922]: I1122 03:20:53.007110 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-b7pv4" event={"ID":"d5be6a9a-d168-44b9-870c-821795a56953","Type":"ContainerStarted","Data":"19e2f34972add24f80643570393987cd4a7200a99332585a3f275140dabed917"} Nov 22 03:20:53 crc kubenswrapper[4922]: I1122 03:20:53.046259 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-r2hv4"] Nov 22 03:20:53 crc kubenswrapper[4922]: I1122 03:20:53.059900 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-n89zb"] Nov 22 03:20:53 crc kubenswrapper[4922]: I1122 03:20:53.071743 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-8xvxh"] Nov 22 03:20:53 crc kubenswrapper[4922]: I1122 03:20:53.082942 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-8xvxh"] Nov 22 03:20:53 crc kubenswrapper[4922]: I1122 03:20:53.090477 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-api-db-create-r2hv4"] Nov 22 03:20:53 crc kubenswrapper[4922]: I1122 03:20:53.108414 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-n89zb"] Nov 22 03:20:53 crc kubenswrapper[4922]: I1122 03:20:53.319425 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48b2290b-756b-4cbb-b1c1-19f32bdd6358" path="/var/lib/kubelet/pods/48b2290b-756b-4cbb-b1c1-19f32bdd6358/volumes" Nov 22 03:20:53 crc kubenswrapper[4922]: I1122 03:20:53.320629 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d9bda50-78a2-45bf-a92a-f1085209b972" path="/var/lib/kubelet/pods/9d9bda50-78a2-45bf-a92a-f1085209b972/volumes" Nov 22 03:20:53 crc kubenswrapper[4922]: I1122 03:20:53.321560 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1f5c234-6ae0-451b-bdca-3d6ba5f3a003" path="/var/lib/kubelet/pods/f1f5c234-6ae0-451b-bdca-3d6ba5f3a003/volumes" Nov 22 03:20:55 crc kubenswrapper[4922]: I1122 03:20:55.031568 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-b7pv4" event={"ID":"d5be6a9a-d168-44b9-870c-821795a56953","Type":"ContainerStarted","Data":"013183b52921fe7a0d8f7e585f502ebf6be9c678cab1a6781678251f1d9c0c9a"} Nov 22 03:20:55 crc kubenswrapper[4922]: I1122 03:20:55.057996 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-b7pv4" podStartSLOduration=3.280582173 podStartE2EDuration="4.057979477s" podCreationTimestamp="2025-11-22 03:20:51 +0000 UTC" firstStartedPulling="2025-11-22 03:20:52.043989844 +0000 UTC m=+1688.082511746" lastFinishedPulling="2025-11-22 03:20:52.821387138 +0000 UTC m=+1688.859909050" observedRunningTime="2025-11-22 03:20:55.052330232 +0000 UTC m=+1691.090852134" watchObservedRunningTime="2025-11-22 03:20:55.057979477 +0000 UTC m=+1691.096501369" Nov 22 03:20:55 crc kubenswrapper[4922]: I1122 03:20:55.306046 4922 scope.go:117] "RemoveContainer" containerID="80e63ed196d529fe9e5c2ffb57fde09ab72342822511a120b7e3fb0cefdec1eb" Nov 22 03:20:55 crc kubenswrapper[4922]: E1122 03:20:55.306347 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:20:56 crc kubenswrapper[4922]: I1122 03:20:56.513244 4922 scope.go:117] "RemoveContainer" containerID="04042aa32ebb1743c2e167accffad0efef37dcdf37331bf99fa94d37797805f2" Nov 22 03:20:56 crc kubenswrapper[4922]: I1122 03:20:56.543201 4922 scope.go:117] "RemoveContainer" containerID="743cbaf1ce9cefac103aa8037e0e5f75e0d4e54b1c337c3acd037c4accf1f912" Nov 22 03:20:56 crc kubenswrapper[4922]: I1122 03:20:56.622151 4922 scope.go:117] "RemoveContainer" containerID="1dc2dd7a65399cf1389ecb4193671302a4e8054c71d779ca62ba14a2803d2a12" Nov 22 03:20:56 crc kubenswrapper[4922]: I1122 03:20:56.648678 4922 scope.go:117] "RemoveContainer" containerID="e3bc6067184f3efb2a84330cb75a18afe157fb2750e59786630af62bfcae138b" Nov 22 03:20:56 crc kubenswrapper[4922]: I1122 03:20:56.689097 4922 scope.go:117] "RemoveContainer" containerID="4302cfd328a00636597ce7a7e73f4d00264193f316be0e7e3972893870af3c7f" Nov 22 03:20:56 crc kubenswrapper[4922]: I1122 03:20:56.719061 4922 
scope.go:117] "RemoveContainer" containerID="0503a79cf0c11dd129c952e2c01f8d01603f051e8525895fa57b3995fef8239a" Nov 22 03:20:56 crc kubenswrapper[4922]: I1122 03:20:56.772821 4922 scope.go:117] "RemoveContainer" containerID="093fccbff9f62bce9938b23840a13dbd1808acd49654b8622bf6ceaf157a04f5" Nov 22 03:20:59 crc kubenswrapper[4922]: I1122 03:20:59.035170 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-xgvlf"] Nov 22 03:20:59 crc kubenswrapper[4922]: I1122 03:20:59.044807 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-xgvlf"] Nov 22 03:20:59 crc kubenswrapper[4922]: I1122 03:20:59.329182 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f256e75d-5ff4-4804-bbe6-058ef24fab04" path="/var/lib/kubelet/pods/f256e75d-5ff4-4804-bbe6-058ef24fab04/volumes" Nov 22 03:21:02 crc kubenswrapper[4922]: I1122 03:21:02.105906 4922 generic.go:334] "Generic (PLEG): container finished" podID="d5be6a9a-d168-44b9-870c-821795a56953" containerID="013183b52921fe7a0d8f7e585f502ebf6be9c678cab1a6781678251f1d9c0c9a" exitCode=0 Nov 22 03:21:02 crc kubenswrapper[4922]: I1122 03:21:02.105978 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-b7pv4" event={"ID":"d5be6a9a-d168-44b9-870c-821795a56953","Type":"ContainerDied","Data":"013183b52921fe7a0d8f7e585f502ebf6be9c678cab1a6781678251f1d9c0c9a"} Nov 22 03:21:03 crc kubenswrapper[4922]: I1122 03:21:03.650326 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-b7pv4" Nov 22 03:21:03 crc kubenswrapper[4922]: I1122 03:21:03.745003 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d5be6a9a-d168-44b9-870c-821795a56953-inventory-0\") pod \"d5be6a9a-d168-44b9-870c-821795a56953\" (UID: \"d5be6a9a-d168-44b9-870c-821795a56953\") " Nov 22 03:21:03 crc kubenswrapper[4922]: I1122 03:21:03.745598 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvvxh\" (UniqueName: \"kubernetes.io/projected/d5be6a9a-d168-44b9-870c-821795a56953-kube-api-access-xvvxh\") pod \"d5be6a9a-d168-44b9-870c-821795a56953\" (UID: \"d5be6a9a-d168-44b9-870c-821795a56953\") " Nov 22 03:21:03 crc kubenswrapper[4922]: I1122 03:21:03.745674 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d5be6a9a-d168-44b9-870c-821795a56953-ssh-key-openstack-edpm-ipam\") pod \"d5be6a9a-d168-44b9-870c-821795a56953\" (UID: \"d5be6a9a-d168-44b9-870c-821795a56953\") " Nov 22 03:21:03 crc kubenswrapper[4922]: I1122 03:21:03.760523 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5be6a9a-d168-44b9-870c-821795a56953-kube-api-access-xvvxh" (OuterVolumeSpecName: "kube-api-access-xvvxh") pod "d5be6a9a-d168-44b9-870c-821795a56953" (UID: "d5be6a9a-d168-44b9-870c-821795a56953"). InnerVolumeSpecName "kube-api-access-xvvxh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:21:03 crc kubenswrapper[4922]: I1122 03:21:03.793491 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5be6a9a-d168-44b9-870c-821795a56953-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d5be6a9a-d168-44b9-870c-821795a56953" (UID: "d5be6a9a-d168-44b9-870c-821795a56953"). 
InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:21:03 crc kubenswrapper[4922]: I1122 03:21:03.814800 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5be6a9a-d168-44b9-870c-821795a56953-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "d5be6a9a-d168-44b9-870c-821795a56953" (UID: "d5be6a9a-d168-44b9-870c-821795a56953"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:21:03 crc kubenswrapper[4922]: I1122 03:21:03.848194 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d5be6a9a-d168-44b9-870c-821795a56953-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Nov 22 03:21:03 crc kubenswrapper[4922]: I1122 03:21:03.848238 4922 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d5be6a9a-d168-44b9-870c-821795a56953-inventory-0\") on node \"crc\" DevicePath \"\"" Nov 22 03:21:03 crc kubenswrapper[4922]: I1122 03:21:03.848252 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvvxh\" (UniqueName: \"kubernetes.io/projected/d5be6a9a-d168-44b9-870c-821795a56953-kube-api-access-xvvxh\") on node \"crc\" DevicePath \"\"" Nov 22 03:21:04 crc kubenswrapper[4922]: I1122 03:21:04.133591 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-b7pv4" event={"ID":"d5be6a9a-d168-44b9-870c-821795a56953","Type":"ContainerDied","Data":"19e2f34972add24f80643570393987cd4a7200a99332585a3f275140dabed917"} Nov 22 03:21:04 crc kubenswrapper[4922]: I1122 03:21:04.133669 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19e2f34972add24f80643570393987cd4a7200a99332585a3f275140dabed917" Nov 22 03:21:04 crc kubenswrapper[4922]: I1122 03:21:04.133702 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-b7pv4" Nov 22 03:21:04 crc kubenswrapper[4922]: I1122 03:21:04.252290 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-fz2gr"] Nov 22 03:21:04 crc kubenswrapper[4922]: E1122 03:21:04.252781 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5be6a9a-d168-44b9-870c-821795a56953" containerName="ssh-known-hosts-edpm-deployment" Nov 22 03:21:04 crc kubenswrapper[4922]: I1122 03:21:04.252803 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5be6a9a-d168-44b9-870c-821795a56953" containerName="ssh-known-hosts-edpm-deployment" Nov 22 03:21:04 crc kubenswrapper[4922]: I1122 03:21:04.253060 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5be6a9a-d168-44b9-870c-821795a56953" containerName="ssh-known-hosts-edpm-deployment" Nov 22 03:21:04 crc kubenswrapper[4922]: I1122 03:21:04.253811 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fz2gr" Nov 22 03:21:04 crc kubenswrapper[4922]: I1122 03:21:04.256311 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 22 03:21:04 crc kubenswrapper[4922]: I1122 03:21:04.256885 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 03:21:04 crc kubenswrapper[4922]: I1122 03:21:04.256970 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 22 03:21:04 crc kubenswrapper[4922]: I1122 03:21:04.257816 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wzhgr" Nov 22 03:21:04 crc kubenswrapper[4922]: I1122 03:21:04.261173 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-fz2gr"] Nov 22 03:21:04 crc kubenswrapper[4922]: I1122 03:21:04.361791 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/02af1d5e-ef95-44c5-9c29-284d64bf6fa7-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fz2gr\" (UID: \"02af1d5e-ef95-44c5-9c29-284d64bf6fa7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fz2gr" Nov 22 03:21:04 crc kubenswrapper[4922]: I1122 03:21:04.361837 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/02af1d5e-ef95-44c5-9c29-284d64bf6fa7-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fz2gr\" (UID: \"02af1d5e-ef95-44c5-9c29-284d64bf6fa7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fz2gr" Nov 22 03:21:04 crc kubenswrapper[4922]: I1122 03:21:04.362019 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2n4j\" (UniqueName: \"kubernetes.io/projected/02af1d5e-ef95-44c5-9c29-284d64bf6fa7-kube-api-access-d2n4j\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fz2gr\" (UID: \"02af1d5e-ef95-44c5-9c29-284d64bf6fa7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fz2gr" Nov 22 03:21:04 crc kubenswrapper[4922]: I1122 03:21:04.463984 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2n4j\" (UniqueName: \"kubernetes.io/projected/02af1d5e-ef95-44c5-9c29-284d64bf6fa7-kube-api-access-d2n4j\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fz2gr\" (UID: \"02af1d5e-ef95-44c5-9c29-284d64bf6fa7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fz2gr" Nov 22 03:21:04 crc kubenswrapper[4922]: I1122 03:21:04.464215 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/02af1d5e-ef95-44c5-9c29-284d64bf6fa7-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fz2gr\" (UID: \"02af1d5e-ef95-44c5-9c29-284d64bf6fa7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fz2gr" Nov 22 03:21:04 crc kubenswrapper[4922]: I1122 03:21:04.464239 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/02af1d5e-ef95-44c5-9c29-284d64bf6fa7-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fz2gr\" (UID: \"02af1d5e-ef95-44c5-9c29-284d64bf6fa7\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fz2gr" Nov 22 03:21:04 crc kubenswrapper[4922]: I1122 03:21:04.468569 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/02af1d5e-ef95-44c5-9c29-284d64bf6fa7-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fz2gr\" (UID: \"02af1d5e-ef95-44c5-9c29-284d64bf6fa7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fz2gr" Nov 22 03:21:04 crc kubenswrapper[4922]: I1122 03:21:04.470962 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/02af1d5e-ef95-44c5-9c29-284d64bf6fa7-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fz2gr\" (UID: \"02af1d5e-ef95-44c5-9c29-284d64bf6fa7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fz2gr" Nov 22 03:21:04 crc kubenswrapper[4922]: I1122 03:21:04.489960 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2n4j\" (UniqueName: \"kubernetes.io/projected/02af1d5e-ef95-44c5-9c29-284d64bf6fa7-kube-api-access-d2n4j\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fz2gr\" (UID: \"02af1d5e-ef95-44c5-9c29-284d64bf6fa7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fz2gr" Nov 22 03:21:04 crc kubenswrapper[4922]: I1122 03:21:04.580522 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fz2gr" Nov 22 03:21:05 crc kubenswrapper[4922]: I1122 03:21:05.183274 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-fz2gr"] Nov 22 03:21:06 crc kubenswrapper[4922]: I1122 03:21:06.156167 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fz2gr" event={"ID":"02af1d5e-ef95-44c5-9c29-284d64bf6fa7","Type":"ContainerStarted","Data":"9c8a49702b75805572ccfe989ee2d27f32c3db1babfff320599b05a834832450"} Nov 22 03:21:06 crc kubenswrapper[4922]: I1122 03:21:06.156523 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fz2gr" event={"ID":"02af1d5e-ef95-44c5-9c29-284d64bf6fa7","Type":"ContainerStarted","Data":"62e42df53eebc80fb9f5e4219b99e15452c927ba3681ed29389010d1629ba2b4"} Nov 22 03:21:06 crc kubenswrapper[4922]: I1122 03:21:06.181429 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fz2gr" podStartSLOduration=1.5280724879999998 podStartE2EDuration="2.181410307s" podCreationTimestamp="2025-11-22 03:21:04 +0000 UTC" firstStartedPulling="2025-11-22 03:21:05.196273851 +0000 UTC m=+1701.234795773" lastFinishedPulling="2025-11-22 03:21:05.84961166 +0000 UTC m=+1701.888133592" observedRunningTime="2025-11-22 03:21:06.179336108 +0000 UTC m=+1702.217858040" watchObservedRunningTime="2025-11-22 03:21:06.181410307 +0000 UTC m=+1702.219932209" Nov 22 03:21:09 crc kubenswrapper[4922]: I1122 03:21:09.051498 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-7347-account-create-h8cg8"] Nov 22 03:21:09 crc kubenswrapper[4922]: I1122 03:21:09.060819 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-1f35-account-create-rjd56"] Nov 22 03:21:09 crc kubenswrapper[4922]: I1122 03:21:09.072341 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-7347-account-create-h8cg8"] Nov 22 03:21:09 crc 
kubenswrapper[4922]: I1122 03:21:09.083499 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-1f35-account-create-rjd56"] Nov 22 03:21:09 crc kubenswrapper[4922]: I1122 03:21:09.318657 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="345a62f6-ef27-4695-89bf-4a0f47e9b7c5" path="/var/lib/kubelet/pods/345a62f6-ef27-4695-89bf-4a0f47e9b7c5/volumes" Nov 22 03:21:09 crc kubenswrapper[4922]: I1122 03:21:09.319419 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5a72db7-2799-420c-a519-b87d52b4c2f8" path="/var/lib/kubelet/pods/b5a72db7-2799-420c-a519-b87d52b4c2f8/volumes" Nov 22 03:21:10 crc kubenswrapper[4922]: I1122 03:21:10.040361 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-2bf9-account-create-4565f"] Nov 22 03:21:10 crc kubenswrapper[4922]: I1122 03:21:10.056761 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-2bf9-account-create-4565f"] Nov 22 03:21:10 crc kubenswrapper[4922]: I1122 03:21:10.301176 4922 scope.go:117] "RemoveContainer" containerID="80e63ed196d529fe9e5c2ffb57fde09ab72342822511a120b7e3fb0cefdec1eb" Nov 22 03:21:10 crc kubenswrapper[4922]: E1122 03:21:10.301579 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:21:11 crc kubenswrapper[4922]: I1122 03:21:11.337469 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd67a834-6ea6-4a59-b57a-00add67daa32" path="/var/lib/kubelet/pods/fd67a834-6ea6-4a59-b57a-00add67daa32/volumes" Nov 22 03:21:15 crc kubenswrapper[4922]: I1122 03:21:15.294105 4922 generic.go:334] "Generic (PLEG): container finished" podID="02af1d5e-ef95-44c5-9c29-284d64bf6fa7" containerID="9c8a49702b75805572ccfe989ee2d27f32c3db1babfff320599b05a834832450" exitCode=0 Nov 22 03:21:15 crc kubenswrapper[4922]: I1122 03:21:15.294746 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fz2gr" event={"ID":"02af1d5e-ef95-44c5-9c29-284d64bf6fa7","Type":"ContainerDied","Data":"9c8a49702b75805572ccfe989ee2d27f32c3db1babfff320599b05a834832450"} Nov 22 03:21:16 crc kubenswrapper[4922]: I1122 03:21:16.804649 4922 util.go:48] "No ready sandbox for pod can be found. 
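
Each "Cleaned up orphaned pod volumes dir" line (kubelet_volumes.go:163) records the kubelet removing /var/lib/kubelet/pods/<podUID>/volumes once no container references that pod UID any more; every deleted job pod in this section eventually produces one. To see which per-pod volume directories are still present on the node at a given moment, a sketch like this can be run directly on the host (root access is typically required to read under /var/lib/kubelet):

    // podvols.go - list per-pod volume directories still present under
    // /var/lib/kubelet, i.e. the directories the "Cleaned up orphaned pod
    // volumes dir" entries above refer to. Run on the node itself.
    package main

    import (
        "fmt"
        "path/filepath"
    )

    func main() {
        dirs, err := filepath.Glob("/var/lib/kubelet/pods/*/volumes")
        if err != nil {
            panic(err)
        }
        for _, d := range dirs {
            // The path component before "volumes" is the pod UID:
            //   /var/lib/kubelet/pods/<podUID>/volumes
            fmt.Println(filepath.Base(filepath.Dir(d)), "->", d)
        }
    }

A UID that appears here long after its pod was REMOVEd from the API is a candidate orphan the kubelet has not yet been able to clean up.
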
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fz2gr" Nov 22 03:21:16 crc kubenswrapper[4922]: I1122 03:21:16.928526 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2n4j\" (UniqueName: \"kubernetes.io/projected/02af1d5e-ef95-44c5-9c29-284d64bf6fa7-kube-api-access-d2n4j\") pod \"02af1d5e-ef95-44c5-9c29-284d64bf6fa7\" (UID: \"02af1d5e-ef95-44c5-9c29-284d64bf6fa7\") " Nov 22 03:21:16 crc kubenswrapper[4922]: I1122 03:21:16.928785 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/02af1d5e-ef95-44c5-9c29-284d64bf6fa7-inventory\") pod \"02af1d5e-ef95-44c5-9c29-284d64bf6fa7\" (UID: \"02af1d5e-ef95-44c5-9c29-284d64bf6fa7\") " Nov 22 03:21:16 crc kubenswrapper[4922]: I1122 03:21:16.928912 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/02af1d5e-ef95-44c5-9c29-284d64bf6fa7-ssh-key\") pod \"02af1d5e-ef95-44c5-9c29-284d64bf6fa7\" (UID: \"02af1d5e-ef95-44c5-9c29-284d64bf6fa7\") " Nov 22 03:21:16 crc kubenswrapper[4922]: I1122 03:21:16.933883 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02af1d5e-ef95-44c5-9c29-284d64bf6fa7-kube-api-access-d2n4j" (OuterVolumeSpecName: "kube-api-access-d2n4j") pod "02af1d5e-ef95-44c5-9c29-284d64bf6fa7" (UID: "02af1d5e-ef95-44c5-9c29-284d64bf6fa7"). InnerVolumeSpecName "kube-api-access-d2n4j". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:21:16 crc kubenswrapper[4922]: I1122 03:21:16.979943 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02af1d5e-ef95-44c5-9c29-284d64bf6fa7-inventory" (OuterVolumeSpecName: "inventory") pod "02af1d5e-ef95-44c5-9c29-284d64bf6fa7" (UID: "02af1d5e-ef95-44c5-9c29-284d64bf6fa7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:21:16 crc kubenswrapper[4922]: I1122 03:21:16.985783 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02af1d5e-ef95-44c5-9c29-284d64bf6fa7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "02af1d5e-ef95-44c5-9c29-284d64bf6fa7" (UID: "02af1d5e-ef95-44c5-9c29-284d64bf6fa7"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:21:17 crc kubenswrapper[4922]: I1122 03:21:17.042616 4922 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/02af1d5e-ef95-44c5-9c29-284d64bf6fa7-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 03:21:17 crc kubenswrapper[4922]: I1122 03:21:17.042653 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/02af1d5e-ef95-44c5-9c29-284d64bf6fa7-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 03:21:17 crc kubenswrapper[4922]: I1122 03:21:17.042666 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2n4j\" (UniqueName: \"kubernetes.io/projected/02af1d5e-ef95-44c5-9c29-284d64bf6fa7-kube-api-access-d2n4j\") on node \"crc\" DevicePath \"\"" Nov 22 03:21:17 crc kubenswrapper[4922]: I1122 03:21:17.323945 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fz2gr" event={"ID":"02af1d5e-ef95-44c5-9c29-284d64bf6fa7","Type":"ContainerDied","Data":"62e42df53eebc80fb9f5e4219b99e15452c927ba3681ed29389010d1629ba2b4"} Nov 22 03:21:17 crc kubenswrapper[4922]: I1122 03:21:17.324001 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62e42df53eebc80fb9f5e4219b99e15452c927ba3681ed29389010d1629ba2b4" Nov 22 03:21:17 crc kubenswrapper[4922]: I1122 03:21:17.324075 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fz2gr" Nov 22 03:21:17 crc kubenswrapper[4922]: I1122 03:21:17.388609 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hk27f"] Nov 22 03:21:17 crc kubenswrapper[4922]: E1122 03:21:17.389329 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02af1d5e-ef95-44c5-9c29-284d64bf6fa7" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Nov 22 03:21:17 crc kubenswrapper[4922]: I1122 03:21:17.389364 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="02af1d5e-ef95-44c5-9c29-284d64bf6fa7" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Nov 22 03:21:17 crc kubenswrapper[4922]: I1122 03:21:17.389703 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="02af1d5e-ef95-44c5-9c29-284d64bf6fa7" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Nov 22 03:21:17 crc kubenswrapper[4922]: I1122 03:21:17.390656 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hk27f" Nov 22 03:21:17 crc kubenswrapper[4922]: I1122 03:21:17.393274 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 03:21:17 crc kubenswrapper[4922]: I1122 03:21:17.393586 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wzhgr" Nov 22 03:21:17 crc kubenswrapper[4922]: I1122 03:21:17.393778 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 22 03:21:17 crc kubenswrapper[4922]: I1122 03:21:17.407928 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 22 03:21:17 crc kubenswrapper[4922]: I1122 03:21:17.429114 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hk27f"] Nov 22 03:21:17 crc kubenswrapper[4922]: I1122 03:21:17.457940 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dl89s\" (UniqueName: \"kubernetes.io/projected/c86e2611-32d0-44dd-8fcd-c4f29fa58d4f-kube-api-access-dl89s\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hk27f\" (UID: \"c86e2611-32d0-44dd-8fcd-c4f29fa58d4f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hk27f" Nov 22 03:21:17 crc kubenswrapper[4922]: I1122 03:21:17.458132 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c86e2611-32d0-44dd-8fcd-c4f29fa58d4f-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hk27f\" (UID: \"c86e2611-32d0-44dd-8fcd-c4f29fa58d4f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hk27f" Nov 22 03:21:17 crc kubenswrapper[4922]: I1122 03:21:17.458282 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c86e2611-32d0-44dd-8fcd-c4f29fa58d4f-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hk27f\" (UID: \"c86e2611-32d0-44dd-8fcd-c4f29fa58d4f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hk27f" Nov 22 03:21:17 crc kubenswrapper[4922]: I1122 03:21:17.560974 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c86e2611-32d0-44dd-8fcd-c4f29fa58d4f-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hk27f\" (UID: \"c86e2611-32d0-44dd-8fcd-c4f29fa58d4f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hk27f" Nov 22 03:21:17 crc kubenswrapper[4922]: I1122 03:21:17.561355 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dl89s\" (UniqueName: \"kubernetes.io/projected/c86e2611-32d0-44dd-8fcd-c4f29fa58d4f-kube-api-access-dl89s\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hk27f\" (UID: \"c86e2611-32d0-44dd-8fcd-c4f29fa58d4f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hk27f" Nov 22 03:21:17 crc kubenswrapper[4922]: I1122 03:21:17.561417 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c86e2611-32d0-44dd-8fcd-c4f29fa58d4f-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hk27f\" (UID: 
\"c86e2611-32d0-44dd-8fcd-c4f29fa58d4f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hk27f" Nov 22 03:21:17 crc kubenswrapper[4922]: I1122 03:21:17.568524 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c86e2611-32d0-44dd-8fcd-c4f29fa58d4f-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hk27f\" (UID: \"c86e2611-32d0-44dd-8fcd-c4f29fa58d4f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hk27f" Nov 22 03:21:17 crc kubenswrapper[4922]: I1122 03:21:17.569462 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c86e2611-32d0-44dd-8fcd-c4f29fa58d4f-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hk27f\" (UID: \"c86e2611-32d0-44dd-8fcd-c4f29fa58d4f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hk27f" Nov 22 03:21:17 crc kubenswrapper[4922]: I1122 03:21:17.580222 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dl89s\" (UniqueName: \"kubernetes.io/projected/c86e2611-32d0-44dd-8fcd-c4f29fa58d4f-kube-api-access-dl89s\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-hk27f\" (UID: \"c86e2611-32d0-44dd-8fcd-c4f29fa58d4f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hk27f" Nov 22 03:21:17 crc kubenswrapper[4922]: I1122 03:21:17.709374 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hk27f" Nov 22 03:21:18 crc kubenswrapper[4922]: I1122 03:21:18.330538 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hk27f"] Nov 22 03:21:19 crc kubenswrapper[4922]: I1122 03:21:19.362211 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hk27f" event={"ID":"c86e2611-32d0-44dd-8fcd-c4f29fa58d4f","Type":"ContainerStarted","Data":"455888457e17975f9ca76f91226a3a1ac328603fc70a02df3a36907917926db3"} Nov 22 03:21:20 crc kubenswrapper[4922]: I1122 03:21:20.377495 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hk27f" event={"ID":"c86e2611-32d0-44dd-8fcd-c4f29fa58d4f","Type":"ContainerStarted","Data":"c92d4dd6b83d33abee218b32ceb05f0c7d9bee4a9f205131d40f95fe7ecc7cb9"} Nov 22 03:21:21 crc kubenswrapper[4922]: I1122 03:21:21.414356 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hk27f" podStartSLOduration=3.053030645 podStartE2EDuration="4.414326702s" podCreationTimestamp="2025-11-22 03:21:17 +0000 UTC" firstStartedPulling="2025-11-22 03:21:18.355631646 +0000 UTC m=+1714.394153538" lastFinishedPulling="2025-11-22 03:21:19.716927643 +0000 UTC m=+1715.755449595" observedRunningTime="2025-11-22 03:21:21.403904592 +0000 UTC m=+1717.442426494" watchObservedRunningTime="2025-11-22 03:21:21.414326702 +0000 UTC m=+1717.452848634" Nov 22 03:21:24 crc kubenswrapper[4922]: I1122 03:21:24.300961 4922 scope.go:117] "RemoveContainer" containerID="80e63ed196d529fe9e5c2ffb57fde09ab72342822511a120b7e3fb0cefdec1eb" Nov 22 03:21:24 crc kubenswrapper[4922]: E1122 03:21:24.302200 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:21:30 crc kubenswrapper[4922]: I1122 03:21:30.521586 4922 generic.go:334] "Generic (PLEG): container finished" podID="c86e2611-32d0-44dd-8fcd-c4f29fa58d4f" containerID="c92d4dd6b83d33abee218b32ceb05f0c7d9bee4a9f205131d40f95fe7ecc7cb9" exitCode=0 Nov 22 03:21:30 crc kubenswrapper[4922]: I1122 03:21:30.521696 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hk27f" event={"ID":"c86e2611-32d0-44dd-8fcd-c4f29fa58d4f","Type":"ContainerDied","Data":"c92d4dd6b83d33abee218b32ceb05f0c7d9bee4a9f205131d40f95fe7ecc7cb9"} Nov 22 03:21:31 crc kubenswrapper[4922]: I1122 03:21:31.981351 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hk27f" Nov 22 03:21:32 crc kubenswrapper[4922]: I1122 03:21:32.056394 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c86e2611-32d0-44dd-8fcd-c4f29fa58d4f-ssh-key\") pod \"c86e2611-32d0-44dd-8fcd-c4f29fa58d4f\" (UID: \"c86e2611-32d0-44dd-8fcd-c4f29fa58d4f\") " Nov 22 03:21:32 crc kubenswrapper[4922]: I1122 03:21:32.056567 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dl89s\" (UniqueName: \"kubernetes.io/projected/c86e2611-32d0-44dd-8fcd-c4f29fa58d4f-kube-api-access-dl89s\") pod \"c86e2611-32d0-44dd-8fcd-c4f29fa58d4f\" (UID: \"c86e2611-32d0-44dd-8fcd-c4f29fa58d4f\") " Nov 22 03:21:32 crc kubenswrapper[4922]: I1122 03:21:32.056761 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c86e2611-32d0-44dd-8fcd-c4f29fa58d4f-inventory\") pod \"c86e2611-32d0-44dd-8fcd-c4f29fa58d4f\" (UID: \"c86e2611-32d0-44dd-8fcd-c4f29fa58d4f\") " Nov 22 03:21:32 crc kubenswrapper[4922]: I1122 03:21:32.065280 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c86e2611-32d0-44dd-8fcd-c4f29fa58d4f-kube-api-access-dl89s" (OuterVolumeSpecName: "kube-api-access-dl89s") pod "c86e2611-32d0-44dd-8fcd-c4f29fa58d4f" (UID: "c86e2611-32d0-44dd-8fcd-c4f29fa58d4f"). InnerVolumeSpecName "kube-api-access-dl89s". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:21:32 crc kubenswrapper[4922]: I1122 03:21:32.088418 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c86e2611-32d0-44dd-8fcd-c4f29fa58d4f-inventory" (OuterVolumeSpecName: "inventory") pod "c86e2611-32d0-44dd-8fcd-c4f29fa58d4f" (UID: "c86e2611-32d0-44dd-8fcd-c4f29fa58d4f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:21:32 crc kubenswrapper[4922]: I1122 03:21:32.103612 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c86e2611-32d0-44dd-8fcd-c4f29fa58d4f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c86e2611-32d0-44dd-8fcd-c4f29fa58d4f" (UID: "c86e2611-32d0-44dd-8fcd-c4f29fa58d4f"). InnerVolumeSpecName "ssh-key". 
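
The recurring "back-off 5m0s restarting failed container" errors for machine-config-daemon above are the kubelet's crash-loop backoff at its ceiling: per the documented behavior, the restart delay starts at 10s, doubles on each crash, is capped at 5m, and resets after the container runs cleanly for 10 minutes. A sketch of that schedule; the parameters are the documented defaults, not values read from this cluster:

    // backoff.go - the CrashLoopBackOff delay schedule as documented for
    // the kubelet: 10s initial delay, doubled per restart, capped at 5m.
    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        const (
            initial = 10 * time.Second
            max     = 5 * time.Minute
        )
        delay := initial
        for restart := 1; restart <= 8; restart++ {
            fmt.Printf("restart %d: back-off %v\n", restart, delay)
            if delay *= 2; delay > max {
                delay = max // from here on the log shows "back-off 5m0s"
            }
        }
    }

Six crashes are enough to reach the cap, which is why every sync attempt for this pod in the section above reports the same 5m0s figure.
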
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:21:32 crc kubenswrapper[4922]: I1122 03:21:32.159706 4922 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c86e2611-32d0-44dd-8fcd-c4f29fa58d4f-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 03:21:32 crc kubenswrapper[4922]: I1122 03:21:32.159765 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c86e2611-32d0-44dd-8fcd-c4f29fa58d4f-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 03:21:32 crc kubenswrapper[4922]: I1122 03:21:32.159784 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dl89s\" (UniqueName: \"kubernetes.io/projected/c86e2611-32d0-44dd-8fcd-c4f29fa58d4f-kube-api-access-dl89s\") on node \"crc\" DevicePath \"\"" Nov 22 03:21:32 crc kubenswrapper[4922]: I1122 03:21:32.540728 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hk27f" event={"ID":"c86e2611-32d0-44dd-8fcd-c4f29fa58d4f","Type":"ContainerDied","Data":"455888457e17975f9ca76f91226a3a1ac328603fc70a02df3a36907917926db3"} Nov 22 03:21:32 crc kubenswrapper[4922]: I1122 03:21:32.540778 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="455888457e17975f9ca76f91226a3a1ac328603fc70a02df3a36907917926db3" Nov 22 03:21:32 crc kubenswrapper[4922]: I1122 03:21:32.540902 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hk27f" Nov 22 03:21:34 crc kubenswrapper[4922]: I1122 03:21:34.048525 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-h8fkq"] Nov 22 03:21:34 crc kubenswrapper[4922]: I1122 03:21:34.062602 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-h8fkq"] Nov 22 03:21:35 crc kubenswrapper[4922]: I1122 03:21:35.320828 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b0138cd-b295-44c3-9f0b-1d5de2c8d144" path="/var/lib/kubelet/pods/5b0138cd-b295-44c3-9f0b-1d5de2c8d144/volumes" Nov 22 03:21:37 crc kubenswrapper[4922]: I1122 03:21:37.301025 4922 scope.go:117] "RemoveContainer" containerID="80e63ed196d529fe9e5c2ffb57fde09ab72342822511a120b7e3fb0cefdec1eb" Nov 22 03:21:37 crc kubenswrapper[4922]: E1122 03:21:37.301729 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:21:52 crc kubenswrapper[4922]: I1122 03:21:52.054823 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-pcvkr"] Nov 22 03:21:52 crc kubenswrapper[4922]: I1122 03:21:52.061882 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-pcvkr"] Nov 22 03:21:52 crc kubenswrapper[4922]: I1122 03:21:52.301362 4922 scope.go:117] "RemoveContainer" containerID="80e63ed196d529fe9e5c2ffb57fde09ab72342822511a120b7e3fb0cefdec1eb" Nov 22 03:21:52 crc kubenswrapper[4922]: E1122 03:21:52.301621 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:21:53 crc kubenswrapper[4922]: I1122 03:21:53.317778 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29a6e29f-a80d-4f7d-8d32-c5a18f26da69" path="/var/lib/kubelet/pods/29a6e29f-a80d-4f7d-8d32-c5a18f26da69/volumes" Nov 22 03:21:56 crc kubenswrapper[4922]: I1122 03:21:56.923794 4922 scope.go:117] "RemoveContainer" containerID="559563baf27b35c4b2d7025184e49acec34716dab61b7c583423892a2341c9e8" Nov 22 03:21:56 crc kubenswrapper[4922]: I1122 03:21:56.966132 4922 scope.go:117] "RemoveContainer" containerID="9e02d916b38320aa828bed472018c9efb01ca7e640be52fb2798cd924c7eaefe" Nov 22 03:21:57 crc kubenswrapper[4922]: I1122 03:21:57.028751 4922 scope.go:117] "RemoveContainer" containerID="98e5df228c56af426f3fb084ade9b55926281ad7bf69f0c8360a68430e3b9a92" Nov 22 03:21:57 crc kubenswrapper[4922]: I1122 03:21:57.083182 4922 scope.go:117] "RemoveContainer" containerID="59f0c8464dddcc2b9c2e1d38c8cb1d2d0a5a31fff9e78685924efb86600644f7" Nov 22 03:21:57 crc kubenswrapper[4922]: I1122 03:21:57.134616 4922 scope.go:117] "RemoveContainer" containerID="4d9b604e74096d115ac988a940295dc538891acaadf8c4dca4ccb3e2d3e1f9c6" Nov 22 03:21:57 crc kubenswrapper[4922]: I1122 03:21:57.182295 4922 scope.go:117] "RemoveContainer" containerID="f33b53f139dbde89a912e9be4386086bd325c48d41bd800a6e12e7136709eb46" Nov 22 03:21:58 crc kubenswrapper[4922]: I1122 03:21:58.039973 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-q9xlg"] Nov 22 03:21:58 crc kubenswrapper[4922]: I1122 03:21:58.057634 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-q9xlg"] Nov 22 03:21:59 crc kubenswrapper[4922]: I1122 03:21:59.317125 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="255e57e6-0d59-4288-9070-373e6ce3d77c" path="/var/lib/kubelet/pods/255e57e6-0d59-4288-9070-373e6ce3d77c/volumes" Nov 22 03:22:04 crc kubenswrapper[4922]: I1122 03:22:04.301007 4922 scope.go:117] "RemoveContainer" containerID="80e63ed196d529fe9e5c2ffb57fde09ab72342822511a120b7e3fb0cefdec1eb" Nov 22 03:22:04 crc kubenswrapper[4922]: E1122 03:22:04.301987 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:22:15 crc kubenswrapper[4922]: I1122 03:22:15.316696 4922 scope.go:117] "RemoveContainer" containerID="80e63ed196d529fe9e5c2ffb57fde09ab72342822511a120b7e3fb0cefdec1eb" Nov 22 03:22:15 crc kubenswrapper[4922]: E1122 03:22:15.317728 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" 
podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:22:28 crc kubenswrapper[4922]: I1122 03:22:28.301031 4922 scope.go:117] "RemoveContainer" containerID="80e63ed196d529fe9e5c2ffb57fde09ab72342822511a120b7e3fb0cefdec1eb" Nov 22 03:22:28 crc kubenswrapper[4922]: E1122 03:22:28.302393 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:22:38 crc kubenswrapper[4922]: I1122 03:22:38.053334 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-vj8vq"] Nov 22 03:22:38 crc kubenswrapper[4922]: I1122 03:22:38.065130 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-vj8vq"] Nov 22 03:22:39 crc kubenswrapper[4922]: I1122 03:22:39.313164 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3c79e72-6196-4e01-b0a5-8baf2bafbb0f" path="/var/lib/kubelet/pods/a3c79e72-6196-4e01-b0a5-8baf2bafbb0f/volumes" Nov 22 03:22:41 crc kubenswrapper[4922]: I1122 03:22:41.300632 4922 scope.go:117] "RemoveContainer" containerID="80e63ed196d529fe9e5c2ffb57fde09ab72342822511a120b7e3fb0cefdec1eb" Nov 22 03:22:42 crc kubenswrapper[4922]: I1122 03:22:42.311228 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" event={"ID":"402683b1-a29f-4a79-a36c-daf6e8068d0d","Type":"ContainerStarted","Data":"867c06d58af11fe828ea1cbbaee07531aff1fe64bb6f347eacb739cad746f777"} Nov 22 03:22:57 crc kubenswrapper[4922]: I1122 03:22:57.317262 4922 scope.go:117] "RemoveContainer" containerID="3c04f5fbc0ec3d803529fc07fd9b51790a8ab8d7deebcaad32653894f1db4e27" Nov 22 03:22:57 crc kubenswrapper[4922]: I1122 03:22:57.369258 4922 scope.go:117] "RemoveContainer" containerID="a51596700e921c5bf64f9df1f6bec6eaf8aa3514a2ec071a771ec3ee7801dd91" Nov 22 03:24:41 crc kubenswrapper[4922]: I1122 03:24:41.110505 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 03:24:41 crc kubenswrapper[4922]: I1122 03:24:41.111179 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 03:25:11 crc kubenswrapper[4922]: I1122 03:25:11.110047 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 03:25:11 crc kubenswrapper[4922]: I1122 03:25:11.110667 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 03:25:41 crc kubenswrapper[4922]: I1122 03:25:41.110320 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 03:25:41 crc kubenswrapper[4922]: I1122 03:25:41.110959 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 03:25:41 crc kubenswrapper[4922]: I1122 03:25:41.111029 4922 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" Nov 22 03:25:41 crc kubenswrapper[4922]: I1122 03:25:41.112140 4922 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"867c06d58af11fe828ea1cbbaee07531aff1fe64bb6f347eacb739cad746f777"} pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 03:25:41 crc kubenswrapper[4922]: I1122 03:25:41.112245 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" containerID="cri-o://867c06d58af11fe828ea1cbbaee07531aff1fe64bb6f347eacb739cad746f777" gracePeriod=600 Nov 22 03:25:42 crc kubenswrapper[4922]: I1122 03:25:42.261472 4922 generic.go:334] "Generic (PLEG): container finished" podID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerID="867c06d58af11fe828ea1cbbaee07531aff1fe64bb6f347eacb739cad746f777" exitCode=0 Nov 22 03:25:42 crc kubenswrapper[4922]: I1122 03:25:42.261669 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" event={"ID":"402683b1-a29f-4a79-a36c-daf6e8068d0d","Type":"ContainerDied","Data":"867c06d58af11fe828ea1cbbaee07531aff1fe64bb6f347eacb739cad746f777"} Nov 22 03:25:42 crc kubenswrapper[4922]: I1122 03:25:42.264066 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" event={"ID":"402683b1-a29f-4a79-a36c-daf6e8068d0d","Type":"ContainerStarted","Data":"9a3eb4d79c66bae819bf80ebc7311991f8b0b1ea20e5bf891e55d01a3bffaa78"} Nov 22 03:25:42 crc kubenswrapper[4922]: I1122 03:25:42.264150 4922 scope.go:117] "RemoveContainer" containerID="80e63ed196d529fe9e5c2ffb57fde09ab72342822511a120b7e3fb0cefdec1eb" Nov 22 03:26:14 crc kubenswrapper[4922]: I1122 03:26:14.736049 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pv6f7"] Nov 22 03:26:14 crc kubenswrapper[4922]: I1122 03:26:14.749465 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pv6f7"] Nov 22 03:26:14 crc kubenswrapper[4922]: I1122 03:26:14.760811 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-4nd6w"] Nov 22 03:26:14 crc kubenswrapper[4922]: I1122 03:26:14.771090 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbmp5"] Nov 22 03:26:14 crc kubenswrapper[4922]: I1122 03:26:14.778769 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-4nd6w"] Nov 22 03:26:14 crc kubenswrapper[4922]: I1122 03:26:14.787422 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lzs9m"] Nov 22 03:26:14 crc kubenswrapper[4922]: I1122 03:26:14.792816 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbmp5"] Nov 22 03:26:14 crc kubenswrapper[4922]: I1122 03:26:14.798906 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-fz2gr"] Nov 22 03:26:14 crc kubenswrapper[4922]: I1122 03:26:14.804711 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lwrsk"] Nov 22 03:26:14 crc kubenswrapper[4922]: I1122 03:26:14.810093 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-lzs9m"] Nov 22 03:26:14 crc kubenswrapper[4922]: I1122 03:26:14.817903 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hk27f"] Nov 22 03:26:14 crc kubenswrapper[4922]: I1122 03:26:14.820420 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-fz2gr"] Nov 22 03:26:14 crc kubenswrapper[4922]: I1122 03:26:14.825516 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sklsz"] Nov 22 03:26:14 crc kubenswrapper[4922]: I1122 03:26:14.830469 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-b7pv4"] Nov 22 03:26:14 crc kubenswrapper[4922]: I1122 03:26:14.835303 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lwrsk"] Nov 22 03:26:14 crc kubenswrapper[4922]: I1122 03:26:14.840123 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-hk27f"] Nov 22 03:26:14 crc kubenswrapper[4922]: I1122 03:26:14.844888 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-b7pv4"] Nov 22 03:26:14 crc kubenswrapper[4922]: I1122 03:26:14.849529 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-sklsz"] Nov 22 03:26:14 crc kubenswrapper[4922]: I1122 03:26:14.857035 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rjt59"] Nov 22 03:26:14 crc kubenswrapper[4922]: I1122 03:26:14.859822 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rjt59"] Nov 22 03:26:15 crc kubenswrapper[4922]: I1122 03:26:15.313674 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01c85480-509f-45fa-b81f-2e29ba749afb" path="/var/lib/kubelet/pods/01c85480-509f-45fa-b81f-2e29ba749afb/volumes" Nov 22 03:26:15 crc kubenswrapper[4922]: I1122 
03:26:15.314489 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02af1d5e-ef95-44c5-9c29-284d64bf6fa7" path="/var/lib/kubelet/pods/02af1d5e-ef95-44c5-9c29-284d64bf6fa7/volumes" Nov 22 03:26:15 crc kubenswrapper[4922]: I1122 03:26:15.315097 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18e1091b-6626-426b-b6a8-235daff0df16" path="/var/lib/kubelet/pods/18e1091b-6626-426b-b6a8-235daff0df16/volumes" Nov 22 03:26:15 crc kubenswrapper[4922]: I1122 03:26:15.315739 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2021c745-dc35-460d-abfd-c15cab66eea7" path="/var/lib/kubelet/pods/2021c745-dc35-460d-abfd-c15cab66eea7/volumes" Nov 22 03:26:15 crc kubenswrapper[4922]: I1122 03:26:15.317012 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="203fb5ca-ff37-4321-a532-1fe2103cc82d" path="/var/lib/kubelet/pods/203fb5ca-ff37-4321-a532-1fe2103cc82d/volumes" Nov 22 03:26:15 crc kubenswrapper[4922]: I1122 03:26:15.317501 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="688b25c4-de7f-4adb-bc7e-760847d2a6e2" path="/var/lib/kubelet/pods/688b25c4-de7f-4adb-bc7e-760847d2a6e2/volumes" Nov 22 03:26:15 crc kubenswrapper[4922]: I1122 03:26:15.318015 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84c9b4a5-79e5-45a7-9de2-2dead3caddaa" path="/var/lib/kubelet/pods/84c9b4a5-79e5-45a7-9de2-2dead3caddaa/volumes" Nov 22 03:26:15 crc kubenswrapper[4922]: I1122 03:26:15.319002 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c86e2611-32d0-44dd-8fcd-c4f29fa58d4f" path="/var/lib/kubelet/pods/c86e2611-32d0-44dd-8fcd-c4f29fa58d4f/volumes" Nov 22 03:26:15 crc kubenswrapper[4922]: I1122 03:26:15.319488 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd426f86-ec43-4a7a-bde1-0430f01503f6" path="/var/lib/kubelet/pods/cd426f86-ec43-4a7a-bde1-0430f01503f6/volumes" Nov 22 03:26:15 crc kubenswrapper[4922]: I1122 03:26:15.320025 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5be6a9a-d168-44b9-870c-821795a56953" path="/var/lib/kubelet/pods/d5be6a9a-d168-44b9-870c-821795a56953/volumes" Nov 22 03:26:21 crc kubenswrapper[4922]: I1122 03:26:21.003440 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4skwt"] Nov 22 03:26:21 crc kubenswrapper[4922]: E1122 03:26:21.004569 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c86e2611-32d0-44dd-8fcd-c4f29fa58d4f" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Nov 22 03:26:21 crc kubenswrapper[4922]: I1122 03:26:21.004592 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="c86e2611-32d0-44dd-8fcd-c4f29fa58d4f" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Nov 22 03:26:21 crc kubenswrapper[4922]: I1122 03:26:21.004950 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="c86e2611-32d0-44dd-8fcd-c4f29fa58d4f" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Nov 22 03:26:21 crc kubenswrapper[4922]: I1122 03:26:21.005836 4922 util.go:30] "No sandbox for pod can be found. 
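Right after the API objects for the completed EDPM job pods are removed at 03:26:14, the kubelet_volumes.go housekeeping pass above deletes each pod's now-orphaned /var/lib/kubelet/pods/<uid>/volumes directory. A rough stdlib sketch of that kind of scan, assuming only the directory layout visible in the logged paths; it reports candidate directories instead of deleting them, and it is not the kubelet's actual implementation (which also cross-checks that the pod is no longer known to the API server).

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// Walk /var/lib/kubelet/pods and report each per-pod volumes directory
// that no longer contains any volume, mirroring the
// "Cleaned up orphaned pod volumes dir" entries above.
func main() {
	const podsRoot = "/var/lib/kubelet/pods" // layout taken from the log paths
	pods, err := os.ReadDir(podsRoot)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		return
	}
	for _, pod := range pods {
		if !pod.IsDir() {
			continue
		}
		volDir := filepath.Join(podsRoot, pod.Name(), "volumes")
		entries, err := os.ReadDir(volDir)
		if err != nil {
			continue // this pod UID has no volumes directory
		}
		if len(entries) == 0 {
			// A real cleanup would verify the pod is gone, then remove volDir.
			fmt.Printf("orphaned volumes dir: %s\n", volDir)
		}
	}
}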
Nov 22 03:26:21 crc kubenswrapper[4922]: I1122 03:26:21.003440 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4skwt"]
Nov 22 03:26:21 crc kubenswrapper[4922]: E1122 03:26:21.004569 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c86e2611-32d0-44dd-8fcd-c4f29fa58d4f" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Nov 22 03:26:21 crc kubenswrapper[4922]: I1122 03:26:21.004592 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="c86e2611-32d0-44dd-8fcd-c4f29fa58d4f" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Nov 22 03:26:21 crc kubenswrapper[4922]: I1122 03:26:21.004950 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="c86e2611-32d0-44dd-8fcd-c4f29fa58d4f" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Nov 22 03:26:21 crc kubenswrapper[4922]: I1122 03:26:21.005836 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4skwt"
Nov 22 03:26:21 crc kubenswrapper[4922]: I1122 03:26:21.009337 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wzhgr"
Nov 22 03:26:21 crc kubenswrapper[4922]: I1122 03:26:21.009448 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Nov 22 03:26:21 crc kubenswrapper[4922]: I1122 03:26:21.011925 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Nov 22 03:26:21 crc kubenswrapper[4922]: I1122 03:26:21.011990 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Nov 22 03:26:21 crc kubenswrapper[4922]: I1122 03:26:21.014175 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Nov 22 03:26:21 crc kubenswrapper[4922]: I1122 03:26:21.060930 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4skwt"]
Nov 22 03:26:21 crc kubenswrapper[4922]: I1122 03:26:21.073479 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2d2k\" (UniqueName: \"kubernetes.io/projected/a86d36d0-b5ca-4b97-89b6-3af2f942140f-kube-api-access-l2d2k\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-4skwt\" (UID: \"a86d36d0-b5ca-4b97-89b6-3af2f942140f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4skwt"
Nov 22 03:26:21 crc kubenswrapper[4922]: I1122 03:26:21.073562 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a86d36d0-b5ca-4b97-89b6-3af2f942140f-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-4skwt\" (UID: \"a86d36d0-b5ca-4b97-89b6-3af2f942140f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4skwt"
Nov 22 03:26:21 crc kubenswrapper[4922]: I1122 03:26:21.073757 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a86d36d0-b5ca-4b97-89b6-3af2f942140f-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-4skwt\" (UID: \"a86d36d0-b5ca-4b97-89b6-3af2f942140f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4skwt"
Nov 22 03:26:21 crc kubenswrapper[4922]: I1122 03:26:21.073889 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a86d36d0-b5ca-4b97-89b6-3af2f942140f-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-4skwt\" (UID: \"a86d36d0-b5ca-4b97-89b6-3af2f942140f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4skwt"
Nov 22 03:26:21 crc kubenswrapper[4922]: I1122 03:26:21.074027 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a86d36d0-b5ca-4b97-89b6-3af2f942140f-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-4skwt\" (UID: \"a86d36d0-b5ca-4b97-89b6-3af2f942140f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4skwt"
Nov 22 03:26:21 crc kubenswrapper[4922]: I1122 03:26:21.175551 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2d2k\" (UniqueName: \"kubernetes.io/projected/a86d36d0-b5ca-4b97-89b6-3af2f942140f-kube-api-access-l2d2k\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-4skwt\" (UID: \"a86d36d0-b5ca-4b97-89b6-3af2f942140f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4skwt"
Nov 22 03:26:21 crc kubenswrapper[4922]: I1122 03:26:21.175614 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a86d36d0-b5ca-4b97-89b6-3af2f942140f-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-4skwt\" (UID: \"a86d36d0-b5ca-4b97-89b6-3af2f942140f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4skwt"
Nov 22 03:26:21 crc kubenswrapper[4922]: I1122 03:26:21.175662 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a86d36d0-b5ca-4b97-89b6-3af2f942140f-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-4skwt\" (UID: \"a86d36d0-b5ca-4b97-89b6-3af2f942140f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4skwt"
Nov 22 03:26:21 crc kubenswrapper[4922]: I1122 03:26:21.175695 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a86d36d0-b5ca-4b97-89b6-3af2f942140f-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-4skwt\" (UID: \"a86d36d0-b5ca-4b97-89b6-3af2f942140f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4skwt"
Nov 22 03:26:21 crc kubenswrapper[4922]: I1122 03:26:21.175748 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a86d36d0-b5ca-4b97-89b6-3af2f942140f-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-4skwt\" (UID: \"a86d36d0-b5ca-4b97-89b6-3af2f942140f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4skwt"
Nov 22 03:26:21 crc kubenswrapper[4922]: I1122 03:26:21.182129 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a86d36d0-b5ca-4b97-89b6-3af2f942140f-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-4skwt\" (UID: \"a86d36d0-b5ca-4b97-89b6-3af2f942140f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4skwt"
Nov 22 03:26:21 crc kubenswrapper[4922]: I1122 03:26:21.183826 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a86d36d0-b5ca-4b97-89b6-3af2f942140f-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-4skwt\" (UID: \"a86d36d0-b5ca-4b97-89b6-3af2f942140f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4skwt"
Nov 22 03:26:21 crc kubenswrapper[4922]: I1122 03:26:21.189467 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a86d36d0-b5ca-4b97-89b6-3af2f942140f-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-4skwt\" (UID: \"a86d36d0-b5ca-4b97-89b6-3af2f942140f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4skwt"
Nov 22 03:26:21 crc kubenswrapper[4922]: I1122 03:26:21.190350 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a86d36d0-b5ca-4b97-89b6-3af2f942140f-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-4skwt\" (UID: \"a86d36d0-b5ca-4b97-89b6-3af2f942140f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4skwt"
Nov 22 03:26:21 crc kubenswrapper[4922]: I1122 03:26:21.200338 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2d2k\" (UniqueName: \"kubernetes.io/projected/a86d36d0-b5ca-4b97-89b6-3af2f942140f-kube-api-access-l2d2k\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-4skwt\" (UID: \"a86d36d0-b5ca-4b97-89b6-3af2f942140f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4skwt"
Nov 22 03:26:21 crc kubenswrapper[4922]: I1122 03:26:21.361854 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4skwt"
Nov 22 03:26:21 crc kubenswrapper[4922]: I1122 03:26:21.964962 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4skwt"]
Nov 22 03:26:21 crc kubenswrapper[4922]: I1122 03:26:21.974160 4922 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Nov 22 03:26:22 crc kubenswrapper[4922]: I1122 03:26:22.755148 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4skwt" event={"ID":"a86d36d0-b5ca-4b97-89b6-3af2f942140f","Type":"ContainerStarted","Data":"c73e00aff69f5ad947decc008e955c309a649245bb0bbffba48e09aafa16eacd"}
Nov 22 03:26:23 crc kubenswrapper[4922]: I1122 03:26:23.768361 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4skwt" event={"ID":"a86d36d0-b5ca-4b97-89b6-3af2f942140f","Type":"ContainerStarted","Data":"a2748e998794f0fbc62784306654b711998f05113032d12eab39bdb714b9af86"}
Nov 22 03:26:24 crc kubenswrapper[4922]: I1122 03:26:24.808757 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4skwt" podStartSLOduration=3.620209344 podStartE2EDuration="4.808730556s" podCreationTimestamp="2025-11-22 03:26:20 +0000 UTC" firstStartedPulling="2025-11-22 03:26:21.973912018 +0000 UTC m=+2018.012433910" lastFinishedPulling="2025-11-22 03:26:23.16243319 +0000 UTC m=+2019.200955122" observedRunningTime="2025-11-22 03:26:24.802554407 +0000 UTC m=+2020.841076329" watchObservedRunningTime="2025-11-22 03:26:24.808730556 +0000 UTC m=+2020.847252478"
Nov 22 03:26:34 crc kubenswrapper[4922]: I1122 03:26:34.912218 4922 generic.go:334] "Generic (PLEG): container finished" podID="a86d36d0-b5ca-4b97-89b6-3af2f942140f" containerID="a2748e998794f0fbc62784306654b711998f05113032d12eab39bdb714b9af86" exitCode=0
Nov 22 03:26:34 crc kubenswrapper[4922]: I1122 03:26:34.912357 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4skwt" event={"ID":"a86d36d0-b5ca-4b97-89b6-3af2f942140f","Type":"ContainerDied","Data":"a2748e998794f0fbc62784306654b711998f05113032d12eab39bdb714b9af86"}
Nov 22 03:26:37 crc kubenswrapper[4922]: I1122 03:26:37.610763 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4skwt"
Nov 22 03:26:37 crc kubenswrapper[4922]: I1122 03:26:37.778928 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2d2k\" (UniqueName: \"kubernetes.io/projected/a86d36d0-b5ca-4b97-89b6-3af2f942140f-kube-api-access-l2d2k\") pod \"a86d36d0-b5ca-4b97-89b6-3af2f942140f\" (UID: \"a86d36d0-b5ca-4b97-89b6-3af2f942140f\") "
Nov 22 03:26:37 crc kubenswrapper[4922]: I1122 03:26:37.778992 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a86d36d0-b5ca-4b97-89b6-3af2f942140f-repo-setup-combined-ca-bundle\") pod \"a86d36d0-b5ca-4b97-89b6-3af2f942140f\" (UID: \"a86d36d0-b5ca-4b97-89b6-3af2f942140f\") "
Nov 22 03:26:37 crc kubenswrapper[4922]: I1122 03:26:37.779010 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a86d36d0-b5ca-4b97-89b6-3af2f942140f-inventory\") pod \"a86d36d0-b5ca-4b97-89b6-3af2f942140f\" (UID: \"a86d36d0-b5ca-4b97-89b6-3af2f942140f\") "
Nov 22 03:26:37 crc kubenswrapper[4922]: I1122 03:26:37.779076 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a86d36d0-b5ca-4b97-89b6-3af2f942140f-ssh-key\") pod \"a86d36d0-b5ca-4b97-89b6-3af2f942140f\" (UID: \"a86d36d0-b5ca-4b97-89b6-3af2f942140f\") "
Nov 22 03:26:37 crc kubenswrapper[4922]: I1122 03:26:37.779232 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a86d36d0-b5ca-4b97-89b6-3af2f942140f-ceph\") pod \"a86d36d0-b5ca-4b97-89b6-3af2f942140f\" (UID: \"a86d36d0-b5ca-4b97-89b6-3af2f942140f\") "
Nov 22 03:26:37 crc kubenswrapper[4922]: I1122 03:26:37.784831 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a86d36d0-b5ca-4b97-89b6-3af2f942140f-kube-api-access-l2d2k" (OuterVolumeSpecName: "kube-api-access-l2d2k") pod "a86d36d0-b5ca-4b97-89b6-3af2f942140f" (UID: "a86d36d0-b5ca-4b97-89b6-3af2f942140f"). InnerVolumeSpecName "kube-api-access-l2d2k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 03:26:37 crc kubenswrapper[4922]: I1122 03:26:37.785167 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a86d36d0-b5ca-4b97-89b6-3af2f942140f-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "a86d36d0-b5ca-4b97-89b6-3af2f942140f" (UID: "a86d36d0-b5ca-4b97-89b6-3af2f942140f"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:26:37 crc kubenswrapper[4922]: I1122 03:26:37.785921 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a86d36d0-b5ca-4b97-89b6-3af2f942140f-ceph" (OuterVolumeSpecName: "ceph") pod "a86d36d0-b5ca-4b97-89b6-3af2f942140f" (UID: "a86d36d0-b5ca-4b97-89b6-3af2f942140f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:26:37 crc kubenswrapper[4922]: I1122 03:26:37.812448 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a86d36d0-b5ca-4b97-89b6-3af2f942140f-inventory" (OuterVolumeSpecName: "inventory") pod "a86d36d0-b5ca-4b97-89b6-3af2f942140f" (UID: "a86d36d0-b5ca-4b97-89b6-3af2f942140f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:26:37 crc kubenswrapper[4922]: I1122 03:26:37.814023 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a86d36d0-b5ca-4b97-89b6-3af2f942140f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a86d36d0-b5ca-4b97-89b6-3af2f942140f" (UID: "a86d36d0-b5ca-4b97-89b6-3af2f942140f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:26:37 crc kubenswrapper[4922]: I1122 03:26:37.882357 4922 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a86d36d0-b5ca-4b97-89b6-3af2f942140f-ceph\") on node \"crc\" DevicePath \"\""
Nov 22 03:26:37 crc kubenswrapper[4922]: I1122 03:26:37.882403 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2d2k\" (UniqueName: \"kubernetes.io/projected/a86d36d0-b5ca-4b97-89b6-3af2f942140f-kube-api-access-l2d2k\") on node \"crc\" DevicePath \"\""
Nov 22 03:26:37 crc kubenswrapper[4922]: I1122 03:26:37.882420 4922 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a86d36d0-b5ca-4b97-89b6-3af2f942140f-inventory\") on node \"crc\" DevicePath \"\""
Nov 22 03:26:37 crc kubenswrapper[4922]: I1122 03:26:37.882435 4922 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a86d36d0-b5ca-4b97-89b6-3af2f942140f-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 03:26:37 crc kubenswrapper[4922]: I1122 03:26:37.882448 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a86d36d0-b5ca-4b97-89b6-3af2f942140f-ssh-key\") on node \"crc\" DevicePath \"\""
Nov 22 03:26:38 crc kubenswrapper[4922]: I1122 03:26:38.243316 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4skwt" event={"ID":"a86d36d0-b5ca-4b97-89b6-3af2f942140f","Type":"ContainerDied","Data":"c73e00aff69f5ad947decc008e955c309a649245bb0bbffba48e09aafa16eacd"}
Nov 22 03:26:38 crc kubenswrapper[4922]: I1122 03:26:38.243715 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c73e00aff69f5ad947decc008e955c309a649245bb0bbffba48e09aafa16eacd"
Nov 22 03:26:38 crc kubenswrapper[4922]: I1122 03:26:38.243401 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-4skwt"
Nov 22 03:26:38 crc kubenswrapper[4922]: I1122 03:26:38.284641 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rlm7s"]
Nov 22 03:26:38 crc kubenswrapper[4922]: E1122 03:26:38.285456 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a86d36d0-b5ca-4b97-89b6-3af2f942140f" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Nov 22 03:26:38 crc kubenswrapper[4922]: I1122 03:26:38.285493 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="a86d36d0-b5ca-4b97-89b6-3af2f942140f" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Nov 22 03:26:38 crc kubenswrapper[4922]: I1122 03:26:38.285896 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="a86d36d0-b5ca-4b97-89b6-3af2f942140f" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Nov 22 03:26:38 crc kubenswrapper[4922]: I1122 03:26:38.291127 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rlm7s"
Nov 22 03:26:38 crc kubenswrapper[4922]: I1122 03:26:38.296708 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Nov 22 03:26:38 crc kubenswrapper[4922]: I1122 03:26:38.297038 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Nov 22 03:26:38 crc kubenswrapper[4922]: I1122 03:26:38.297651 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Nov 22 03:26:38 crc kubenswrapper[4922]: I1122 03:26:38.297790 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Nov 22 03:26:38 crc kubenswrapper[4922]: I1122 03:26:38.298910 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wzhgr"
Nov 22 03:26:38 crc kubenswrapper[4922]: I1122 03:26:38.300174 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rlm7s"]
Nov 22 03:26:38 crc kubenswrapper[4922]: I1122 03:26:38.393812 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9182213e-b8af-4a1b-96a8-ba3439d98d8a-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rlm7s\" (UID: \"9182213e-b8af-4a1b-96a8-ba3439d98d8a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rlm7s"
Nov 22 03:26:38 crc kubenswrapper[4922]: I1122 03:26:38.393955 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9182213e-b8af-4a1b-96a8-ba3439d98d8a-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rlm7s\" (UID: \"9182213e-b8af-4a1b-96a8-ba3439d98d8a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rlm7s"
Nov 22 03:26:38 crc kubenswrapper[4922]: I1122 03:26:38.394174 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9182213e-b8af-4a1b-96a8-ba3439d98d8a-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rlm7s\" (UID: \"9182213e-b8af-4a1b-96a8-ba3439d98d8a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rlm7s"
Nov 22 03:26:38 crc kubenswrapper[4922]: I1122 03:26:38.394209 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9182213e-b8af-4a1b-96a8-ba3439d98d8a-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rlm7s\" (UID: \"9182213e-b8af-4a1b-96a8-ba3439d98d8a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rlm7s"
Nov 22 03:26:38 crc kubenswrapper[4922]: I1122 03:26:38.394240 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4frh\" (UniqueName: \"kubernetes.io/projected/9182213e-b8af-4a1b-96a8-ba3439d98d8a-kube-api-access-g4frh\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rlm7s\" (UID: \"9182213e-b8af-4a1b-96a8-ba3439d98d8a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rlm7s"
Nov 22 03:26:38 crc kubenswrapper[4922]: I1122 03:26:38.496414 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9182213e-b8af-4a1b-96a8-ba3439d98d8a-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rlm7s\" (UID: \"9182213e-b8af-4a1b-96a8-ba3439d98d8a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rlm7s"
Nov 22 03:26:38 crc kubenswrapper[4922]: I1122 03:26:38.496566 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9182213e-b8af-4a1b-96a8-ba3439d98d8a-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rlm7s\" (UID: \"9182213e-b8af-4a1b-96a8-ba3439d98d8a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rlm7s"
Nov 22 03:26:38 crc kubenswrapper[4922]: I1122 03:26:38.496712 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9182213e-b8af-4a1b-96a8-ba3439d98d8a-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rlm7s\" (UID: \"9182213e-b8af-4a1b-96a8-ba3439d98d8a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rlm7s"
Nov 22 03:26:38 crc kubenswrapper[4922]: I1122 03:26:38.496751 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9182213e-b8af-4a1b-96a8-ba3439d98d8a-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rlm7s\" (UID: \"9182213e-b8af-4a1b-96a8-ba3439d98d8a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rlm7s"
Nov 22 03:26:38 crc kubenswrapper[4922]: I1122 03:26:38.496795 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4frh\" (UniqueName: \"kubernetes.io/projected/9182213e-b8af-4a1b-96a8-ba3439d98d8a-kube-api-access-g4frh\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rlm7s\" (UID: \"9182213e-b8af-4a1b-96a8-ba3439d98d8a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rlm7s"
Nov 22 03:26:38 crc kubenswrapper[4922]: I1122 03:26:38.502073 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9182213e-b8af-4a1b-96a8-ba3439d98d8a-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rlm7s\" (UID: \"9182213e-b8af-4a1b-96a8-ba3439d98d8a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rlm7s"
Nov 22 03:26:38 crc kubenswrapper[4922]: I1122 03:26:38.502722 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9182213e-b8af-4a1b-96a8-ba3439d98d8a-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rlm7s\" (UID: \"9182213e-b8af-4a1b-96a8-ba3439d98d8a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rlm7s"
Nov 22 03:26:38 crc kubenswrapper[4922]: I1122 03:26:38.506784 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9182213e-b8af-4a1b-96a8-ba3439d98d8a-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rlm7s\" (UID: \"9182213e-b8af-4a1b-96a8-ba3439d98d8a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rlm7s"
Nov 22 03:26:38 crc kubenswrapper[4922]: I1122 03:26:38.511378 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9182213e-b8af-4a1b-96a8-ba3439d98d8a-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rlm7s\" (UID: \"9182213e-b8af-4a1b-96a8-ba3439d98d8a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rlm7s"
Nov 22 03:26:38 crc kubenswrapper[4922]: I1122 03:26:38.518482 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4frh\" (UniqueName: \"kubernetes.io/projected/9182213e-b8af-4a1b-96a8-ba3439d98d8a-kube-api-access-g4frh\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rlm7s\" (UID: \"9182213e-b8af-4a1b-96a8-ba3439d98d8a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rlm7s"
Nov 22 03:26:38 crc kubenswrapper[4922]: I1122 03:26:38.613476 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rlm7s"
Nov 22 03:26:39 crc kubenswrapper[4922]: I1122 03:26:39.225298 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rlm7s"]
Nov 22 03:26:39 crc kubenswrapper[4922]: I1122 03:26:39.259915 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rlm7s" event={"ID":"9182213e-b8af-4a1b-96a8-ba3439d98d8a","Type":"ContainerStarted","Data":"0bbacdc049399085925a6aa5ba1a6815761b240229bbe06912450d2e2b166b85"}
Nov 22 03:26:41 crc kubenswrapper[4922]: I1122 03:26:41.297072 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rlm7s" event={"ID":"9182213e-b8af-4a1b-96a8-ba3439d98d8a","Type":"ContainerStarted","Data":"ad50add233b394775817075b18274d938d4fe35fcf2e4ea7772b09074b78e52a"}
Nov 22 03:26:41 crc kubenswrapper[4922]: I1122 03:26:41.324966 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rlm7s" podStartSLOduration=2.247587888 podStartE2EDuration="3.324950353s" podCreationTimestamp="2025-11-22 03:26:38 +0000 UTC" firstStartedPulling="2025-11-22 03:26:39.238625003 +0000 UTC m=+2035.277146895" lastFinishedPulling="2025-11-22 03:26:40.315987428 +0000 UTC m=+2036.354509360" observedRunningTime="2025-11-22 03:26:41.324706238 +0000 UTC m=+2037.363228200" watchObservedRunningTime="2025-11-22 03:26:41.324950353 +0000 UTC m=+2037.363472245"
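The pod_startup_latency_tracker entry above is internally consistent: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration is that figure minus the image-pull window (lastFinishedPulling minus firstStartedPulling). A small Go check of the arithmetic using the wall-clock values copied from the bootstrap entry; it reproduces the logged durations to within tens of nanoseconds of rounding (the kubelet works from the monotonic m=+... readings, which differ fractionally from the wall-clock strings).

package main

import (
	"fmt"
	"time"
)

func mustParse(s string) time.Time {
	// Layout matching the timestamps in the log entry above.
	t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	// Wall-clock values from the 03:26:41.324966 entry above.
	created := mustParse("2025-11-22 03:26:38 +0000 UTC")
	firstPull := mustParse("2025-11-22 03:26:39.238625003 +0000 UTC")
	lastPull := mustParse("2025-11-22 03:26:40.315987428 +0000 UTC")
	observed := mustParse("2025-11-22 03:26:41.324950353 +0000 UTC")

	e2e := observed.Sub(created)         // logged as podStartE2EDuration="3.324950353s"
	slo := e2e - lastPull.Sub(firstPull) // logged as podStartSLOduration=2.247587888
	fmt.Println("E2E:", e2e, "SLO (excluding image pull):", slo)
}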
containerID="e3523a0b0c3b2ae38bbd18cd6ffd4b235fb64fadcc0d23ad6c95b92f02cd7a00" Nov 22 03:27:58 crc kubenswrapper[4922]: I1122 03:27:58.293934 4922 scope.go:117] "RemoveContainer" containerID="c92d4dd6b83d33abee218b32ceb05f0c7d9bee4a9f205131d40f95fe7ecc7cb9" Nov 22 03:27:58 crc kubenswrapper[4922]: I1122 03:27:58.352096 4922 scope.go:117] "RemoveContainer" containerID="9c8a49702b75805572ccfe989ee2d27f32c3db1babfff320599b05a834832450" Nov 22 03:28:11 crc kubenswrapper[4922]: I1122 03:28:11.116552 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 03:28:11 crc kubenswrapper[4922]: I1122 03:28:11.117428 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 03:28:20 crc kubenswrapper[4922]: E1122 03:28:20.123158 4922 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9182213e_b8af_4a1b_96a8_ba3439d98d8a.slice/crio-ad50add233b394775817075b18274d938d4fe35fcf2e4ea7772b09074b78e52a.scope\": RecentStats: unable to find data in memory cache]" Nov 22 03:28:20 crc kubenswrapper[4922]: I1122 03:28:20.310002 4922 generic.go:334] "Generic (PLEG): container finished" podID="9182213e-b8af-4a1b-96a8-ba3439d98d8a" containerID="ad50add233b394775817075b18274d938d4fe35fcf2e4ea7772b09074b78e52a" exitCode=0 Nov 22 03:28:20 crc kubenswrapper[4922]: I1122 03:28:20.310144 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rlm7s" event={"ID":"9182213e-b8af-4a1b-96a8-ba3439d98d8a","Type":"ContainerDied","Data":"ad50add233b394775817075b18274d938d4fe35fcf2e4ea7772b09074b78e52a"} Nov 22 03:28:21 crc kubenswrapper[4922]: I1122 03:28:21.824291 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rlm7s" Nov 22 03:28:22 crc kubenswrapper[4922]: I1122 03:28:22.014431 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9182213e-b8af-4a1b-96a8-ba3439d98d8a-ssh-key\") pod \"9182213e-b8af-4a1b-96a8-ba3439d98d8a\" (UID: \"9182213e-b8af-4a1b-96a8-ba3439d98d8a\") " Nov 22 03:28:22 crc kubenswrapper[4922]: I1122 03:28:22.014487 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9182213e-b8af-4a1b-96a8-ba3439d98d8a-inventory\") pod \"9182213e-b8af-4a1b-96a8-ba3439d98d8a\" (UID: \"9182213e-b8af-4a1b-96a8-ba3439d98d8a\") " Nov 22 03:28:22 crc kubenswrapper[4922]: I1122 03:28:22.014569 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9182213e-b8af-4a1b-96a8-ba3439d98d8a-ceph\") pod \"9182213e-b8af-4a1b-96a8-ba3439d98d8a\" (UID: \"9182213e-b8af-4a1b-96a8-ba3439d98d8a\") " Nov 22 03:28:22 crc kubenswrapper[4922]: I1122 03:28:22.014605 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4frh\" (UniqueName: \"kubernetes.io/projected/9182213e-b8af-4a1b-96a8-ba3439d98d8a-kube-api-access-g4frh\") pod \"9182213e-b8af-4a1b-96a8-ba3439d98d8a\" (UID: \"9182213e-b8af-4a1b-96a8-ba3439d98d8a\") " Nov 22 03:28:22 crc kubenswrapper[4922]: I1122 03:28:22.014720 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9182213e-b8af-4a1b-96a8-ba3439d98d8a-bootstrap-combined-ca-bundle\") pod \"9182213e-b8af-4a1b-96a8-ba3439d98d8a\" (UID: \"9182213e-b8af-4a1b-96a8-ba3439d98d8a\") " Nov 22 03:28:22 crc kubenswrapper[4922]: I1122 03:28:22.024080 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9182213e-b8af-4a1b-96a8-ba3439d98d8a-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "9182213e-b8af-4a1b-96a8-ba3439d98d8a" (UID: "9182213e-b8af-4a1b-96a8-ba3439d98d8a"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:28:22 crc kubenswrapper[4922]: I1122 03:28:22.024206 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9182213e-b8af-4a1b-96a8-ba3439d98d8a-ceph" (OuterVolumeSpecName: "ceph") pod "9182213e-b8af-4a1b-96a8-ba3439d98d8a" (UID: "9182213e-b8af-4a1b-96a8-ba3439d98d8a"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:28:22 crc kubenswrapper[4922]: I1122 03:28:22.026244 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9182213e-b8af-4a1b-96a8-ba3439d98d8a-kube-api-access-g4frh" (OuterVolumeSpecName: "kube-api-access-g4frh") pod "9182213e-b8af-4a1b-96a8-ba3439d98d8a" (UID: "9182213e-b8af-4a1b-96a8-ba3439d98d8a"). InnerVolumeSpecName "kube-api-access-g4frh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:28:22 crc kubenswrapper[4922]: I1122 03:28:22.064302 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9182213e-b8af-4a1b-96a8-ba3439d98d8a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9182213e-b8af-4a1b-96a8-ba3439d98d8a" (UID: "9182213e-b8af-4a1b-96a8-ba3439d98d8a"). 
InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:28:22 crc kubenswrapper[4922]: I1122 03:28:22.074352 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9182213e-b8af-4a1b-96a8-ba3439d98d8a-inventory" (OuterVolumeSpecName: "inventory") pod "9182213e-b8af-4a1b-96a8-ba3439d98d8a" (UID: "9182213e-b8af-4a1b-96a8-ba3439d98d8a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:28:22 crc kubenswrapper[4922]: I1122 03:28:22.118220 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9182213e-b8af-4a1b-96a8-ba3439d98d8a-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 03:28:22 crc kubenswrapper[4922]: I1122 03:28:22.118269 4922 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9182213e-b8af-4a1b-96a8-ba3439d98d8a-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 03:28:22 crc kubenswrapper[4922]: I1122 03:28:22.118287 4922 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9182213e-b8af-4a1b-96a8-ba3439d98d8a-ceph\") on node \"crc\" DevicePath \"\"" Nov 22 03:28:22 crc kubenswrapper[4922]: I1122 03:28:22.118307 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4frh\" (UniqueName: \"kubernetes.io/projected/9182213e-b8af-4a1b-96a8-ba3439d98d8a-kube-api-access-g4frh\") on node \"crc\" DevicePath \"\"" Nov 22 03:28:22 crc kubenswrapper[4922]: I1122 03:28:22.118334 4922 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9182213e-b8af-4a1b-96a8-ba3439d98d8a-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:28:22 crc kubenswrapper[4922]: I1122 03:28:22.337950 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rlm7s" event={"ID":"9182213e-b8af-4a1b-96a8-ba3439d98d8a","Type":"ContainerDied","Data":"0bbacdc049399085925a6aa5ba1a6815761b240229bbe06912450d2e2b166b85"} Nov 22 03:28:22 crc kubenswrapper[4922]: I1122 03:28:22.338328 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0bbacdc049399085925a6aa5ba1a6815761b240229bbe06912450d2e2b166b85" Nov 22 03:28:22 crc kubenswrapper[4922]: I1122 03:28:22.338029 4922 util.go:48] "No ready sandbox for pod can be found. 
Nov 22 03:28:22 crc kubenswrapper[4922]: I1122 03:28:22.456313 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2wgdh"]
Nov 22 03:28:22 crc kubenswrapper[4922]: E1122 03:28:22.456704 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9182213e-b8af-4a1b-96a8-ba3439d98d8a" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Nov 22 03:28:22 crc kubenswrapper[4922]: I1122 03:28:22.456727 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="9182213e-b8af-4a1b-96a8-ba3439d98d8a" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Nov 22 03:28:22 crc kubenswrapper[4922]: I1122 03:28:22.457016 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="9182213e-b8af-4a1b-96a8-ba3439d98d8a" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Nov 22 03:28:22 crc kubenswrapper[4922]: I1122 03:28:22.457715 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2wgdh"
Nov 22 03:28:22 crc kubenswrapper[4922]: I1122 03:28:22.460322 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Nov 22 03:28:22 crc kubenswrapper[4922]: I1122 03:28:22.460373 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Nov 22 03:28:22 crc kubenswrapper[4922]: I1122 03:28:22.460403 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Nov 22 03:28:22 crc kubenswrapper[4922]: I1122 03:28:22.460641 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wzhgr"
Nov 22 03:28:22 crc kubenswrapper[4922]: I1122 03:28:22.461095 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Nov 22 03:28:22 crc kubenswrapper[4922]: I1122 03:28:22.479677 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2wgdh"]
Nov 22 03:28:22 crc kubenswrapper[4922]: I1122 03:28:22.628951 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/152a7129-aafa-4856-b959-18e7fb0d45e4-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2wgdh\" (UID: \"152a7129-aafa-4856-b959-18e7fb0d45e4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2wgdh"
Nov 22 03:28:22 crc kubenswrapper[4922]: I1122 03:28:22.629136 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2hxr\" (UniqueName: \"kubernetes.io/projected/152a7129-aafa-4856-b959-18e7fb0d45e4-kube-api-access-r2hxr\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2wgdh\" (UID: \"152a7129-aafa-4856-b959-18e7fb0d45e4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2wgdh"
Nov 22 03:28:22 crc kubenswrapper[4922]: I1122 03:28:22.629210 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/152a7129-aafa-4856-b959-18e7fb0d45e4-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2wgdh\" (UID: \"152a7129-aafa-4856-b959-18e7fb0d45e4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2wgdh"
Nov 22 03:28:22 crc kubenswrapper[4922]: I1122 03:28:22.629242 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/152a7129-aafa-4856-b959-18e7fb0d45e4-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2wgdh\" (UID: \"152a7129-aafa-4856-b959-18e7fb0d45e4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2wgdh"
Nov 22 03:28:22 crc kubenswrapper[4922]: I1122 03:28:22.731674 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/152a7129-aafa-4856-b959-18e7fb0d45e4-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2wgdh\" (UID: \"152a7129-aafa-4856-b959-18e7fb0d45e4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2wgdh"
Nov 22 03:28:22 crc kubenswrapper[4922]: I1122 03:28:22.731745 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/152a7129-aafa-4856-b959-18e7fb0d45e4-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2wgdh\" (UID: \"152a7129-aafa-4856-b959-18e7fb0d45e4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2wgdh"
Nov 22 03:28:22 crc kubenswrapper[4922]: I1122 03:28:22.731932 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/152a7129-aafa-4856-b959-18e7fb0d45e4-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2wgdh\" (UID: \"152a7129-aafa-4856-b959-18e7fb0d45e4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2wgdh"
Nov 22 03:28:22 crc kubenswrapper[4922]: I1122 03:28:22.732064 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2hxr\" (UniqueName: \"kubernetes.io/projected/152a7129-aafa-4856-b959-18e7fb0d45e4-kube-api-access-r2hxr\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2wgdh\" (UID: \"152a7129-aafa-4856-b959-18e7fb0d45e4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2wgdh"
Nov 22 03:28:22 crc kubenswrapper[4922]: I1122 03:28:22.737779 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/152a7129-aafa-4856-b959-18e7fb0d45e4-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2wgdh\" (UID: \"152a7129-aafa-4856-b959-18e7fb0d45e4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2wgdh"
Nov 22 03:28:22 crc kubenswrapper[4922]: I1122 03:28:22.739382 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/152a7129-aafa-4856-b959-18e7fb0d45e4-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2wgdh\" (UID: \"152a7129-aafa-4856-b959-18e7fb0d45e4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2wgdh"
Nov 22 03:28:22 crc kubenswrapper[4922]: I1122 03:28:22.746809 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/152a7129-aafa-4856-b959-18e7fb0d45e4-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2wgdh\" (UID: \"152a7129-aafa-4856-b959-18e7fb0d45e4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2wgdh"
Nov 22 03:28:22 crc kubenswrapper[4922]: I1122 03:28:22.751442 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2hxr\" (UniqueName: \"kubernetes.io/projected/152a7129-aafa-4856-b959-18e7fb0d45e4-kube-api-access-r2hxr\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2wgdh\" (UID: \"152a7129-aafa-4856-b959-18e7fb0d45e4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2wgdh"
Nov 22 03:28:22 crc kubenswrapper[4922]: I1122 03:28:22.793491 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2wgdh"
Nov 22 03:28:23 crc kubenswrapper[4922]: I1122 03:28:23.185990 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2wgdh"]
Nov 22 03:28:23 crc kubenswrapper[4922]: W1122 03:28:23.186704 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod152a7129_aafa_4856_b959_18e7fb0d45e4.slice/crio-0a31153c5d003fcb7e46435da692aa73883cfd00c4145b96a65a7eada5fc4808 WatchSource:0}: Error finding container 0a31153c5d003fcb7e46435da692aa73883cfd00c4145b96a65a7eada5fc4808: Status 404 returned error can't find the container with id 0a31153c5d003fcb7e46435da692aa73883cfd00c4145b96a65a7eada5fc4808
Nov 22 03:28:23 crc kubenswrapper[4922]: I1122 03:28:23.347520 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2wgdh" event={"ID":"152a7129-aafa-4856-b959-18e7fb0d45e4","Type":"ContainerStarted","Data":"0a31153c5d003fcb7e46435da692aa73883cfd00c4145b96a65a7eada5fc4808"}
Nov 22 03:28:24 crc kubenswrapper[4922]: I1122 03:28:24.360124 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2wgdh" event={"ID":"152a7129-aafa-4856-b959-18e7fb0d45e4","Type":"ContainerStarted","Data":"712d19b5819e26e5b751b49dc13aa71c2047dd6299dc6e0921f50879161a7eb6"}
Nov 22 03:28:24 crc kubenswrapper[4922]: I1122 03:28:24.390615 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2wgdh" podStartSLOduration=1.8948988679999998 podStartE2EDuration="2.390592335s" podCreationTimestamp="2025-11-22 03:28:22 +0000 UTC" firstStartedPulling="2025-11-22 03:28:23.188686269 +0000 UTC m=+2139.227208171" lastFinishedPulling="2025-11-22 03:28:23.684379716 +0000 UTC m=+2139.722901638" observedRunningTime="2025-11-22 03:28:24.382002568 +0000 UTC m=+2140.420524470" watchObservedRunningTime="2025-11-22 03:28:24.390592335 +0000 UTC m=+2140.429114227"
Nov 22 03:28:33 crc kubenswrapper[4922]: I1122 03:28:33.042182 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fk6kz"]
Nov 22 03:28:33 crc kubenswrapper[4922]: I1122 03:28:33.046793 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fk6kz"
Nov 22 03:28:33 crc kubenswrapper[4922]: I1122 03:28:33.053916 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01d79e84-3dcf-4285-b687-59034fc3120c-utilities\") pod \"redhat-operators-fk6kz\" (UID: \"01d79e84-3dcf-4285-b687-59034fc3120c\") " pod="openshift-marketplace/redhat-operators-fk6kz"
Nov 22 03:28:33 crc kubenswrapper[4922]: I1122 03:28:33.053982 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01d79e84-3dcf-4285-b687-59034fc3120c-catalog-content\") pod \"redhat-operators-fk6kz\" (UID: \"01d79e84-3dcf-4285-b687-59034fc3120c\") " pod="openshift-marketplace/redhat-operators-fk6kz"
Nov 22 03:28:33 crc kubenswrapper[4922]: I1122 03:28:33.054107 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9m9w\" (UniqueName: \"kubernetes.io/projected/01d79e84-3dcf-4285-b687-59034fc3120c-kube-api-access-r9m9w\") pod \"redhat-operators-fk6kz\" (UID: \"01d79e84-3dcf-4285-b687-59034fc3120c\") " pod="openshift-marketplace/redhat-operators-fk6kz"
Nov 22 03:28:33 crc kubenswrapper[4922]: I1122 03:28:33.059983 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fk6kz"]
Nov 22 03:28:33 crc kubenswrapper[4922]: I1122 03:28:33.156234 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01d79e84-3dcf-4285-b687-59034fc3120c-utilities\") pod \"redhat-operators-fk6kz\" (UID: \"01d79e84-3dcf-4285-b687-59034fc3120c\") " pod="openshift-marketplace/redhat-operators-fk6kz"
Nov 22 03:28:33 crc kubenswrapper[4922]: I1122 03:28:33.156309 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01d79e84-3dcf-4285-b687-59034fc3120c-catalog-content\") pod \"redhat-operators-fk6kz\" (UID: \"01d79e84-3dcf-4285-b687-59034fc3120c\") " pod="openshift-marketplace/redhat-operators-fk6kz"
Nov 22 03:28:33 crc kubenswrapper[4922]: I1122 03:28:33.156361 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9m9w\" (UniqueName: \"kubernetes.io/projected/01d79e84-3dcf-4285-b687-59034fc3120c-kube-api-access-r9m9w\") pod \"redhat-operators-fk6kz\" (UID: \"01d79e84-3dcf-4285-b687-59034fc3120c\") " pod="openshift-marketplace/redhat-operators-fk6kz"
Nov 22 03:28:33 crc kubenswrapper[4922]: I1122 03:28:33.157459 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01d79e84-3dcf-4285-b687-59034fc3120c-catalog-content\") pod \"redhat-operators-fk6kz\" (UID: \"01d79e84-3dcf-4285-b687-59034fc3120c\") " pod="openshift-marketplace/redhat-operators-fk6kz"
Nov 22 03:28:33 crc kubenswrapper[4922]: I1122 03:28:33.157773 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01d79e84-3dcf-4285-b687-59034fc3120c-utilities\") pod \"redhat-operators-fk6kz\" (UID: \"01d79e84-3dcf-4285-b687-59034fc3120c\") " pod="openshift-marketplace/redhat-operators-fk6kz"
Nov 22 03:28:33 crc kubenswrapper[4922]: I1122 03:28:33.189122 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9m9w\" (UniqueName: \"kubernetes.io/projected/01d79e84-3dcf-4285-b687-59034fc3120c-kube-api-access-r9m9w\") pod \"redhat-operators-fk6kz\" (UID: \"01d79e84-3dcf-4285-b687-59034fc3120c\") " pod="openshift-marketplace/redhat-operators-fk6kz"
Nov 22 03:28:33 crc kubenswrapper[4922]: I1122 03:28:33.400438 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fk6kz"
Nov 22 03:28:33 crc kubenswrapper[4922]: I1122 03:28:33.877614 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fk6kz"]
Nov 22 03:28:34 crc kubenswrapper[4922]: I1122 03:28:34.470955 4922 generic.go:334] "Generic (PLEG): container finished" podID="01d79e84-3dcf-4285-b687-59034fc3120c" containerID="d32c3e4b40da00b1eefec83d4c0258bcb129b854dba5fe6c23c20e910f0849d7" exitCode=0
Nov 22 03:28:34 crc kubenswrapper[4922]: I1122 03:28:34.471004 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fk6kz" event={"ID":"01d79e84-3dcf-4285-b687-59034fc3120c","Type":"ContainerDied","Data":"d32c3e4b40da00b1eefec83d4c0258bcb129b854dba5fe6c23c20e910f0849d7"}
Nov 22 03:28:34 crc kubenswrapper[4922]: I1122 03:28:34.471293 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fk6kz" event={"ID":"01d79e84-3dcf-4285-b687-59034fc3120c","Type":"ContainerStarted","Data":"859cf472aa35b2eec5d4cb6e7a2ab325781178a06377be03bd7937968228e615"}
Nov 22 03:28:36 crc kubenswrapper[4922]: I1122 03:28:36.512108 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fk6kz" event={"ID":"01d79e84-3dcf-4285-b687-59034fc3120c","Type":"ContainerStarted","Data":"c3ded72987fe5c827589b05bd6f840adead4a196ef3b0e82281596c03611f8e4"}
Nov 22 03:28:39 crc kubenswrapper[4922]: I1122 03:28:39.553033 4922 generic.go:334] "Generic (PLEG): container finished" podID="01d79e84-3dcf-4285-b687-59034fc3120c" containerID="c3ded72987fe5c827589b05bd6f840adead4a196ef3b0e82281596c03611f8e4" exitCode=0
Nov 22 03:28:39 crc kubenswrapper[4922]: I1122 03:28:39.553099 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fk6kz" event={"ID":"01d79e84-3dcf-4285-b687-59034fc3120c","Type":"ContainerDied","Data":"c3ded72987fe5c827589b05bd6f840adead4a196ef3b0e82281596c03611f8e4"}
Nov 22 03:28:41 crc kubenswrapper[4922]: I1122 03:28:41.109656 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 03:28:41 crc kubenswrapper[4922]: I1122 03:28:41.110214 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 03:28:41 crc kubenswrapper[4922]: I1122 03:28:41.575059 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fk6kz" event={"ID":"01d79e84-3dcf-4285-b687-59034fc3120c","Type":"ContainerStarted","Data":"2099b0b14053b08a81d1db6184772f4f1d7a2ed39cbb29db68f05dc4c730db41"}
Nov 22 03:28:41 crc kubenswrapper[4922]: I1122 03:28:41.612909 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fk6kz" podStartSLOduration=2.545496987 podStartE2EDuration="8.612880669s" podCreationTimestamp="2025-11-22 03:28:33 +0000 UTC" firstStartedPulling="2025-11-22 03:28:34.473605004 +0000 UTC m=+2150.512126906" lastFinishedPulling="2025-11-22 03:28:40.540988656 +0000 UTC m=+2156.579510588" observedRunningTime="2025-11-22 03:28:41.604090587 +0000 UTC m=+2157.642612499" watchObservedRunningTime="2025-11-22 03:28:41.612880669 +0000 UTC m=+2157.651402571"
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fk6kz" podStartSLOduration=2.545496987 podStartE2EDuration="8.612880669s" podCreationTimestamp="2025-11-22 03:28:33 +0000 UTC" firstStartedPulling="2025-11-22 03:28:34.473605004 +0000 UTC m=+2150.512126906" lastFinishedPulling="2025-11-22 03:28:40.540988656 +0000 UTC m=+2156.579510588" observedRunningTime="2025-11-22 03:28:41.604090587 +0000 UTC m=+2157.642612499" watchObservedRunningTime="2025-11-22 03:28:41.612880669 +0000 UTC m=+2157.651402571" Nov 22 03:28:43 crc kubenswrapper[4922]: I1122 03:28:43.400807 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fk6kz" Nov 22 03:28:43 crc kubenswrapper[4922]: I1122 03:28:43.401297 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fk6kz" Nov 22 03:28:44 crc kubenswrapper[4922]: I1122 03:28:44.460768 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fk6kz" podUID="01d79e84-3dcf-4285-b687-59034fc3120c" containerName="registry-server" probeResult="failure" output=< Nov 22 03:28:44 crc kubenswrapper[4922]: timeout: failed to connect service ":50051" within 1s Nov 22 03:28:44 crc kubenswrapper[4922]: > Nov 22 03:28:51 crc kubenswrapper[4922]: I1122 03:28:51.694104 4922 generic.go:334] "Generic (PLEG): container finished" podID="152a7129-aafa-4856-b959-18e7fb0d45e4" containerID="712d19b5819e26e5b751b49dc13aa71c2047dd6299dc6e0921f50879161a7eb6" exitCode=0 Nov 22 03:28:51 crc kubenswrapper[4922]: I1122 03:28:51.695155 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2wgdh" event={"ID":"152a7129-aafa-4856-b959-18e7fb0d45e4","Type":"ContainerDied","Data":"712d19b5819e26e5b751b49dc13aa71c2047dd6299dc6e0921f50879161a7eb6"} Nov 22 03:28:53 crc kubenswrapper[4922]: I1122 03:28:53.176877 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2wgdh" Nov 22 03:28:53 crc kubenswrapper[4922]: I1122 03:28:53.223393 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/152a7129-aafa-4856-b959-18e7fb0d45e4-inventory\") pod \"152a7129-aafa-4856-b959-18e7fb0d45e4\" (UID: \"152a7129-aafa-4856-b959-18e7fb0d45e4\") " Nov 22 03:28:53 crc kubenswrapper[4922]: I1122 03:28:53.223469 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2hxr\" (UniqueName: \"kubernetes.io/projected/152a7129-aafa-4856-b959-18e7fb0d45e4-kube-api-access-r2hxr\") pod \"152a7129-aafa-4856-b959-18e7fb0d45e4\" (UID: \"152a7129-aafa-4856-b959-18e7fb0d45e4\") " Nov 22 03:28:53 crc kubenswrapper[4922]: I1122 03:28:53.229463 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/152a7129-aafa-4856-b959-18e7fb0d45e4-kube-api-access-r2hxr" (OuterVolumeSpecName: "kube-api-access-r2hxr") pod "152a7129-aafa-4856-b959-18e7fb0d45e4" (UID: "152a7129-aafa-4856-b959-18e7fb0d45e4"). InnerVolumeSpecName "kube-api-access-r2hxr". 
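
The startup-probe failure above ("timeout: failed to connect service \":50051\" within 1s") is output consistent with the gRPC health probe that marketplace catalog pods run against their registry-server port. A short Go sketch of an equivalent client-side check against the standard gRPC health service, assuming the port and 1s timeout from the log:

    package main

    import (
        "context"
        "fmt"
        "time"

        "google.golang.org/grpc"
        "google.golang.org/grpc/credentials/insecure"
        healthpb "google.golang.org/grpc/health/grpc_health_v1"
    )

    func main() {
        // 1s budget, matching the "within 1s" in the probe output above.
        ctx, cancel := context.WithTimeout(context.Background(), time.Second)
        defer cancel()

        conn, err := grpc.DialContext(ctx, "127.0.0.1:50051",
            grpc.WithTransportCredentials(insecure.NewCredentials()), grpc.WithBlock())
        if err != nil {
            fmt.Println("probe failure:", err) // comparable to the logged timeout
            return
        }
        defer conn.Close()

        resp, err := healthpb.NewHealthClient(conn).Check(ctx, &healthpb.HealthCheckRequest{})
        if err != nil {
            fmt.Println("probe failure:", err)
            return
        }
        fmt.Println("status:", resp.GetStatus()) // SERVING once the catalog is ready
    }

Once the catalog content is extracted the same check flips to SERVING, matching the later probe="startup" status="started" line below.
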
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:28:53 crc kubenswrapper[4922]: I1122 03:28:53.256563 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/152a7129-aafa-4856-b959-18e7fb0d45e4-inventory" (OuterVolumeSpecName: "inventory") pod "152a7129-aafa-4856-b959-18e7fb0d45e4" (UID: "152a7129-aafa-4856-b959-18e7fb0d45e4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:28:53 crc kubenswrapper[4922]: I1122 03:28:53.326049 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/152a7129-aafa-4856-b959-18e7fb0d45e4-ceph\") pod \"152a7129-aafa-4856-b959-18e7fb0d45e4\" (UID: \"152a7129-aafa-4856-b959-18e7fb0d45e4\") " Nov 22 03:28:53 crc kubenswrapper[4922]: I1122 03:28:53.326319 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/152a7129-aafa-4856-b959-18e7fb0d45e4-ssh-key\") pod \"152a7129-aafa-4856-b959-18e7fb0d45e4\" (UID: \"152a7129-aafa-4856-b959-18e7fb0d45e4\") " Nov 22 03:28:53 crc kubenswrapper[4922]: I1122 03:28:53.327639 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2hxr\" (UniqueName: \"kubernetes.io/projected/152a7129-aafa-4856-b959-18e7fb0d45e4-kube-api-access-r2hxr\") on node \"crc\" DevicePath \"\"" Nov 22 03:28:53 crc kubenswrapper[4922]: I1122 03:28:53.327668 4922 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/152a7129-aafa-4856-b959-18e7fb0d45e4-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 03:28:53 crc kubenswrapper[4922]: I1122 03:28:53.335398 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/152a7129-aafa-4856-b959-18e7fb0d45e4-ceph" (OuterVolumeSpecName: "ceph") pod "152a7129-aafa-4856-b959-18e7fb0d45e4" (UID: "152a7129-aafa-4856-b959-18e7fb0d45e4"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:28:53 crc kubenswrapper[4922]: I1122 03:28:53.360182 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/152a7129-aafa-4856-b959-18e7fb0d45e4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "152a7129-aafa-4856-b959-18e7fb0d45e4" (UID: "152a7129-aafa-4856-b959-18e7fb0d45e4"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:28:53 crc kubenswrapper[4922]: I1122 03:28:53.429492 4922 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/152a7129-aafa-4856-b959-18e7fb0d45e4-ceph\") on node \"crc\" DevicePath \"\"" Nov 22 03:28:53 crc kubenswrapper[4922]: I1122 03:28:53.429826 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/152a7129-aafa-4856-b959-18e7fb0d45e4-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 03:28:53 crc kubenswrapper[4922]: I1122 03:28:53.471620 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fk6kz" Nov 22 03:28:53 crc kubenswrapper[4922]: I1122 03:28:53.547237 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fk6kz" Nov 22 03:28:53 crc kubenswrapper[4922]: I1122 03:28:53.727696 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fk6kz"] Nov 22 03:28:53 crc kubenswrapper[4922]: I1122 03:28:53.734726 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2wgdh" Nov 22 03:28:53 crc kubenswrapper[4922]: I1122 03:28:53.734716 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2wgdh" event={"ID":"152a7129-aafa-4856-b959-18e7fb0d45e4","Type":"ContainerDied","Data":"0a31153c5d003fcb7e46435da692aa73883cfd00c4145b96a65a7eada5fc4808"} Nov 22 03:28:53 crc kubenswrapper[4922]: I1122 03:28:53.734818 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a31153c5d003fcb7e46435da692aa73883cfd00c4145b96a65a7eada5fc4808" Nov 22 03:28:53 crc kubenswrapper[4922]: I1122 03:28:53.846613 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b4t92"] Nov 22 03:28:53 crc kubenswrapper[4922]: E1122 03:28:53.847404 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="152a7129-aafa-4856-b959-18e7fb0d45e4" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Nov 22 03:28:53 crc kubenswrapper[4922]: I1122 03:28:53.847438 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="152a7129-aafa-4856-b959-18e7fb0d45e4" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Nov 22 03:28:53 crc kubenswrapper[4922]: I1122 03:28:53.847776 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="152a7129-aafa-4856-b959-18e7fb0d45e4" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Nov 22 03:28:53 crc kubenswrapper[4922]: I1122 03:28:53.848895 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b4t92" Nov 22 03:28:53 crc kubenswrapper[4922]: I1122 03:28:53.851888 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 03:28:53 crc kubenswrapper[4922]: I1122 03:28:53.852182 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 22 03:28:53 crc kubenswrapper[4922]: I1122 03:28:53.852661 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 22 03:28:53 crc kubenswrapper[4922]: I1122 03:28:53.852864 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wzhgr" Nov 22 03:28:53 crc kubenswrapper[4922]: I1122 03:28:53.853059 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Nov 22 03:28:53 crc kubenswrapper[4922]: I1122 03:28:53.865599 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b4t92"] Nov 22 03:28:53 crc kubenswrapper[4922]: I1122 03:28:53.941931 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ffcbca43-e19c-4f81-8712-18271858ace9-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-b4t92\" (UID: \"ffcbca43-e19c-4f81-8712-18271858ace9\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b4t92" Nov 22 03:28:53 crc kubenswrapper[4922]: I1122 03:28:53.941995 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ffcbca43-e19c-4f81-8712-18271858ace9-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-b4t92\" (UID: \"ffcbca43-e19c-4f81-8712-18271858ace9\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b4t92" Nov 22 03:28:53 crc kubenswrapper[4922]: I1122 03:28:53.942126 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ffcbca43-e19c-4f81-8712-18271858ace9-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-b4t92\" (UID: \"ffcbca43-e19c-4f81-8712-18271858ace9\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b4t92" Nov 22 03:28:53 crc kubenswrapper[4922]: I1122 03:28:53.942164 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxkwd\" (UniqueName: \"kubernetes.io/projected/ffcbca43-e19c-4f81-8712-18271858ace9-kube-api-access-vxkwd\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-b4t92\" (UID: \"ffcbca43-e19c-4f81-8712-18271858ace9\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b4t92" Nov 22 03:28:54 crc kubenswrapper[4922]: I1122 03:28:54.044035 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ffcbca43-e19c-4f81-8712-18271858ace9-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-b4t92\" (UID: \"ffcbca43-e19c-4f81-8712-18271858ace9\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b4t92" Nov 22 03:28:54 crc kubenswrapper[4922]: I1122 03:28:54.044112 4922 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ffcbca43-e19c-4f81-8712-18271858ace9-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-b4t92\" (UID: \"ffcbca43-e19c-4f81-8712-18271858ace9\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b4t92" Nov 22 03:28:54 crc kubenswrapper[4922]: I1122 03:28:54.044259 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ffcbca43-e19c-4f81-8712-18271858ace9-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-b4t92\" (UID: \"ffcbca43-e19c-4f81-8712-18271858ace9\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b4t92" Nov 22 03:28:54 crc kubenswrapper[4922]: I1122 03:28:54.044310 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxkwd\" (UniqueName: \"kubernetes.io/projected/ffcbca43-e19c-4f81-8712-18271858ace9-kube-api-access-vxkwd\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-b4t92\" (UID: \"ffcbca43-e19c-4f81-8712-18271858ace9\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b4t92" Nov 22 03:28:54 crc kubenswrapper[4922]: I1122 03:28:54.050498 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ffcbca43-e19c-4f81-8712-18271858ace9-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-b4t92\" (UID: \"ffcbca43-e19c-4f81-8712-18271858ace9\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b4t92" Nov 22 03:28:54 crc kubenswrapper[4922]: I1122 03:28:54.050907 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ffcbca43-e19c-4f81-8712-18271858ace9-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-b4t92\" (UID: \"ffcbca43-e19c-4f81-8712-18271858ace9\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b4t92" Nov 22 03:28:54 crc kubenswrapper[4922]: I1122 03:28:54.052719 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ffcbca43-e19c-4f81-8712-18271858ace9-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-b4t92\" (UID: \"ffcbca43-e19c-4f81-8712-18271858ace9\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b4t92" Nov 22 03:28:54 crc kubenswrapper[4922]: I1122 03:28:54.070073 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxkwd\" (UniqueName: \"kubernetes.io/projected/ffcbca43-e19c-4f81-8712-18271858ace9-kube-api-access-vxkwd\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-b4t92\" (UID: \"ffcbca43-e19c-4f81-8712-18271858ace9\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b4t92" Nov 22 03:28:54 crc kubenswrapper[4922]: I1122 03:28:54.183128 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b4t92" Nov 22 03:28:54 crc kubenswrapper[4922]: I1122 03:28:54.743769 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fk6kz" podUID="01d79e84-3dcf-4285-b687-59034fc3120c" containerName="registry-server" containerID="cri-o://2099b0b14053b08a81d1db6184772f4f1d7a2ed39cbb29db68f05dc4c730db41" gracePeriod=2 Nov 22 03:28:54 crc kubenswrapper[4922]: I1122 03:28:54.807644 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b4t92"] Nov 22 03:28:54 crc kubenswrapper[4922]: W1122 03:28:54.808033 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podffcbca43_e19c_4f81_8712_18271858ace9.slice/crio-135372ecbfe3422910fc3d68abe3f556d06e2a38bf77c830a376427694bb5849 WatchSource:0}: Error finding container 135372ecbfe3422910fc3d68abe3f556d06e2a38bf77c830a376427694bb5849: Status 404 returned error can't find the container with id 135372ecbfe3422910fc3d68abe3f556d06e2a38bf77c830a376427694bb5849 Nov 22 03:28:55 crc kubenswrapper[4922]: I1122 03:28:55.239012 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fk6kz" Nov 22 03:28:55 crc kubenswrapper[4922]: I1122 03:28:55.370598 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01d79e84-3dcf-4285-b687-59034fc3120c-utilities\") pod \"01d79e84-3dcf-4285-b687-59034fc3120c\" (UID: \"01d79e84-3dcf-4285-b687-59034fc3120c\") " Nov 22 03:28:55 crc kubenswrapper[4922]: I1122 03:28:55.370700 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01d79e84-3dcf-4285-b687-59034fc3120c-catalog-content\") pod \"01d79e84-3dcf-4285-b687-59034fc3120c\" (UID: \"01d79e84-3dcf-4285-b687-59034fc3120c\") " Nov 22 03:28:55 crc kubenswrapper[4922]: I1122 03:28:55.370809 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9m9w\" (UniqueName: \"kubernetes.io/projected/01d79e84-3dcf-4285-b687-59034fc3120c-kube-api-access-r9m9w\") pod \"01d79e84-3dcf-4285-b687-59034fc3120c\" (UID: \"01d79e84-3dcf-4285-b687-59034fc3120c\") " Nov 22 03:28:55 crc kubenswrapper[4922]: I1122 03:28:55.371167 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01d79e84-3dcf-4285-b687-59034fc3120c-utilities" (OuterVolumeSpecName: "utilities") pod "01d79e84-3dcf-4285-b687-59034fc3120c" (UID: "01d79e84-3dcf-4285-b687-59034fc3120c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:28:55 crc kubenswrapper[4922]: I1122 03:28:55.371602 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01d79e84-3dcf-4285-b687-59034fc3120c-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 03:28:55 crc kubenswrapper[4922]: I1122 03:28:55.377747 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01d79e84-3dcf-4285-b687-59034fc3120c-kube-api-access-r9m9w" (OuterVolumeSpecName: "kube-api-access-r9m9w") pod "01d79e84-3dcf-4285-b687-59034fc3120c" (UID: "01d79e84-3dcf-4285-b687-59034fc3120c"). 
InnerVolumeSpecName "kube-api-access-r9m9w". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:28:55 crc kubenswrapper[4922]: I1122 03:28:55.462091 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01d79e84-3dcf-4285-b687-59034fc3120c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "01d79e84-3dcf-4285-b687-59034fc3120c" (UID: "01d79e84-3dcf-4285-b687-59034fc3120c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:28:55 crc kubenswrapper[4922]: I1122 03:28:55.474527 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01d79e84-3dcf-4285-b687-59034fc3120c-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 03:28:55 crc kubenswrapper[4922]: I1122 03:28:55.474557 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9m9w\" (UniqueName: \"kubernetes.io/projected/01d79e84-3dcf-4285-b687-59034fc3120c-kube-api-access-r9m9w\") on node \"crc\" DevicePath \"\"" Nov 22 03:28:55 crc kubenswrapper[4922]: I1122 03:28:55.754597 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b4t92" event={"ID":"ffcbca43-e19c-4f81-8712-18271858ace9","Type":"ContainerStarted","Data":"135372ecbfe3422910fc3d68abe3f556d06e2a38bf77c830a376427694bb5849"} Nov 22 03:28:55 crc kubenswrapper[4922]: I1122 03:28:55.759424 4922 generic.go:334] "Generic (PLEG): container finished" podID="01d79e84-3dcf-4285-b687-59034fc3120c" containerID="2099b0b14053b08a81d1db6184772f4f1d7a2ed39cbb29db68f05dc4c730db41" exitCode=0 Nov 22 03:28:55 crc kubenswrapper[4922]: I1122 03:28:55.759512 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fk6kz" event={"ID":"01d79e84-3dcf-4285-b687-59034fc3120c","Type":"ContainerDied","Data":"2099b0b14053b08a81d1db6184772f4f1d7a2ed39cbb29db68f05dc4c730db41"} Nov 22 03:28:55 crc kubenswrapper[4922]: I1122 03:28:55.759517 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fk6kz" Nov 22 03:28:55 crc kubenswrapper[4922]: I1122 03:28:55.759590 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fk6kz" event={"ID":"01d79e84-3dcf-4285-b687-59034fc3120c","Type":"ContainerDied","Data":"859cf472aa35b2eec5d4cb6e7a2ab325781178a06377be03bd7937968228e615"} Nov 22 03:28:55 crc kubenswrapper[4922]: I1122 03:28:55.759638 4922 scope.go:117] "RemoveContainer" containerID="2099b0b14053b08a81d1db6184772f4f1d7a2ed39cbb29db68f05dc4c730db41" Nov 22 03:28:55 crc kubenswrapper[4922]: I1122 03:28:55.785555 4922 scope.go:117] "RemoveContainer" containerID="c3ded72987fe5c827589b05bd6f840adead4a196ef3b0e82281596c03611f8e4" Nov 22 03:28:55 crc kubenswrapper[4922]: I1122 03:28:55.838228 4922 scope.go:117] "RemoveContainer" containerID="d32c3e4b40da00b1eefec83d4c0258bcb129b854dba5fe6c23c20e910f0849d7" Nov 22 03:28:55 crc kubenswrapper[4922]: I1122 03:28:55.847210 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fk6kz"] Nov 22 03:28:55 crc kubenswrapper[4922]: I1122 03:28:55.862078 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fk6kz"] Nov 22 03:28:55 crc kubenswrapper[4922]: I1122 03:28:55.867049 4922 scope.go:117] "RemoveContainer" containerID="2099b0b14053b08a81d1db6184772f4f1d7a2ed39cbb29db68f05dc4c730db41" Nov 22 03:28:55 crc kubenswrapper[4922]: E1122 03:28:55.867704 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2099b0b14053b08a81d1db6184772f4f1d7a2ed39cbb29db68f05dc4c730db41\": container with ID starting with 2099b0b14053b08a81d1db6184772f4f1d7a2ed39cbb29db68f05dc4c730db41 not found: ID does not exist" containerID="2099b0b14053b08a81d1db6184772f4f1d7a2ed39cbb29db68f05dc4c730db41" Nov 22 03:28:55 crc kubenswrapper[4922]: I1122 03:28:55.867757 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2099b0b14053b08a81d1db6184772f4f1d7a2ed39cbb29db68f05dc4c730db41"} err="failed to get container status \"2099b0b14053b08a81d1db6184772f4f1d7a2ed39cbb29db68f05dc4c730db41\": rpc error: code = NotFound desc = could not find container \"2099b0b14053b08a81d1db6184772f4f1d7a2ed39cbb29db68f05dc4c730db41\": container with ID starting with 2099b0b14053b08a81d1db6184772f4f1d7a2ed39cbb29db68f05dc4c730db41 not found: ID does not exist" Nov 22 03:28:55 crc kubenswrapper[4922]: I1122 03:28:55.867791 4922 scope.go:117] "RemoveContainer" containerID="c3ded72987fe5c827589b05bd6f840adead4a196ef3b0e82281596c03611f8e4" Nov 22 03:28:55 crc kubenswrapper[4922]: E1122 03:28:55.869163 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3ded72987fe5c827589b05bd6f840adead4a196ef3b0e82281596c03611f8e4\": container with ID starting with c3ded72987fe5c827589b05bd6f840adead4a196ef3b0e82281596c03611f8e4 not found: ID does not exist" containerID="c3ded72987fe5c827589b05bd6f840adead4a196ef3b0e82281596c03611f8e4" Nov 22 03:28:55 crc kubenswrapper[4922]: I1122 03:28:55.869237 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3ded72987fe5c827589b05bd6f840adead4a196ef3b0e82281596c03611f8e4"} err="failed to get container status \"c3ded72987fe5c827589b05bd6f840adead4a196ef3b0e82281596c03611f8e4\": rpc error: code = NotFound desc = could not find container 
\"c3ded72987fe5c827589b05bd6f840adead4a196ef3b0e82281596c03611f8e4\": container with ID starting with c3ded72987fe5c827589b05bd6f840adead4a196ef3b0e82281596c03611f8e4 not found: ID does not exist" Nov 22 03:28:55 crc kubenswrapper[4922]: I1122 03:28:55.869275 4922 scope.go:117] "RemoveContainer" containerID="d32c3e4b40da00b1eefec83d4c0258bcb129b854dba5fe6c23c20e910f0849d7" Nov 22 03:28:55 crc kubenswrapper[4922]: E1122 03:28:55.869941 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d32c3e4b40da00b1eefec83d4c0258bcb129b854dba5fe6c23c20e910f0849d7\": container with ID starting with d32c3e4b40da00b1eefec83d4c0258bcb129b854dba5fe6c23c20e910f0849d7 not found: ID does not exist" containerID="d32c3e4b40da00b1eefec83d4c0258bcb129b854dba5fe6c23c20e910f0849d7" Nov 22 03:28:55 crc kubenswrapper[4922]: I1122 03:28:55.869994 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d32c3e4b40da00b1eefec83d4c0258bcb129b854dba5fe6c23c20e910f0849d7"} err="failed to get container status \"d32c3e4b40da00b1eefec83d4c0258bcb129b854dba5fe6c23c20e910f0849d7\": rpc error: code = NotFound desc = could not find container \"d32c3e4b40da00b1eefec83d4c0258bcb129b854dba5fe6c23c20e910f0849d7\": container with ID starting with d32c3e4b40da00b1eefec83d4c0258bcb129b854dba5fe6c23c20e910f0849d7 not found: ID does not exist" Nov 22 03:28:56 crc kubenswrapper[4922]: I1122 03:28:56.771973 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b4t92" event={"ID":"ffcbca43-e19c-4f81-8712-18271858ace9","Type":"ContainerStarted","Data":"9fff863f0709d5aa20d72004b2286416dfd49a4acdd853ef75ead9145494fdc4"} Nov 22 03:28:56 crc kubenswrapper[4922]: I1122 03:28:56.802774 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b4t92" podStartSLOduration=3.008155174 podStartE2EDuration="3.802749723s" podCreationTimestamp="2025-11-22 03:28:53 +0000 UTC" firstStartedPulling="2025-11-22 03:28:54.810366107 +0000 UTC m=+2170.848887999" lastFinishedPulling="2025-11-22 03:28:55.604960616 +0000 UTC m=+2171.643482548" observedRunningTime="2025-11-22 03:28:56.795271002 +0000 UTC m=+2172.833792934" watchObservedRunningTime="2025-11-22 03:28:56.802749723 +0000 UTC m=+2172.841271655" Nov 22 03:28:57 crc kubenswrapper[4922]: I1122 03:28:57.327041 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01d79e84-3dcf-4285-b687-59034fc3120c" path="/var/lib/kubelet/pods/01d79e84-3dcf-4285-b687-59034fc3120c/volumes" Nov 22 03:29:01 crc kubenswrapper[4922]: I1122 03:29:01.829473 4922 generic.go:334] "Generic (PLEG): container finished" podID="ffcbca43-e19c-4f81-8712-18271858ace9" containerID="9fff863f0709d5aa20d72004b2286416dfd49a4acdd853ef75ead9145494fdc4" exitCode=0 Nov 22 03:29:01 crc kubenswrapper[4922]: I1122 03:29:01.829581 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b4t92" event={"ID":"ffcbca43-e19c-4f81-8712-18271858ace9","Type":"ContainerDied","Data":"9fff863f0709d5aa20d72004b2286416dfd49a4acdd853ef75ead9145494fdc4"} Nov 22 03:29:03 crc kubenswrapper[4922]: I1122 03:29:03.361757 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b4t92" Nov 22 03:29:03 crc kubenswrapper[4922]: I1122 03:29:03.445595 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ffcbca43-e19c-4f81-8712-18271858ace9-inventory\") pod \"ffcbca43-e19c-4f81-8712-18271858ace9\" (UID: \"ffcbca43-e19c-4f81-8712-18271858ace9\") " Nov 22 03:29:03 crc kubenswrapper[4922]: I1122 03:29:03.445735 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ffcbca43-e19c-4f81-8712-18271858ace9-ceph\") pod \"ffcbca43-e19c-4f81-8712-18271858ace9\" (UID: \"ffcbca43-e19c-4f81-8712-18271858ace9\") " Nov 22 03:29:03 crc kubenswrapper[4922]: I1122 03:29:03.445803 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ffcbca43-e19c-4f81-8712-18271858ace9-ssh-key\") pod \"ffcbca43-e19c-4f81-8712-18271858ace9\" (UID: \"ffcbca43-e19c-4f81-8712-18271858ace9\") " Nov 22 03:29:03 crc kubenswrapper[4922]: I1122 03:29:03.445895 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxkwd\" (UniqueName: \"kubernetes.io/projected/ffcbca43-e19c-4f81-8712-18271858ace9-kube-api-access-vxkwd\") pod \"ffcbca43-e19c-4f81-8712-18271858ace9\" (UID: \"ffcbca43-e19c-4f81-8712-18271858ace9\") " Nov 22 03:29:03 crc kubenswrapper[4922]: I1122 03:29:03.453177 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffcbca43-e19c-4f81-8712-18271858ace9-kube-api-access-vxkwd" (OuterVolumeSpecName: "kube-api-access-vxkwd") pod "ffcbca43-e19c-4f81-8712-18271858ace9" (UID: "ffcbca43-e19c-4f81-8712-18271858ace9"). InnerVolumeSpecName "kube-api-access-vxkwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:29:03 crc kubenswrapper[4922]: I1122 03:29:03.453187 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffcbca43-e19c-4f81-8712-18271858ace9-ceph" (OuterVolumeSpecName: "ceph") pod "ffcbca43-e19c-4f81-8712-18271858ace9" (UID: "ffcbca43-e19c-4f81-8712-18271858ace9"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:29:03 crc kubenswrapper[4922]: I1122 03:29:03.472595 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffcbca43-e19c-4f81-8712-18271858ace9-inventory" (OuterVolumeSpecName: "inventory") pod "ffcbca43-e19c-4f81-8712-18271858ace9" (UID: "ffcbca43-e19c-4f81-8712-18271858ace9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:29:03 crc kubenswrapper[4922]: I1122 03:29:03.480860 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffcbca43-e19c-4f81-8712-18271858ace9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ffcbca43-e19c-4f81-8712-18271858ace9" (UID: "ffcbca43-e19c-4f81-8712-18271858ace9"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:29:03 crc kubenswrapper[4922]: I1122 03:29:03.548614 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ffcbca43-e19c-4f81-8712-18271858ace9-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 03:29:03 crc kubenswrapper[4922]: I1122 03:29:03.548667 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxkwd\" (UniqueName: \"kubernetes.io/projected/ffcbca43-e19c-4f81-8712-18271858ace9-kube-api-access-vxkwd\") on node \"crc\" DevicePath \"\"" Nov 22 03:29:03 crc kubenswrapper[4922]: I1122 03:29:03.548687 4922 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ffcbca43-e19c-4f81-8712-18271858ace9-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 03:29:03 crc kubenswrapper[4922]: I1122 03:29:03.548703 4922 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ffcbca43-e19c-4f81-8712-18271858ace9-ceph\") on node \"crc\" DevicePath \"\"" Nov 22 03:29:03 crc kubenswrapper[4922]: I1122 03:29:03.859012 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b4t92" event={"ID":"ffcbca43-e19c-4f81-8712-18271858ace9","Type":"ContainerDied","Data":"135372ecbfe3422910fc3d68abe3f556d06e2a38bf77c830a376427694bb5849"} Nov 22 03:29:03 crc kubenswrapper[4922]: I1122 03:29:03.859075 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="135372ecbfe3422910fc3d68abe3f556d06e2a38bf77c830a376427694bb5849" Nov 22 03:29:03 crc kubenswrapper[4922]: I1122 03:29:03.859162 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b4t92" Nov 22 03:29:03 crc kubenswrapper[4922]: I1122 03:29:03.971999 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-mcptc"] Nov 22 03:29:03 crc kubenswrapper[4922]: E1122 03:29:03.972655 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01d79e84-3dcf-4285-b687-59034fc3120c" containerName="extract-content" Nov 22 03:29:03 crc kubenswrapper[4922]: I1122 03:29:03.972691 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="01d79e84-3dcf-4285-b687-59034fc3120c" containerName="extract-content" Nov 22 03:29:03 crc kubenswrapper[4922]: E1122 03:29:03.972726 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffcbca43-e19c-4f81-8712-18271858ace9" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Nov 22 03:29:03 crc kubenswrapper[4922]: I1122 03:29:03.972741 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffcbca43-e19c-4f81-8712-18271858ace9" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Nov 22 03:29:03 crc kubenswrapper[4922]: E1122 03:29:03.972783 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01d79e84-3dcf-4285-b687-59034fc3120c" containerName="registry-server" Nov 22 03:29:03 crc kubenswrapper[4922]: I1122 03:29:03.972797 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="01d79e84-3dcf-4285-b687-59034fc3120c" containerName="registry-server" Nov 22 03:29:03 crc kubenswrapper[4922]: E1122 03:29:03.972817 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01d79e84-3dcf-4285-b687-59034fc3120c" containerName="extract-utilities" Nov 22 03:29:03 crc 
kubenswrapper[4922]: I1122 03:29:03.972830 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="01d79e84-3dcf-4285-b687-59034fc3120c" containerName="extract-utilities" Nov 22 03:29:03 crc kubenswrapper[4922]: I1122 03:29:03.973172 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="01d79e84-3dcf-4285-b687-59034fc3120c" containerName="registry-server" Nov 22 03:29:03 crc kubenswrapper[4922]: I1122 03:29:03.973195 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffcbca43-e19c-4f81-8712-18271858ace9" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Nov 22 03:29:03 crc kubenswrapper[4922]: I1122 03:29:03.974307 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mcptc" Nov 22 03:29:03 crc kubenswrapper[4922]: I1122 03:29:03.977598 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 03:29:03 crc kubenswrapper[4922]: I1122 03:29:03.978010 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Nov 22 03:29:03 crc kubenswrapper[4922]: I1122 03:29:03.978287 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 22 03:29:03 crc kubenswrapper[4922]: I1122 03:29:03.978548 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wzhgr" Nov 22 03:29:03 crc kubenswrapper[4922]: I1122 03:29:03.978936 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 22 03:29:03 crc kubenswrapper[4922]: I1122 03:29:03.992526 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-mcptc"] Nov 22 03:29:04 crc kubenswrapper[4922]: I1122 03:29:04.059532 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6a294154-9cc0-47ea-86d5-58e825a739e4-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mcptc\" (UID: \"6a294154-9cc0-47ea-86d5-58e825a739e4\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mcptc" Nov 22 03:29:04 crc kubenswrapper[4922]: I1122 03:29:04.059870 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8dh5\" (UniqueName: \"kubernetes.io/projected/6a294154-9cc0-47ea-86d5-58e825a739e4-kube-api-access-s8dh5\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mcptc\" (UID: \"6a294154-9cc0-47ea-86d5-58e825a739e4\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mcptc" Nov 22 03:29:04 crc kubenswrapper[4922]: I1122 03:29:04.060035 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6a294154-9cc0-47ea-86d5-58e825a739e4-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mcptc\" (UID: \"6a294154-9cc0-47ea-86d5-58e825a739e4\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mcptc" Nov 22 03:29:04 crc kubenswrapper[4922]: I1122 03:29:04.060204 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6a294154-9cc0-47ea-86d5-58e825a739e4-ceph\") pod 
\"install-os-edpm-deployment-openstack-edpm-ipam-mcptc\" (UID: \"6a294154-9cc0-47ea-86d5-58e825a739e4\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mcptc" Nov 22 03:29:04 crc kubenswrapper[4922]: I1122 03:29:04.162427 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6a294154-9cc0-47ea-86d5-58e825a739e4-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mcptc\" (UID: \"6a294154-9cc0-47ea-86d5-58e825a739e4\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mcptc" Nov 22 03:29:04 crc kubenswrapper[4922]: I1122 03:29:04.163142 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8dh5\" (UniqueName: \"kubernetes.io/projected/6a294154-9cc0-47ea-86d5-58e825a739e4-kube-api-access-s8dh5\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mcptc\" (UID: \"6a294154-9cc0-47ea-86d5-58e825a739e4\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mcptc" Nov 22 03:29:04 crc kubenswrapper[4922]: I1122 03:29:04.163638 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6a294154-9cc0-47ea-86d5-58e825a739e4-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mcptc\" (UID: \"6a294154-9cc0-47ea-86d5-58e825a739e4\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mcptc" Nov 22 03:29:04 crc kubenswrapper[4922]: I1122 03:29:04.163942 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6a294154-9cc0-47ea-86d5-58e825a739e4-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mcptc\" (UID: \"6a294154-9cc0-47ea-86d5-58e825a739e4\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mcptc" Nov 22 03:29:04 crc kubenswrapper[4922]: I1122 03:29:04.171603 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6a294154-9cc0-47ea-86d5-58e825a739e4-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mcptc\" (UID: \"6a294154-9cc0-47ea-86d5-58e825a739e4\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mcptc" Nov 22 03:29:04 crc kubenswrapper[4922]: I1122 03:29:04.172971 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6a294154-9cc0-47ea-86d5-58e825a739e4-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mcptc\" (UID: \"6a294154-9cc0-47ea-86d5-58e825a739e4\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mcptc" Nov 22 03:29:04 crc kubenswrapper[4922]: I1122 03:29:04.172965 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6a294154-9cc0-47ea-86d5-58e825a739e4-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mcptc\" (UID: \"6a294154-9cc0-47ea-86d5-58e825a739e4\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mcptc" Nov 22 03:29:04 crc kubenswrapper[4922]: I1122 03:29:04.195706 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8dh5\" (UniqueName: \"kubernetes.io/projected/6a294154-9cc0-47ea-86d5-58e825a739e4-kube-api-access-s8dh5\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mcptc\" (UID: \"6a294154-9cc0-47ea-86d5-58e825a739e4\") " 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mcptc" Nov 22 03:29:04 crc kubenswrapper[4922]: I1122 03:29:04.296339 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mcptc" Nov 22 03:29:04 crc kubenswrapper[4922]: I1122 03:29:04.857668 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-mcptc"] Nov 22 03:29:05 crc kubenswrapper[4922]: I1122 03:29:05.895167 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mcptc" event={"ID":"6a294154-9cc0-47ea-86d5-58e825a739e4","Type":"ContainerStarted","Data":"ad74a7e364b108069bf123fca3a8e8e7da09988b22e301639e6b2e7855e941d3"} Nov 22 03:29:05 crc kubenswrapper[4922]: I1122 03:29:05.895564 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mcptc" event={"ID":"6a294154-9cc0-47ea-86d5-58e825a739e4","Type":"ContainerStarted","Data":"8db8118b3bdd90e13d2e8ec8a0bddd2a274a15d8d494b36d5b7a1f0b7dbdc289"} Nov 22 03:29:05 crc kubenswrapper[4922]: I1122 03:29:05.924566 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mcptc" podStartSLOduration=2.485844643 podStartE2EDuration="2.924543536s" podCreationTimestamp="2025-11-22 03:29:03 +0000 UTC" firstStartedPulling="2025-11-22 03:29:04.874352497 +0000 UTC m=+2180.912874409" lastFinishedPulling="2025-11-22 03:29:05.31305141 +0000 UTC m=+2181.351573302" observedRunningTime="2025-11-22 03:29:05.918630224 +0000 UTC m=+2181.957152126" watchObservedRunningTime="2025-11-22 03:29:05.924543536 +0000 UTC m=+2181.963065438" Nov 22 03:29:11 crc kubenswrapper[4922]: I1122 03:29:11.109331 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 03:29:11 crc kubenswrapper[4922]: I1122 03:29:11.109804 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 03:29:11 crc kubenswrapper[4922]: I1122 03:29:11.109907 4922 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" Nov 22 03:29:11 crc kubenswrapper[4922]: I1122 03:29:11.110990 4922 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9a3eb4d79c66bae819bf80ebc7311991f8b0b1ea20e5bf891e55d01a3bffaa78"} pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 03:29:11 crc kubenswrapper[4922]: I1122 03:29:11.111095 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" 
containerID="cri-o://9a3eb4d79c66bae819bf80ebc7311991f8b0b1ea20e5bf891e55d01a3bffaa78" gracePeriod=600 Nov 22 03:29:11 crc kubenswrapper[4922]: E1122 03:29:11.242798 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:29:11 crc kubenswrapper[4922]: I1122 03:29:11.961595 4922 generic.go:334] "Generic (PLEG): container finished" podID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerID="9a3eb4d79c66bae819bf80ebc7311991f8b0b1ea20e5bf891e55d01a3bffaa78" exitCode=0 Nov 22 03:29:11 crc kubenswrapper[4922]: I1122 03:29:11.961641 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" event={"ID":"402683b1-a29f-4a79-a36c-daf6e8068d0d","Type":"ContainerDied","Data":"9a3eb4d79c66bae819bf80ebc7311991f8b0b1ea20e5bf891e55d01a3bffaa78"} Nov 22 03:29:11 crc kubenswrapper[4922]: I1122 03:29:11.961702 4922 scope.go:117] "RemoveContainer" containerID="867c06d58af11fe828ea1cbbaee07531aff1fe64bb6f347eacb739cad746f777" Nov 22 03:29:11 crc kubenswrapper[4922]: I1122 03:29:11.962171 4922 scope.go:117] "RemoveContainer" containerID="9a3eb4d79c66bae819bf80ebc7311991f8b0b1ea20e5bf891e55d01a3bffaa78" Nov 22 03:29:11 crc kubenswrapper[4922]: E1122 03:29:11.962618 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:29:27 crc kubenswrapper[4922]: I1122 03:29:27.300656 4922 scope.go:117] "RemoveContainer" containerID="9a3eb4d79c66bae819bf80ebc7311991f8b0b1ea20e5bf891e55d01a3bffaa78" Nov 22 03:29:27 crc kubenswrapper[4922]: E1122 03:29:27.301339 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:29:41 crc kubenswrapper[4922]: I1122 03:29:41.301547 4922 scope.go:117] "RemoveContainer" containerID="9a3eb4d79c66bae819bf80ebc7311991f8b0b1ea20e5bf891e55d01a3bffaa78" Nov 22 03:29:41 crc kubenswrapper[4922]: E1122 03:29:41.302812 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:29:47 crc kubenswrapper[4922]: I1122 03:29:47.407648 4922 generic.go:334] "Generic (PLEG): container finished" podID="6a294154-9cc0-47ea-86d5-58e825a739e4" 
containerID="ad74a7e364b108069bf123fca3a8e8e7da09988b22e301639e6b2e7855e941d3" exitCode=0 Nov 22 03:29:47 crc kubenswrapper[4922]: I1122 03:29:47.407692 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mcptc" event={"ID":"6a294154-9cc0-47ea-86d5-58e825a739e4","Type":"ContainerDied","Data":"ad74a7e364b108069bf123fca3a8e8e7da09988b22e301639e6b2e7855e941d3"} Nov 22 03:29:48 crc kubenswrapper[4922]: I1122 03:29:48.912033 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mcptc" Nov 22 03:29:49 crc kubenswrapper[4922]: I1122 03:29:49.032176 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6a294154-9cc0-47ea-86d5-58e825a739e4-ceph\") pod \"6a294154-9cc0-47ea-86d5-58e825a739e4\" (UID: \"6a294154-9cc0-47ea-86d5-58e825a739e4\") " Nov 22 03:29:49 crc kubenswrapper[4922]: I1122 03:29:49.032261 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6a294154-9cc0-47ea-86d5-58e825a739e4-ssh-key\") pod \"6a294154-9cc0-47ea-86d5-58e825a739e4\" (UID: \"6a294154-9cc0-47ea-86d5-58e825a739e4\") " Nov 22 03:29:49 crc kubenswrapper[4922]: I1122 03:29:49.032321 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6a294154-9cc0-47ea-86d5-58e825a739e4-inventory\") pod \"6a294154-9cc0-47ea-86d5-58e825a739e4\" (UID: \"6a294154-9cc0-47ea-86d5-58e825a739e4\") " Nov 22 03:29:49 crc kubenswrapper[4922]: I1122 03:29:49.032399 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8dh5\" (UniqueName: \"kubernetes.io/projected/6a294154-9cc0-47ea-86d5-58e825a739e4-kube-api-access-s8dh5\") pod \"6a294154-9cc0-47ea-86d5-58e825a739e4\" (UID: \"6a294154-9cc0-47ea-86d5-58e825a739e4\") " Nov 22 03:29:49 crc kubenswrapper[4922]: I1122 03:29:49.038328 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a294154-9cc0-47ea-86d5-58e825a739e4-ceph" (OuterVolumeSpecName: "ceph") pod "6a294154-9cc0-47ea-86d5-58e825a739e4" (UID: "6a294154-9cc0-47ea-86d5-58e825a739e4"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:29:49 crc kubenswrapper[4922]: I1122 03:29:49.038693 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a294154-9cc0-47ea-86d5-58e825a739e4-kube-api-access-s8dh5" (OuterVolumeSpecName: "kube-api-access-s8dh5") pod "6a294154-9cc0-47ea-86d5-58e825a739e4" (UID: "6a294154-9cc0-47ea-86d5-58e825a739e4"). InnerVolumeSpecName "kube-api-access-s8dh5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:29:49 crc kubenswrapper[4922]: I1122 03:29:49.061106 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a294154-9cc0-47ea-86d5-58e825a739e4-inventory" (OuterVolumeSpecName: "inventory") pod "6a294154-9cc0-47ea-86d5-58e825a739e4" (UID: "6a294154-9cc0-47ea-86d5-58e825a739e4"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:29:49 crc kubenswrapper[4922]: I1122 03:29:49.076257 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a294154-9cc0-47ea-86d5-58e825a739e4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6a294154-9cc0-47ea-86d5-58e825a739e4" (UID: "6a294154-9cc0-47ea-86d5-58e825a739e4"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:29:49 crc kubenswrapper[4922]: I1122 03:29:49.134065 4922 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6a294154-9cc0-47ea-86d5-58e825a739e4-ceph\") on node \"crc\" DevicePath \"\"" Nov 22 03:29:49 crc kubenswrapper[4922]: I1122 03:29:49.134102 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6a294154-9cc0-47ea-86d5-58e825a739e4-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 03:29:49 crc kubenswrapper[4922]: I1122 03:29:49.134115 4922 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6a294154-9cc0-47ea-86d5-58e825a739e4-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 03:29:49 crc kubenswrapper[4922]: I1122 03:29:49.134127 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8dh5\" (UniqueName: \"kubernetes.io/projected/6a294154-9cc0-47ea-86d5-58e825a739e4-kube-api-access-s8dh5\") on node \"crc\" DevicePath \"\"" Nov 22 03:29:49 crc kubenswrapper[4922]: I1122 03:29:49.434227 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mcptc" event={"ID":"6a294154-9cc0-47ea-86d5-58e825a739e4","Type":"ContainerDied","Data":"8db8118b3bdd90e13d2e8ec8a0bddd2a274a15d8d494b36d5b7a1f0b7dbdc289"} Nov 22 03:29:49 crc kubenswrapper[4922]: I1122 03:29:49.434289 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8db8118b3bdd90e13d2e8ec8a0bddd2a274a15d8d494b36d5b7a1f0b7dbdc289" Nov 22 03:29:49 crc kubenswrapper[4922]: I1122 03:29:49.434355 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mcptc" Nov 22 03:29:49 crc kubenswrapper[4922]: I1122 03:29:49.560128 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-j9xqs"] Nov 22 03:29:49 crc kubenswrapper[4922]: E1122 03:29:49.561473 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a294154-9cc0-47ea-86d5-58e825a739e4" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Nov 22 03:29:49 crc kubenswrapper[4922]: I1122 03:29:49.561504 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a294154-9cc0-47ea-86d5-58e825a739e4" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Nov 22 03:29:49 crc kubenswrapper[4922]: I1122 03:29:49.561793 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a294154-9cc0-47ea-86d5-58e825a739e4" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Nov 22 03:29:49 crc kubenswrapper[4922]: I1122 03:29:49.563039 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-j9xqs" Nov 22 03:29:49 crc kubenswrapper[4922]: I1122 03:29:49.579320 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wzhgr" Nov 22 03:29:49 crc kubenswrapper[4922]: I1122 03:29:49.579646 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 22 03:29:49 crc kubenswrapper[4922]: I1122 03:29:49.580774 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 03:29:49 crc kubenswrapper[4922]: I1122 03:29:49.581363 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Nov 22 03:29:49 crc kubenswrapper[4922]: I1122 03:29:49.581625 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 22 03:29:49 crc kubenswrapper[4922]: I1122 03:29:49.597003 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-j9xqs"] Nov 22 03:29:49 crc kubenswrapper[4922]: I1122 03:29:49.746958 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2f8ad2b-4101-4bfb-b181-bcbff8f80498-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-j9xqs\" (UID: \"e2f8ad2b-4101-4bfb-b181-bcbff8f80498\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-j9xqs" Nov 22 03:29:49 crc kubenswrapper[4922]: I1122 03:29:49.747684 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e2f8ad2b-4101-4bfb-b181-bcbff8f80498-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-j9xqs\" (UID: \"e2f8ad2b-4101-4bfb-b181-bcbff8f80498\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-j9xqs" Nov 22 03:29:49 crc kubenswrapper[4922]: I1122 03:29:49.747818 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c66b\" (UniqueName: \"kubernetes.io/projected/e2f8ad2b-4101-4bfb-b181-bcbff8f80498-kube-api-access-7c66b\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-j9xqs\" (UID: \"e2f8ad2b-4101-4bfb-b181-bcbff8f80498\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-j9xqs" Nov 22 03:29:49 crc kubenswrapper[4922]: I1122 03:29:49.748049 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e2f8ad2b-4101-4bfb-b181-bcbff8f80498-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-j9xqs\" (UID: \"e2f8ad2b-4101-4bfb-b181-bcbff8f80498\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-j9xqs" Nov 22 03:29:49 crc kubenswrapper[4922]: I1122 03:29:49.850352 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2f8ad2b-4101-4bfb-b181-bcbff8f80498-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-j9xqs\" (UID: \"e2f8ad2b-4101-4bfb-b181-bcbff8f80498\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-j9xqs" Nov 22 03:29:49 crc kubenswrapper[4922]: I1122 03:29:49.850482 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" 
(UniqueName: \"kubernetes.io/secret/e2f8ad2b-4101-4bfb-b181-bcbff8f80498-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-j9xqs\" (UID: \"e2f8ad2b-4101-4bfb-b181-bcbff8f80498\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-j9xqs" Nov 22 03:29:49 crc kubenswrapper[4922]: I1122 03:29:49.850542 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7c66b\" (UniqueName: \"kubernetes.io/projected/e2f8ad2b-4101-4bfb-b181-bcbff8f80498-kube-api-access-7c66b\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-j9xqs\" (UID: \"e2f8ad2b-4101-4bfb-b181-bcbff8f80498\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-j9xqs" Nov 22 03:29:49 crc kubenswrapper[4922]: I1122 03:29:49.850641 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e2f8ad2b-4101-4bfb-b181-bcbff8f80498-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-j9xqs\" (UID: \"e2f8ad2b-4101-4bfb-b181-bcbff8f80498\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-j9xqs" Nov 22 03:29:49 crc kubenswrapper[4922]: I1122 03:29:49.857259 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2f8ad2b-4101-4bfb-b181-bcbff8f80498-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-j9xqs\" (UID: \"e2f8ad2b-4101-4bfb-b181-bcbff8f80498\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-j9xqs" Nov 22 03:29:49 crc kubenswrapper[4922]: I1122 03:29:49.857959 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e2f8ad2b-4101-4bfb-b181-bcbff8f80498-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-j9xqs\" (UID: \"e2f8ad2b-4101-4bfb-b181-bcbff8f80498\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-j9xqs" Nov 22 03:29:49 crc kubenswrapper[4922]: I1122 03:29:49.861454 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e2f8ad2b-4101-4bfb-b181-bcbff8f80498-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-j9xqs\" (UID: \"e2f8ad2b-4101-4bfb-b181-bcbff8f80498\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-j9xqs" Nov 22 03:29:49 crc kubenswrapper[4922]: I1122 03:29:49.873278 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c66b\" (UniqueName: \"kubernetes.io/projected/e2f8ad2b-4101-4bfb-b181-bcbff8f80498-kube-api-access-7c66b\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-j9xqs\" (UID: \"e2f8ad2b-4101-4bfb-b181-bcbff8f80498\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-j9xqs" Nov 22 03:29:49 crc kubenswrapper[4922]: I1122 03:29:49.899725 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-j9xqs" Nov 22 03:29:50 crc kubenswrapper[4922]: I1122 03:29:50.250746 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-j9xqs"] Nov 22 03:29:50 crc kubenswrapper[4922]: I1122 03:29:50.444948 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-j9xqs" event={"ID":"e2f8ad2b-4101-4bfb-b181-bcbff8f80498","Type":"ContainerStarted","Data":"c2f6693148d8df5228184f1252763b4c6d78ab5e5cba9f5f9f7d3ab77ae1a7ad"} Nov 22 03:29:51 crc kubenswrapper[4922]: I1122 03:29:51.455048 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-j9xqs" event={"ID":"e2f8ad2b-4101-4bfb-b181-bcbff8f80498","Type":"ContainerStarted","Data":"d274abb0172fdf493f225216d2d20d047459c2257327bdca32a6f7af0189ee8d"} Nov 22 03:29:51 crc kubenswrapper[4922]: I1122 03:29:51.477119 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-j9xqs" podStartSLOduration=1.741846472 podStartE2EDuration="2.477097754s" podCreationTimestamp="2025-11-22 03:29:49 +0000 UTC" firstStartedPulling="2025-11-22 03:29:50.268816778 +0000 UTC m=+2226.307338670" lastFinishedPulling="2025-11-22 03:29:51.00406807 +0000 UTC m=+2227.042589952" observedRunningTime="2025-11-22 03:29:51.475738901 +0000 UTC m=+2227.514260843" watchObservedRunningTime="2025-11-22 03:29:51.477097754 +0000 UTC m=+2227.515619696" Nov 22 03:29:55 crc kubenswrapper[4922]: I1122 03:29:55.496323 4922 generic.go:334] "Generic (PLEG): container finished" podID="e2f8ad2b-4101-4bfb-b181-bcbff8f80498" containerID="d274abb0172fdf493f225216d2d20d047459c2257327bdca32a6f7af0189ee8d" exitCode=0 Nov 22 03:29:55 crc kubenswrapper[4922]: I1122 03:29:55.496538 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-j9xqs" event={"ID":"e2f8ad2b-4101-4bfb-b181-bcbff8f80498","Type":"ContainerDied","Data":"d274abb0172fdf493f225216d2d20d047459c2257327bdca32a6f7af0189ee8d"} Nov 22 03:29:56 crc kubenswrapper[4922]: I1122 03:29:56.300837 4922 scope.go:117] "RemoveContainer" containerID="9a3eb4d79c66bae819bf80ebc7311991f8b0b1ea20e5bf891e55d01a3bffaa78" Nov 22 03:29:56 crc kubenswrapper[4922]: E1122 03:29:56.301164 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:29:56 crc kubenswrapper[4922]: I1122 03:29:56.938436 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-j9xqs" Nov 22 03:29:57 crc kubenswrapper[4922]: I1122 03:29:57.101236 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e2f8ad2b-4101-4bfb-b181-bcbff8f80498-ssh-key\") pod \"e2f8ad2b-4101-4bfb-b181-bcbff8f80498\" (UID: \"e2f8ad2b-4101-4bfb-b181-bcbff8f80498\") " Nov 22 03:29:57 crc kubenswrapper[4922]: I1122 03:29:57.101329 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c66b\" (UniqueName: \"kubernetes.io/projected/e2f8ad2b-4101-4bfb-b181-bcbff8f80498-kube-api-access-7c66b\") pod \"e2f8ad2b-4101-4bfb-b181-bcbff8f80498\" (UID: \"e2f8ad2b-4101-4bfb-b181-bcbff8f80498\") " Nov 22 03:29:57 crc kubenswrapper[4922]: I1122 03:29:57.101487 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2f8ad2b-4101-4bfb-b181-bcbff8f80498-inventory\") pod \"e2f8ad2b-4101-4bfb-b181-bcbff8f80498\" (UID: \"e2f8ad2b-4101-4bfb-b181-bcbff8f80498\") " Nov 22 03:29:57 crc kubenswrapper[4922]: I1122 03:29:57.101630 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e2f8ad2b-4101-4bfb-b181-bcbff8f80498-ceph\") pod \"e2f8ad2b-4101-4bfb-b181-bcbff8f80498\" (UID: \"e2f8ad2b-4101-4bfb-b181-bcbff8f80498\") " Nov 22 03:29:57 crc kubenswrapper[4922]: I1122 03:29:57.109586 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2f8ad2b-4101-4bfb-b181-bcbff8f80498-ceph" (OuterVolumeSpecName: "ceph") pod "e2f8ad2b-4101-4bfb-b181-bcbff8f80498" (UID: "e2f8ad2b-4101-4bfb-b181-bcbff8f80498"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:29:57 crc kubenswrapper[4922]: I1122 03:29:57.110085 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2f8ad2b-4101-4bfb-b181-bcbff8f80498-kube-api-access-7c66b" (OuterVolumeSpecName: "kube-api-access-7c66b") pod "e2f8ad2b-4101-4bfb-b181-bcbff8f80498" (UID: "e2f8ad2b-4101-4bfb-b181-bcbff8f80498"). InnerVolumeSpecName "kube-api-access-7c66b". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:29:57 crc kubenswrapper[4922]: I1122 03:29:57.129396 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2f8ad2b-4101-4bfb-b181-bcbff8f80498-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e2f8ad2b-4101-4bfb-b181-bcbff8f80498" (UID: "e2f8ad2b-4101-4bfb-b181-bcbff8f80498"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:29:57 crc kubenswrapper[4922]: I1122 03:29:57.145147 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2f8ad2b-4101-4bfb-b181-bcbff8f80498-inventory" (OuterVolumeSpecName: "inventory") pod "e2f8ad2b-4101-4bfb-b181-bcbff8f80498" (UID: "e2f8ad2b-4101-4bfb-b181-bcbff8f80498"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:29:57 crc kubenswrapper[4922]: I1122 03:29:57.205512 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e2f8ad2b-4101-4bfb-b181-bcbff8f80498-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 03:29:57 crc kubenswrapper[4922]: I1122 03:29:57.205571 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c66b\" (UniqueName: \"kubernetes.io/projected/e2f8ad2b-4101-4bfb-b181-bcbff8f80498-kube-api-access-7c66b\") on node \"crc\" DevicePath \"\"" Nov 22 03:29:57 crc kubenswrapper[4922]: I1122 03:29:57.205594 4922 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2f8ad2b-4101-4bfb-b181-bcbff8f80498-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 03:29:57 crc kubenswrapper[4922]: I1122 03:29:57.205612 4922 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e2f8ad2b-4101-4bfb-b181-bcbff8f80498-ceph\") on node \"crc\" DevicePath \"\"" Nov 22 03:29:57 crc kubenswrapper[4922]: I1122 03:29:57.515983 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-j9xqs" event={"ID":"e2f8ad2b-4101-4bfb-b181-bcbff8f80498","Type":"ContainerDied","Data":"c2f6693148d8df5228184f1252763b4c6d78ab5e5cba9f5f9f7d3ab77ae1a7ad"} Nov 22 03:29:57 crc kubenswrapper[4922]: I1122 03:29:57.516042 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2f6693148d8df5228184f1252763b4c6d78ab5e5cba9f5f9f7d3ab77ae1a7ad" Nov 22 03:29:57 crc kubenswrapper[4922]: I1122 03:29:57.516087 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-j9xqs" Nov 22 03:29:57 crc kubenswrapper[4922]: I1122 03:29:57.602563 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mrnbm"] Nov 22 03:29:57 crc kubenswrapper[4922]: E1122 03:29:57.603329 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2f8ad2b-4101-4bfb-b181-bcbff8f80498" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Nov 22 03:29:57 crc kubenswrapper[4922]: I1122 03:29:57.603378 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2f8ad2b-4101-4bfb-b181-bcbff8f80498" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Nov 22 03:29:57 crc kubenswrapper[4922]: I1122 03:29:57.603811 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2f8ad2b-4101-4bfb-b181-bcbff8f80498" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Nov 22 03:29:57 crc kubenswrapper[4922]: I1122 03:29:57.605022 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mrnbm" Nov 22 03:29:57 crc kubenswrapper[4922]: I1122 03:29:57.607577 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 03:29:57 crc kubenswrapper[4922]: I1122 03:29:57.607617 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Nov 22 03:29:57 crc kubenswrapper[4922]: I1122 03:29:57.607654 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 22 03:29:57 crc kubenswrapper[4922]: I1122 03:29:57.608405 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 22 03:29:57 crc kubenswrapper[4922]: I1122 03:29:57.609430 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mrnbm"] Nov 22 03:29:57 crc kubenswrapper[4922]: I1122 03:29:57.609747 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wzhgr" Nov 22 03:29:57 crc kubenswrapper[4922]: I1122 03:29:57.610903 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/da35a869-b48f-4972-9a3f-703498998c6d-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mrnbm\" (UID: \"da35a869-b48f-4972-9a3f-703498998c6d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mrnbm" Nov 22 03:29:57 crc kubenswrapper[4922]: I1122 03:29:57.610944 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da35a869-b48f-4972-9a3f-703498998c6d-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mrnbm\" (UID: \"da35a869-b48f-4972-9a3f-703498998c6d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mrnbm" Nov 22 03:29:57 crc kubenswrapper[4922]: I1122 03:29:57.610988 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/da35a869-b48f-4972-9a3f-703498998c6d-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mrnbm\" (UID: \"da35a869-b48f-4972-9a3f-703498998c6d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mrnbm" Nov 22 03:29:57 crc kubenswrapper[4922]: I1122 03:29:57.611033 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5fbd\" (UniqueName: \"kubernetes.io/projected/da35a869-b48f-4972-9a3f-703498998c6d-kube-api-access-w5fbd\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mrnbm\" (UID: \"da35a869-b48f-4972-9a3f-703498998c6d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mrnbm" Nov 22 03:29:57 crc kubenswrapper[4922]: I1122 03:29:57.712200 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5fbd\" (UniqueName: \"kubernetes.io/projected/da35a869-b48f-4972-9a3f-703498998c6d-kube-api-access-w5fbd\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mrnbm\" (UID: \"da35a869-b48f-4972-9a3f-703498998c6d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mrnbm" Nov 22 03:29:57 crc kubenswrapper[4922]: I1122 03:29:57.712639 4922 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/da35a869-b48f-4972-9a3f-703498998c6d-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mrnbm\" (UID: \"da35a869-b48f-4972-9a3f-703498998c6d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mrnbm" Nov 22 03:29:57 crc kubenswrapper[4922]: I1122 03:29:57.712678 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da35a869-b48f-4972-9a3f-703498998c6d-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mrnbm\" (UID: \"da35a869-b48f-4972-9a3f-703498998c6d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mrnbm" Nov 22 03:29:57 crc kubenswrapper[4922]: I1122 03:29:57.712718 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/da35a869-b48f-4972-9a3f-703498998c6d-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mrnbm\" (UID: \"da35a869-b48f-4972-9a3f-703498998c6d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mrnbm" Nov 22 03:29:57 crc kubenswrapper[4922]: I1122 03:29:57.718243 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da35a869-b48f-4972-9a3f-703498998c6d-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mrnbm\" (UID: \"da35a869-b48f-4972-9a3f-703498998c6d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mrnbm" Nov 22 03:29:57 crc kubenswrapper[4922]: I1122 03:29:57.723358 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/da35a869-b48f-4972-9a3f-703498998c6d-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mrnbm\" (UID: \"da35a869-b48f-4972-9a3f-703498998c6d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mrnbm" Nov 22 03:29:57 crc kubenswrapper[4922]: I1122 03:29:57.723522 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/da35a869-b48f-4972-9a3f-703498998c6d-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mrnbm\" (UID: \"da35a869-b48f-4972-9a3f-703498998c6d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mrnbm" Nov 22 03:29:57 crc kubenswrapper[4922]: I1122 03:29:57.731178 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5fbd\" (UniqueName: \"kubernetes.io/projected/da35a869-b48f-4972-9a3f-703498998c6d-kube-api-access-w5fbd\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-mrnbm\" (UID: \"da35a869-b48f-4972-9a3f-703498998c6d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mrnbm" Nov 22 03:29:57 crc kubenswrapper[4922]: I1122 03:29:57.950974 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mrnbm" Nov 22 03:29:58 crc kubenswrapper[4922]: I1122 03:29:58.420494 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mrnbm"] Nov 22 03:29:58 crc kubenswrapper[4922]: I1122 03:29:58.525232 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mrnbm" event={"ID":"da35a869-b48f-4972-9a3f-703498998c6d","Type":"ContainerStarted","Data":"47f64fd4123c5a64b2bb72e01233454ffba5f40e7ec651185861222895732fe6"} Nov 22 03:29:59 crc kubenswrapper[4922]: I1122 03:29:59.538084 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mrnbm" event={"ID":"da35a869-b48f-4972-9a3f-703498998c6d","Type":"ContainerStarted","Data":"08ac0806a48e8c5ca683ed9444c645464cd9e22aaf7483974f716293870b693f"} Nov 22 03:29:59 crc kubenswrapper[4922]: I1122 03:29:59.568331 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mrnbm" podStartSLOduration=2.135390002 podStartE2EDuration="2.568300479s" podCreationTimestamp="2025-11-22 03:29:57 +0000 UTC" firstStartedPulling="2025-11-22 03:29:58.422600807 +0000 UTC m=+2234.461122709" lastFinishedPulling="2025-11-22 03:29:58.855511274 +0000 UTC m=+2234.894033186" observedRunningTime="2025-11-22 03:29:59.557220789 +0000 UTC m=+2235.595742691" watchObservedRunningTime="2025-11-22 03:29:59.568300479 +0000 UTC m=+2235.606822411" Nov 22 03:30:00 crc kubenswrapper[4922]: I1122 03:30:00.157023 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396370-xgdhm"] Nov 22 03:30:00 crc kubenswrapper[4922]: I1122 03:30:00.159747 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396370-xgdhm" Nov 22 03:30:00 crc kubenswrapper[4922]: I1122 03:30:00.164144 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 22 03:30:00 crc kubenswrapper[4922]: I1122 03:30:00.164165 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 22 03:30:00 crc kubenswrapper[4922]: I1122 03:30:00.175236 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396370-xgdhm"] Nov 22 03:30:00 crc kubenswrapper[4922]: I1122 03:30:00.264082 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/13f70de8-814c-42f0-900d-ed7ade87ecc8-config-volume\") pod \"collect-profiles-29396370-xgdhm\" (UID: \"13f70de8-814c-42f0-900d-ed7ade87ecc8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396370-xgdhm" Nov 22 03:30:00 crc kubenswrapper[4922]: I1122 03:30:00.264541 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwd2n\" (UniqueName: \"kubernetes.io/projected/13f70de8-814c-42f0-900d-ed7ade87ecc8-kube-api-access-bwd2n\") pod \"collect-profiles-29396370-xgdhm\" (UID: \"13f70de8-814c-42f0-900d-ed7ade87ecc8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396370-xgdhm" Nov 22 03:30:00 crc kubenswrapper[4922]: I1122 03:30:00.264617 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/13f70de8-814c-42f0-900d-ed7ade87ecc8-secret-volume\") pod \"collect-profiles-29396370-xgdhm\" (UID: \"13f70de8-814c-42f0-900d-ed7ade87ecc8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396370-xgdhm" Nov 22 03:30:00 crc kubenswrapper[4922]: I1122 03:30:00.366539 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/13f70de8-814c-42f0-900d-ed7ade87ecc8-config-volume\") pod \"collect-profiles-29396370-xgdhm\" (UID: \"13f70de8-814c-42f0-900d-ed7ade87ecc8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396370-xgdhm" Nov 22 03:30:00 crc kubenswrapper[4922]: I1122 03:30:00.366662 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwd2n\" (UniqueName: \"kubernetes.io/projected/13f70de8-814c-42f0-900d-ed7ade87ecc8-kube-api-access-bwd2n\") pod \"collect-profiles-29396370-xgdhm\" (UID: \"13f70de8-814c-42f0-900d-ed7ade87ecc8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396370-xgdhm" Nov 22 03:30:00 crc kubenswrapper[4922]: I1122 03:30:00.366802 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/13f70de8-814c-42f0-900d-ed7ade87ecc8-secret-volume\") pod \"collect-profiles-29396370-xgdhm\" (UID: \"13f70de8-814c-42f0-900d-ed7ade87ecc8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396370-xgdhm" Nov 22 03:30:00 crc kubenswrapper[4922]: I1122 03:30:00.368750 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/13f70de8-814c-42f0-900d-ed7ade87ecc8-config-volume\") pod 
\"collect-profiles-29396370-xgdhm\" (UID: \"13f70de8-814c-42f0-900d-ed7ade87ecc8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396370-xgdhm" Nov 22 03:30:00 crc kubenswrapper[4922]: I1122 03:30:00.376473 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/13f70de8-814c-42f0-900d-ed7ade87ecc8-secret-volume\") pod \"collect-profiles-29396370-xgdhm\" (UID: \"13f70de8-814c-42f0-900d-ed7ade87ecc8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396370-xgdhm" Nov 22 03:30:00 crc kubenswrapper[4922]: I1122 03:30:00.395428 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwd2n\" (UniqueName: \"kubernetes.io/projected/13f70de8-814c-42f0-900d-ed7ade87ecc8-kube-api-access-bwd2n\") pod \"collect-profiles-29396370-xgdhm\" (UID: \"13f70de8-814c-42f0-900d-ed7ade87ecc8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396370-xgdhm" Nov 22 03:30:00 crc kubenswrapper[4922]: I1122 03:30:00.500722 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396370-xgdhm" Nov 22 03:30:00 crc kubenswrapper[4922]: I1122 03:30:00.979780 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396370-xgdhm"] Nov 22 03:30:01 crc kubenswrapper[4922]: I1122 03:30:01.561360 4922 generic.go:334] "Generic (PLEG): container finished" podID="13f70de8-814c-42f0-900d-ed7ade87ecc8" containerID="6319c0ad4dfc39e8cc40f08ae6ff8943d8c1ee7bab1690fa3ff6761fd77051c7" exitCode=0 Nov 22 03:30:01 crc kubenswrapper[4922]: I1122 03:30:01.561409 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396370-xgdhm" event={"ID":"13f70de8-814c-42f0-900d-ed7ade87ecc8","Type":"ContainerDied","Data":"6319c0ad4dfc39e8cc40f08ae6ff8943d8c1ee7bab1690fa3ff6761fd77051c7"} Nov 22 03:30:01 crc kubenswrapper[4922]: I1122 03:30:01.561681 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396370-xgdhm" event={"ID":"13f70de8-814c-42f0-900d-ed7ade87ecc8","Type":"ContainerStarted","Data":"fb862cd236a085ff43df0bdead47d873e2bfe74d4f254b1184bf4d32145f0d4b"} Nov 22 03:30:02 crc kubenswrapper[4922]: I1122 03:30:02.917542 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396370-xgdhm" Nov 22 03:30:02 crc kubenswrapper[4922]: I1122 03:30:02.951716 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwd2n\" (UniqueName: \"kubernetes.io/projected/13f70de8-814c-42f0-900d-ed7ade87ecc8-kube-api-access-bwd2n\") pod \"13f70de8-814c-42f0-900d-ed7ade87ecc8\" (UID: \"13f70de8-814c-42f0-900d-ed7ade87ecc8\") " Nov 22 03:30:02 crc kubenswrapper[4922]: I1122 03:30:02.951795 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/13f70de8-814c-42f0-900d-ed7ade87ecc8-secret-volume\") pod \"13f70de8-814c-42f0-900d-ed7ade87ecc8\" (UID: \"13f70de8-814c-42f0-900d-ed7ade87ecc8\") " Nov 22 03:30:02 crc kubenswrapper[4922]: I1122 03:30:02.951906 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/13f70de8-814c-42f0-900d-ed7ade87ecc8-config-volume\") pod \"13f70de8-814c-42f0-900d-ed7ade87ecc8\" (UID: \"13f70de8-814c-42f0-900d-ed7ade87ecc8\") " Nov 22 03:30:02 crc kubenswrapper[4922]: I1122 03:30:02.957880 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13f70de8-814c-42f0-900d-ed7ade87ecc8-config-volume" (OuterVolumeSpecName: "config-volume") pod "13f70de8-814c-42f0-900d-ed7ade87ecc8" (UID: "13f70de8-814c-42f0-900d-ed7ade87ecc8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:30:02 crc kubenswrapper[4922]: I1122 03:30:02.964729 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13f70de8-814c-42f0-900d-ed7ade87ecc8-kube-api-access-bwd2n" (OuterVolumeSpecName: "kube-api-access-bwd2n") pod "13f70de8-814c-42f0-900d-ed7ade87ecc8" (UID: "13f70de8-814c-42f0-900d-ed7ade87ecc8"). InnerVolumeSpecName "kube-api-access-bwd2n". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:30:02 crc kubenswrapper[4922]: I1122 03:30:02.964769 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13f70de8-814c-42f0-900d-ed7ade87ecc8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "13f70de8-814c-42f0-900d-ed7ade87ecc8" (UID: "13f70de8-814c-42f0-900d-ed7ade87ecc8"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:30:03 crc kubenswrapper[4922]: I1122 03:30:03.062639 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwd2n\" (UniqueName: \"kubernetes.io/projected/13f70de8-814c-42f0-900d-ed7ade87ecc8-kube-api-access-bwd2n\") on node \"crc\" DevicePath \"\"" Nov 22 03:30:03 crc kubenswrapper[4922]: I1122 03:30:03.062716 4922 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/13f70de8-814c-42f0-900d-ed7ade87ecc8-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 22 03:30:03 crc kubenswrapper[4922]: I1122 03:30:03.062742 4922 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/13f70de8-814c-42f0-900d-ed7ade87ecc8-config-volume\") on node \"crc\" DevicePath \"\"" Nov 22 03:30:03 crc kubenswrapper[4922]: I1122 03:30:03.583481 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396370-xgdhm" event={"ID":"13f70de8-814c-42f0-900d-ed7ade87ecc8","Type":"ContainerDied","Data":"fb862cd236a085ff43df0bdead47d873e2bfe74d4f254b1184bf4d32145f0d4b"} Nov 22 03:30:03 crc kubenswrapper[4922]: I1122 03:30:03.584006 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb862cd236a085ff43df0bdead47d873e2bfe74d4f254b1184bf4d32145f0d4b" Nov 22 03:30:03 crc kubenswrapper[4922]: I1122 03:30:03.583627 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396370-xgdhm" Nov 22 03:30:04 crc kubenswrapper[4922]: I1122 03:30:04.014992 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396325-vl7tv"] Nov 22 03:30:04 crc kubenswrapper[4922]: I1122 03:30:04.029335 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396325-vl7tv"] Nov 22 03:30:05 crc kubenswrapper[4922]: I1122 03:30:05.321242 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5e7f9ae-8050-4602-b4f3-53a73ec0b60f" path="/var/lib/kubelet/pods/a5e7f9ae-8050-4602-b4f3-53a73ec0b60f/volumes" Nov 22 03:30:11 crc kubenswrapper[4922]: I1122 03:30:11.300596 4922 scope.go:117] "RemoveContainer" containerID="9a3eb4d79c66bae819bf80ebc7311991f8b0b1ea20e5bf891e55d01a3bffaa78" Nov 22 03:30:11 crc kubenswrapper[4922]: E1122 03:30:11.302182 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:30:26 crc kubenswrapper[4922]: I1122 03:30:26.300281 4922 scope.go:117] "RemoveContainer" containerID="9a3eb4d79c66bae819bf80ebc7311991f8b0b1ea20e5bf891e55d01a3bffaa78" Nov 22 03:30:26 crc kubenswrapper[4922]: E1122 03:30:26.300967 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:30:41 crc kubenswrapper[4922]: I1122 03:30:41.300360 4922 scope.go:117] "RemoveContainer" containerID="9a3eb4d79c66bae819bf80ebc7311991f8b0b1ea20e5bf891e55d01a3bffaa78" Nov 22 03:30:41 crc kubenswrapper[4922]: E1122 03:30:41.301313 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:30:47 crc kubenswrapper[4922]: I1122 03:30:47.044556 4922 generic.go:334] "Generic (PLEG): container finished" podID="da35a869-b48f-4972-9a3f-703498998c6d" containerID="08ac0806a48e8c5ca683ed9444c645464cd9e22aaf7483974f716293870b693f" exitCode=0 Nov 22 03:30:47 crc kubenswrapper[4922]: I1122 03:30:47.044632 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mrnbm" event={"ID":"da35a869-b48f-4972-9a3f-703498998c6d","Type":"ContainerDied","Data":"08ac0806a48e8c5ca683ed9444c645464cd9e22aaf7483974f716293870b693f"} Nov 22 03:30:48 crc kubenswrapper[4922]: I1122 03:30:48.590142 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mrnbm" Nov 22 03:30:48 crc kubenswrapper[4922]: I1122 03:30:48.736327 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da35a869-b48f-4972-9a3f-703498998c6d-inventory\") pod \"da35a869-b48f-4972-9a3f-703498998c6d\" (UID: \"da35a869-b48f-4972-9a3f-703498998c6d\") " Nov 22 03:30:48 crc kubenswrapper[4922]: I1122 03:30:48.736409 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/da35a869-b48f-4972-9a3f-703498998c6d-ceph\") pod \"da35a869-b48f-4972-9a3f-703498998c6d\" (UID: \"da35a869-b48f-4972-9a3f-703498998c6d\") " Nov 22 03:30:48 crc kubenswrapper[4922]: I1122 03:30:48.736473 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/da35a869-b48f-4972-9a3f-703498998c6d-ssh-key\") pod \"da35a869-b48f-4972-9a3f-703498998c6d\" (UID: \"da35a869-b48f-4972-9a3f-703498998c6d\") " Nov 22 03:30:48 crc kubenswrapper[4922]: I1122 03:30:48.736616 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5fbd\" (UniqueName: \"kubernetes.io/projected/da35a869-b48f-4972-9a3f-703498998c6d-kube-api-access-w5fbd\") pod \"da35a869-b48f-4972-9a3f-703498998c6d\" (UID: \"da35a869-b48f-4972-9a3f-703498998c6d\") " Nov 22 03:30:48 crc kubenswrapper[4922]: I1122 03:30:48.743047 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da35a869-b48f-4972-9a3f-703498998c6d-kube-api-access-w5fbd" (OuterVolumeSpecName: "kube-api-access-w5fbd") pod "da35a869-b48f-4972-9a3f-703498998c6d" (UID: "da35a869-b48f-4972-9a3f-703498998c6d"). InnerVolumeSpecName "kube-api-access-w5fbd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:30:48 crc kubenswrapper[4922]: I1122 03:30:48.745586 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da35a869-b48f-4972-9a3f-703498998c6d-ceph" (OuterVolumeSpecName: "ceph") pod "da35a869-b48f-4972-9a3f-703498998c6d" (UID: "da35a869-b48f-4972-9a3f-703498998c6d"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:30:48 crc kubenswrapper[4922]: I1122 03:30:48.773331 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da35a869-b48f-4972-9a3f-703498998c6d-inventory" (OuterVolumeSpecName: "inventory") pod "da35a869-b48f-4972-9a3f-703498998c6d" (UID: "da35a869-b48f-4972-9a3f-703498998c6d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:30:48 crc kubenswrapper[4922]: I1122 03:30:48.777688 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da35a869-b48f-4972-9a3f-703498998c6d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "da35a869-b48f-4972-9a3f-703498998c6d" (UID: "da35a869-b48f-4972-9a3f-703498998c6d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:30:48 crc kubenswrapper[4922]: I1122 03:30:48.840534 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5fbd\" (UniqueName: \"kubernetes.io/projected/da35a869-b48f-4972-9a3f-703498998c6d-kube-api-access-w5fbd\") on node \"crc\" DevicePath \"\"" Nov 22 03:30:48 crc kubenswrapper[4922]: I1122 03:30:48.840574 4922 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da35a869-b48f-4972-9a3f-703498998c6d-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 03:30:48 crc kubenswrapper[4922]: I1122 03:30:48.840586 4922 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/da35a869-b48f-4972-9a3f-703498998c6d-ceph\") on node \"crc\" DevicePath \"\"" Nov 22 03:30:48 crc kubenswrapper[4922]: I1122 03:30:48.840597 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/da35a869-b48f-4972-9a3f-703498998c6d-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 03:30:49 crc kubenswrapper[4922]: I1122 03:30:49.067622 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mrnbm" event={"ID":"da35a869-b48f-4972-9a3f-703498998c6d","Type":"ContainerDied","Data":"47f64fd4123c5a64b2bb72e01233454ffba5f40e7ec651185861222895732fe6"} Nov 22 03:30:49 crc kubenswrapper[4922]: I1122 03:30:49.067675 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="47f64fd4123c5a64b2bb72e01233454ffba5f40e7ec651185861222895732fe6" Nov 22 03:30:49 crc kubenswrapper[4922]: I1122 03:30:49.067752 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-mrnbm" Nov 22 03:30:49 crc kubenswrapper[4922]: I1122 03:30:49.165134 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-b5522"] Nov 22 03:30:49 crc kubenswrapper[4922]: E1122 03:30:49.165510 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13f70de8-814c-42f0-900d-ed7ade87ecc8" containerName="collect-profiles" Nov 22 03:30:49 crc kubenswrapper[4922]: I1122 03:30:49.165527 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="13f70de8-814c-42f0-900d-ed7ade87ecc8" containerName="collect-profiles" Nov 22 03:30:49 crc kubenswrapper[4922]: E1122 03:30:49.165542 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da35a869-b48f-4972-9a3f-703498998c6d" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Nov 22 03:30:49 crc kubenswrapper[4922]: I1122 03:30:49.165549 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="da35a869-b48f-4972-9a3f-703498998c6d" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Nov 22 03:30:49 crc kubenswrapper[4922]: I1122 03:30:49.165736 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="da35a869-b48f-4972-9a3f-703498998c6d" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Nov 22 03:30:49 crc kubenswrapper[4922]: I1122 03:30:49.165760 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="13f70de8-814c-42f0-900d-ed7ade87ecc8" containerName="collect-profiles" Nov 22 03:30:49 crc kubenswrapper[4922]: I1122 03:30:49.166348 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-b5522" Nov 22 03:30:49 crc kubenswrapper[4922]: I1122 03:30:49.168624 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 22 03:30:49 crc kubenswrapper[4922]: I1122 03:30:49.168834 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wzhgr" Nov 22 03:30:49 crc kubenswrapper[4922]: I1122 03:30:49.168960 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Nov 22 03:30:49 crc kubenswrapper[4922]: I1122 03:30:49.170143 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 03:30:49 crc kubenswrapper[4922]: I1122 03:30:49.170299 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 22 03:30:49 crc kubenswrapper[4922]: I1122 03:30:49.179356 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-b5522"] Nov 22 03:30:49 crc kubenswrapper[4922]: I1122 03:30:49.348665 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4010b4bb-bf3c-401b-ab08-bb238b56934a-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-b5522\" (UID: \"4010b4bb-bf3c-401b-ab08-bb238b56934a\") " pod="openstack/ssh-known-hosts-edpm-deployment-b5522" Nov 22 03:30:49 crc kubenswrapper[4922]: I1122 03:30:49.348737 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4010b4bb-bf3c-401b-ab08-bb238b56934a-ssh-key-openstack-edpm-ipam\") pod 
\"ssh-known-hosts-edpm-deployment-b5522\" (UID: \"4010b4bb-bf3c-401b-ab08-bb238b56934a\") " pod="openstack/ssh-known-hosts-edpm-deployment-b5522" Nov 22 03:30:49 crc kubenswrapper[4922]: I1122 03:30:49.348883 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp54m\" (UniqueName: \"kubernetes.io/projected/4010b4bb-bf3c-401b-ab08-bb238b56934a-kube-api-access-qp54m\") pod \"ssh-known-hosts-edpm-deployment-b5522\" (UID: \"4010b4bb-bf3c-401b-ab08-bb238b56934a\") " pod="openstack/ssh-known-hosts-edpm-deployment-b5522" Nov 22 03:30:49 crc kubenswrapper[4922]: I1122 03:30:49.349154 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4010b4bb-bf3c-401b-ab08-bb238b56934a-ceph\") pod \"ssh-known-hosts-edpm-deployment-b5522\" (UID: \"4010b4bb-bf3c-401b-ab08-bb238b56934a\") " pod="openstack/ssh-known-hosts-edpm-deployment-b5522" Nov 22 03:30:49 crc kubenswrapper[4922]: I1122 03:30:49.450758 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qp54m\" (UniqueName: \"kubernetes.io/projected/4010b4bb-bf3c-401b-ab08-bb238b56934a-kube-api-access-qp54m\") pod \"ssh-known-hosts-edpm-deployment-b5522\" (UID: \"4010b4bb-bf3c-401b-ab08-bb238b56934a\") " pod="openstack/ssh-known-hosts-edpm-deployment-b5522" Nov 22 03:30:49 crc kubenswrapper[4922]: I1122 03:30:49.450936 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4010b4bb-bf3c-401b-ab08-bb238b56934a-ceph\") pod \"ssh-known-hosts-edpm-deployment-b5522\" (UID: \"4010b4bb-bf3c-401b-ab08-bb238b56934a\") " pod="openstack/ssh-known-hosts-edpm-deployment-b5522" Nov 22 03:30:49 crc kubenswrapper[4922]: I1122 03:30:49.451022 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4010b4bb-bf3c-401b-ab08-bb238b56934a-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-b5522\" (UID: \"4010b4bb-bf3c-401b-ab08-bb238b56934a\") " pod="openstack/ssh-known-hosts-edpm-deployment-b5522" Nov 22 03:30:49 crc kubenswrapper[4922]: I1122 03:30:49.451060 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4010b4bb-bf3c-401b-ab08-bb238b56934a-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-b5522\" (UID: \"4010b4bb-bf3c-401b-ab08-bb238b56934a\") " pod="openstack/ssh-known-hosts-edpm-deployment-b5522" Nov 22 03:30:49 crc kubenswrapper[4922]: I1122 03:30:49.456372 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4010b4bb-bf3c-401b-ab08-bb238b56934a-ceph\") pod \"ssh-known-hosts-edpm-deployment-b5522\" (UID: \"4010b4bb-bf3c-401b-ab08-bb238b56934a\") " pod="openstack/ssh-known-hosts-edpm-deployment-b5522" Nov 22 03:30:49 crc kubenswrapper[4922]: I1122 03:30:49.456690 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4010b4bb-bf3c-401b-ab08-bb238b56934a-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-b5522\" (UID: \"4010b4bb-bf3c-401b-ab08-bb238b56934a\") " pod="openstack/ssh-known-hosts-edpm-deployment-b5522" Nov 22 03:30:49 crc kubenswrapper[4922]: I1122 03:30:49.469545 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4010b4bb-bf3c-401b-ab08-bb238b56934a-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-b5522\" (UID: \"4010b4bb-bf3c-401b-ab08-bb238b56934a\") " pod="openstack/ssh-known-hosts-edpm-deployment-b5522" Nov 22 03:30:49 crc kubenswrapper[4922]: I1122 03:30:49.480230 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp54m\" (UniqueName: \"kubernetes.io/projected/4010b4bb-bf3c-401b-ab08-bb238b56934a-kube-api-access-qp54m\") pod \"ssh-known-hosts-edpm-deployment-b5522\" (UID: \"4010b4bb-bf3c-401b-ab08-bb238b56934a\") " pod="openstack/ssh-known-hosts-edpm-deployment-b5522" Nov 22 03:30:49 crc kubenswrapper[4922]: I1122 03:30:49.498729 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-b5522" Nov 22 03:30:50 crc kubenswrapper[4922]: I1122 03:30:50.075384 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-b5522"] Nov 22 03:30:50 crc kubenswrapper[4922]: W1122 03:30:50.094988 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4010b4bb_bf3c_401b_ab08_bb238b56934a.slice/crio-c919a33c67b4f2b519c24f74739b78ddc0034ed34c263eabd18ae3840a635ea3 WatchSource:0}: Error finding container c919a33c67b4f2b519c24f74739b78ddc0034ed34c263eabd18ae3840a635ea3: Status 404 returned error can't find the container with id c919a33c67b4f2b519c24f74739b78ddc0034ed34c263eabd18ae3840a635ea3 Nov 22 03:30:51 crc kubenswrapper[4922]: I1122 03:30:51.090995 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-b5522" event={"ID":"4010b4bb-bf3c-401b-ab08-bb238b56934a","Type":"ContainerStarted","Data":"4f80535596bda81862d82e09370b7d4789a2cb4140ea9463936d84b55b607666"} Nov 22 03:30:51 crc kubenswrapper[4922]: I1122 03:30:51.091930 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-b5522" event={"ID":"4010b4bb-bf3c-401b-ab08-bb238b56934a","Type":"ContainerStarted","Data":"c919a33c67b4f2b519c24f74739b78ddc0034ed34c263eabd18ae3840a635ea3"} Nov 22 03:30:51 crc kubenswrapper[4922]: I1122 03:30:51.125652 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-b5522" podStartSLOduration=1.443571758 podStartE2EDuration="2.125624824s" podCreationTimestamp="2025-11-22 03:30:49 +0000 UTC" firstStartedPulling="2025-11-22 03:30:50.097781244 +0000 UTC m=+2286.136303176" lastFinishedPulling="2025-11-22 03:30:50.77983434 +0000 UTC m=+2286.818356242" observedRunningTime="2025-11-22 03:30:51.1127387 +0000 UTC m=+2287.151260672" watchObservedRunningTime="2025-11-22 03:30:51.125624824 +0000 UTC m=+2287.164146756" Nov 22 03:30:56 crc kubenswrapper[4922]: I1122 03:30:56.300563 4922 scope.go:117] "RemoveContainer" containerID="9a3eb4d79c66bae819bf80ebc7311991f8b0b1ea20e5bf891e55d01a3bffaa78" Nov 22 03:30:56 crc kubenswrapper[4922]: E1122 03:30:56.301684 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" 
podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:30:58 crc kubenswrapper[4922]: I1122 03:30:58.540816 4922 scope.go:117] "RemoveContainer" containerID="024beac7b23d601c940bf58c9f3abab9862b801d829aa9988305d392b2459708" Nov 22 03:31:02 crc kubenswrapper[4922]: I1122 03:31:02.212631 4922 generic.go:334] "Generic (PLEG): container finished" podID="4010b4bb-bf3c-401b-ab08-bb238b56934a" containerID="4f80535596bda81862d82e09370b7d4789a2cb4140ea9463936d84b55b607666" exitCode=0 Nov 22 03:31:02 crc kubenswrapper[4922]: I1122 03:31:02.212715 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-b5522" event={"ID":"4010b4bb-bf3c-401b-ab08-bb238b56934a","Type":"ContainerDied","Data":"4f80535596bda81862d82e09370b7d4789a2cb4140ea9463936d84b55b607666"} Nov 22 03:31:03 crc kubenswrapper[4922]: I1122 03:31:03.700884 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-b5522" Nov 22 03:31:03 crc kubenswrapper[4922]: I1122 03:31:03.860662 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4010b4bb-bf3c-401b-ab08-bb238b56934a-ssh-key-openstack-edpm-ipam\") pod \"4010b4bb-bf3c-401b-ab08-bb238b56934a\" (UID: \"4010b4bb-bf3c-401b-ab08-bb238b56934a\") " Nov 22 03:31:03 crc kubenswrapper[4922]: I1122 03:31:03.861067 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4010b4bb-bf3c-401b-ab08-bb238b56934a-inventory-0\") pod \"4010b4bb-bf3c-401b-ab08-bb238b56934a\" (UID: \"4010b4bb-bf3c-401b-ab08-bb238b56934a\") " Nov 22 03:31:03 crc kubenswrapper[4922]: I1122 03:31:03.861113 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qp54m\" (UniqueName: \"kubernetes.io/projected/4010b4bb-bf3c-401b-ab08-bb238b56934a-kube-api-access-qp54m\") pod \"4010b4bb-bf3c-401b-ab08-bb238b56934a\" (UID: \"4010b4bb-bf3c-401b-ab08-bb238b56934a\") " Nov 22 03:31:03 crc kubenswrapper[4922]: I1122 03:31:03.861152 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4010b4bb-bf3c-401b-ab08-bb238b56934a-ceph\") pod \"4010b4bb-bf3c-401b-ab08-bb238b56934a\" (UID: \"4010b4bb-bf3c-401b-ab08-bb238b56934a\") " Nov 22 03:31:03 crc kubenswrapper[4922]: I1122 03:31:03.866743 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4010b4bb-bf3c-401b-ab08-bb238b56934a-ceph" (OuterVolumeSpecName: "ceph") pod "4010b4bb-bf3c-401b-ab08-bb238b56934a" (UID: "4010b4bb-bf3c-401b-ab08-bb238b56934a"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:31:03 crc kubenswrapper[4922]: I1122 03:31:03.867347 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4010b4bb-bf3c-401b-ab08-bb238b56934a-kube-api-access-qp54m" (OuterVolumeSpecName: "kube-api-access-qp54m") pod "4010b4bb-bf3c-401b-ab08-bb238b56934a" (UID: "4010b4bb-bf3c-401b-ab08-bb238b56934a"). InnerVolumeSpecName "kube-api-access-qp54m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:31:03 crc kubenswrapper[4922]: I1122 03:31:03.895804 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4010b4bb-bf3c-401b-ab08-bb238b56934a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4010b4bb-bf3c-401b-ab08-bb238b56934a" (UID: "4010b4bb-bf3c-401b-ab08-bb238b56934a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:31:03 crc kubenswrapper[4922]: I1122 03:31:03.896716 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4010b4bb-bf3c-401b-ab08-bb238b56934a-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "4010b4bb-bf3c-401b-ab08-bb238b56934a" (UID: "4010b4bb-bf3c-401b-ab08-bb238b56934a"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:31:03 crc kubenswrapper[4922]: I1122 03:31:03.968694 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4010b4bb-bf3c-401b-ab08-bb238b56934a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Nov 22 03:31:03 crc kubenswrapper[4922]: I1122 03:31:03.969418 4922 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4010b4bb-bf3c-401b-ab08-bb238b56934a-inventory-0\") on node \"crc\" DevicePath \"\"" Nov 22 03:31:03 crc kubenswrapper[4922]: I1122 03:31:03.969452 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qp54m\" (UniqueName: \"kubernetes.io/projected/4010b4bb-bf3c-401b-ab08-bb238b56934a-kube-api-access-qp54m\") on node \"crc\" DevicePath \"\"" Nov 22 03:31:03 crc kubenswrapper[4922]: I1122 03:31:03.969472 4922 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4010b4bb-bf3c-401b-ab08-bb238b56934a-ceph\") on node \"crc\" DevicePath \"\"" Nov 22 03:31:04 crc kubenswrapper[4922]: I1122 03:31:04.235439 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-b5522" event={"ID":"4010b4bb-bf3c-401b-ab08-bb238b56934a","Type":"ContainerDied","Data":"c919a33c67b4f2b519c24f74739b78ddc0034ed34c263eabd18ae3840a635ea3"} Nov 22 03:31:04 crc kubenswrapper[4922]: I1122 03:31:04.235750 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c919a33c67b4f2b519c24f74739b78ddc0034ed34c263eabd18ae3840a635ea3" Nov 22 03:31:04 crc kubenswrapper[4922]: I1122 03:31:04.235508 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-b5522" Nov 22 03:31:04 crc kubenswrapper[4922]: I1122 03:31:04.343251 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-hnnvm"] Nov 22 03:31:04 crc kubenswrapper[4922]: E1122 03:31:04.343664 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4010b4bb-bf3c-401b-ab08-bb238b56934a" containerName="ssh-known-hosts-edpm-deployment" Nov 22 03:31:04 crc kubenswrapper[4922]: I1122 03:31:04.343690 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="4010b4bb-bf3c-401b-ab08-bb238b56934a" containerName="ssh-known-hosts-edpm-deployment" Nov 22 03:31:04 crc kubenswrapper[4922]: I1122 03:31:04.343999 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="4010b4bb-bf3c-401b-ab08-bb238b56934a" containerName="ssh-known-hosts-edpm-deployment" Nov 22 03:31:04 crc kubenswrapper[4922]: I1122 03:31:04.344719 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hnnvm" Nov 22 03:31:04 crc kubenswrapper[4922]: I1122 03:31:04.348157 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Nov 22 03:31:04 crc kubenswrapper[4922]: I1122 03:31:04.348202 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 22 03:31:04 crc kubenswrapper[4922]: I1122 03:31:04.348710 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 22 03:31:04 crc kubenswrapper[4922]: I1122 03:31:04.349133 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wzhgr" Nov 22 03:31:04 crc kubenswrapper[4922]: I1122 03:31:04.349375 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 03:31:04 crc kubenswrapper[4922]: I1122 03:31:04.377042 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-hnnvm"] Nov 22 03:31:04 crc kubenswrapper[4922]: I1122 03:31:04.480576 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85670af6-7929-40f9-8cb6-ef764c147917-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-hnnvm\" (UID: \"85670af6-7929-40f9-8cb6-ef764c147917\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hnnvm" Nov 22 03:31:04 crc kubenswrapper[4922]: I1122 03:31:04.481015 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/85670af6-7929-40f9-8cb6-ef764c147917-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-hnnvm\" (UID: \"85670af6-7929-40f9-8cb6-ef764c147917\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hnnvm" Nov 22 03:31:04 crc kubenswrapper[4922]: I1122 03:31:04.481224 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n62z\" (UniqueName: \"kubernetes.io/projected/85670af6-7929-40f9-8cb6-ef764c147917-kube-api-access-6n62z\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-hnnvm\" (UID: \"85670af6-7929-40f9-8cb6-ef764c147917\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hnnvm" Nov 22 03:31:04 crc kubenswrapper[4922]: I1122 
03:31:04.481551 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/85670af6-7929-40f9-8cb6-ef764c147917-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-hnnvm\" (UID: \"85670af6-7929-40f9-8cb6-ef764c147917\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hnnvm" Nov 22 03:31:04 crc kubenswrapper[4922]: E1122 03:31:04.483979 4922 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4010b4bb_bf3c_401b_ab08_bb238b56934a.slice\": RecentStats: unable to find data in memory cache]" Nov 22 03:31:04 crc kubenswrapper[4922]: I1122 03:31:04.583534 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/85670af6-7929-40f9-8cb6-ef764c147917-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-hnnvm\" (UID: \"85670af6-7929-40f9-8cb6-ef764c147917\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hnnvm" Nov 22 03:31:04 crc kubenswrapper[4922]: I1122 03:31:04.583640 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85670af6-7929-40f9-8cb6-ef764c147917-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-hnnvm\" (UID: \"85670af6-7929-40f9-8cb6-ef764c147917\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hnnvm" Nov 22 03:31:04 crc kubenswrapper[4922]: I1122 03:31:04.583745 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/85670af6-7929-40f9-8cb6-ef764c147917-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-hnnvm\" (UID: \"85670af6-7929-40f9-8cb6-ef764c147917\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hnnvm" Nov 22 03:31:04 crc kubenswrapper[4922]: I1122 03:31:04.583828 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n62z\" (UniqueName: \"kubernetes.io/projected/85670af6-7929-40f9-8cb6-ef764c147917-kube-api-access-6n62z\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-hnnvm\" (UID: \"85670af6-7929-40f9-8cb6-ef764c147917\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hnnvm" Nov 22 03:31:04 crc kubenswrapper[4922]: I1122 03:31:04.590187 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/85670af6-7929-40f9-8cb6-ef764c147917-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-hnnvm\" (UID: \"85670af6-7929-40f9-8cb6-ef764c147917\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hnnvm" Nov 22 03:31:04 crc kubenswrapper[4922]: I1122 03:31:04.591315 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85670af6-7929-40f9-8cb6-ef764c147917-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-hnnvm\" (UID: \"85670af6-7929-40f9-8cb6-ef764c147917\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hnnvm" Nov 22 03:31:04 crc kubenswrapper[4922]: I1122 03:31:04.592375 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/85670af6-7929-40f9-8cb6-ef764c147917-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-hnnvm\" (UID: \"85670af6-7929-40f9-8cb6-ef764c147917\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hnnvm" Nov 22 03:31:04 crc kubenswrapper[4922]: I1122 03:31:04.616894 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n62z\" (UniqueName: \"kubernetes.io/projected/85670af6-7929-40f9-8cb6-ef764c147917-kube-api-access-6n62z\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-hnnvm\" (UID: \"85670af6-7929-40f9-8cb6-ef764c147917\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hnnvm" Nov 22 03:31:04 crc kubenswrapper[4922]: I1122 03:31:04.680265 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hnnvm" Nov 22 03:31:05 crc kubenswrapper[4922]: I1122 03:31:05.229646 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-hnnvm"] Nov 22 03:31:05 crc kubenswrapper[4922]: W1122 03:31:05.237921 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85670af6_7929_40f9_8cb6_ef764c147917.slice/crio-ec98f311bf19a0fb50d5a8b478e00ebb938f27b767465e021ba0c16d07bd4728 WatchSource:0}: Error finding container ec98f311bf19a0fb50d5a8b478e00ebb938f27b767465e021ba0c16d07bd4728: Status 404 returned error can't find the container with id ec98f311bf19a0fb50d5a8b478e00ebb938f27b767465e021ba0c16d07bd4728 Nov 22 03:31:06 crc kubenswrapper[4922]: I1122 03:31:06.253518 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hnnvm" event={"ID":"85670af6-7929-40f9-8cb6-ef764c147917","Type":"ContainerStarted","Data":"c453d87abb8e5479d8d7f19f1440436773f999e0392ef756c95ab5d661214228"} Nov 22 03:31:06 crc kubenswrapper[4922]: I1122 03:31:06.254102 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hnnvm" event={"ID":"85670af6-7929-40f9-8cb6-ef764c147917","Type":"ContainerStarted","Data":"ec98f311bf19a0fb50d5a8b478e00ebb938f27b767465e021ba0c16d07bd4728"} Nov 22 03:31:06 crc kubenswrapper[4922]: I1122 03:31:06.276972 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hnnvm" podStartSLOduration=1.578674736 podStartE2EDuration="2.276951477s" podCreationTimestamp="2025-11-22 03:31:04 +0000 UTC" firstStartedPulling="2025-11-22 03:31:05.239737079 +0000 UTC m=+2301.278258971" lastFinishedPulling="2025-11-22 03:31:05.93801382 +0000 UTC m=+2301.976535712" observedRunningTime="2025-11-22 03:31:06.272734835 +0000 UTC m=+2302.311256737" watchObservedRunningTime="2025-11-22 03:31:06.276951477 +0000 UTC m=+2302.315473379" Nov 22 03:31:09 crc kubenswrapper[4922]: I1122 03:31:09.301600 4922 scope.go:117] "RemoveContainer" containerID="9a3eb4d79c66bae819bf80ebc7311991f8b0b1ea20e5bf891e55d01a3bffaa78" Nov 22 03:31:09 crc kubenswrapper[4922]: E1122 03:31:09.302335 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:31:12 crc kubenswrapper[4922]: I1122 03:31:12.337563 4922 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-m57t7"] Nov 22 03:31:12 crc kubenswrapper[4922]: I1122 03:31:12.340291 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m57t7" Nov 22 03:31:12 crc kubenswrapper[4922]: I1122 03:31:12.354271 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m57t7"] Nov 22 03:31:12 crc kubenswrapper[4922]: I1122 03:31:12.402629 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2tz9\" (UniqueName: \"kubernetes.io/projected/5588a3f7-4625-45c9-bf9c-3cb0732d44e0-kube-api-access-q2tz9\") pod \"community-operators-m57t7\" (UID: \"5588a3f7-4625-45c9-bf9c-3cb0732d44e0\") " pod="openshift-marketplace/community-operators-m57t7" Nov 22 03:31:12 crc kubenswrapper[4922]: I1122 03:31:12.402966 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5588a3f7-4625-45c9-bf9c-3cb0732d44e0-utilities\") pod \"community-operators-m57t7\" (UID: \"5588a3f7-4625-45c9-bf9c-3cb0732d44e0\") " pod="openshift-marketplace/community-operators-m57t7" Nov 22 03:31:12 crc kubenswrapper[4922]: I1122 03:31:12.403308 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5588a3f7-4625-45c9-bf9c-3cb0732d44e0-catalog-content\") pod \"community-operators-m57t7\" (UID: \"5588a3f7-4625-45c9-bf9c-3cb0732d44e0\") " pod="openshift-marketplace/community-operators-m57t7" Nov 22 03:31:12 crc kubenswrapper[4922]: I1122 03:31:12.504701 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5588a3f7-4625-45c9-bf9c-3cb0732d44e0-utilities\") pod \"community-operators-m57t7\" (UID: \"5588a3f7-4625-45c9-bf9c-3cb0732d44e0\") " pod="openshift-marketplace/community-operators-m57t7" Nov 22 03:31:12 crc kubenswrapper[4922]: I1122 03:31:12.504932 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5588a3f7-4625-45c9-bf9c-3cb0732d44e0-catalog-content\") pod \"community-operators-m57t7\" (UID: \"5588a3f7-4625-45c9-bf9c-3cb0732d44e0\") " pod="openshift-marketplace/community-operators-m57t7" Nov 22 03:31:12 crc kubenswrapper[4922]: I1122 03:31:12.504974 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2tz9\" (UniqueName: \"kubernetes.io/projected/5588a3f7-4625-45c9-bf9c-3cb0732d44e0-kube-api-access-q2tz9\") pod \"community-operators-m57t7\" (UID: \"5588a3f7-4625-45c9-bf9c-3cb0732d44e0\") " pod="openshift-marketplace/community-operators-m57t7" Nov 22 03:31:12 crc kubenswrapper[4922]: I1122 03:31:12.505231 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5588a3f7-4625-45c9-bf9c-3cb0732d44e0-utilities\") pod \"community-operators-m57t7\" (UID: \"5588a3f7-4625-45c9-bf9c-3cb0732d44e0\") " pod="openshift-marketplace/community-operators-m57t7" Nov 22 03:31:12 crc kubenswrapper[4922]: I1122 03:31:12.505464 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5588a3f7-4625-45c9-bf9c-3cb0732d44e0-catalog-content\") pod \"community-operators-m57t7\" (UID: 
\"5588a3f7-4625-45c9-bf9c-3cb0732d44e0\") " pod="openshift-marketplace/community-operators-m57t7" Nov 22 03:31:12 crc kubenswrapper[4922]: I1122 03:31:12.538088 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2tz9\" (UniqueName: \"kubernetes.io/projected/5588a3f7-4625-45c9-bf9c-3cb0732d44e0-kube-api-access-q2tz9\") pod \"community-operators-m57t7\" (UID: \"5588a3f7-4625-45c9-bf9c-3cb0732d44e0\") " pod="openshift-marketplace/community-operators-m57t7" Nov 22 03:31:12 crc kubenswrapper[4922]: I1122 03:31:12.671676 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m57t7" Nov 22 03:31:13 crc kubenswrapper[4922]: I1122 03:31:13.232972 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m57t7"] Nov 22 03:31:13 crc kubenswrapper[4922]: W1122 03:31:13.237862 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5588a3f7_4625_45c9_bf9c_3cb0732d44e0.slice/crio-2ed3468ec09c5000317ee60f244cd55df2aeecd341e1fa9f8a5813c02cacacd7 WatchSource:0}: Error finding container 2ed3468ec09c5000317ee60f244cd55df2aeecd341e1fa9f8a5813c02cacacd7: Status 404 returned error can't find the container with id 2ed3468ec09c5000317ee60f244cd55df2aeecd341e1fa9f8a5813c02cacacd7 Nov 22 03:31:13 crc kubenswrapper[4922]: I1122 03:31:13.337953 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m57t7" event={"ID":"5588a3f7-4625-45c9-bf9c-3cb0732d44e0","Type":"ContainerStarted","Data":"2ed3468ec09c5000317ee60f244cd55df2aeecd341e1fa9f8a5813c02cacacd7"} Nov 22 03:31:14 crc kubenswrapper[4922]: I1122 03:31:14.352220 4922 generic.go:334] "Generic (PLEG): container finished" podID="5588a3f7-4625-45c9-bf9c-3cb0732d44e0" containerID="b33a911bc4b60e8b6a0871be1669ca416e8a530bd97544ff85a39548d8daf902" exitCode=0 Nov 22 03:31:14 crc kubenswrapper[4922]: I1122 03:31:14.352302 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m57t7" event={"ID":"5588a3f7-4625-45c9-bf9c-3cb0732d44e0","Type":"ContainerDied","Data":"b33a911bc4b60e8b6a0871be1669ca416e8a530bd97544ff85a39548d8daf902"} Nov 22 03:31:14 crc kubenswrapper[4922]: I1122 03:31:14.358164 4922 generic.go:334] "Generic (PLEG): container finished" podID="85670af6-7929-40f9-8cb6-ef764c147917" containerID="c453d87abb8e5479d8d7f19f1440436773f999e0392ef756c95ab5d661214228" exitCode=0 Nov 22 03:31:14 crc kubenswrapper[4922]: I1122 03:31:14.358236 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hnnvm" event={"ID":"85670af6-7929-40f9-8cb6-ef764c147917","Type":"ContainerDied","Data":"c453d87abb8e5479d8d7f19f1440436773f999e0392ef756c95ab5d661214228"} Nov 22 03:31:15 crc kubenswrapper[4922]: I1122 03:31:15.378138 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m57t7" event={"ID":"5588a3f7-4625-45c9-bf9c-3cb0732d44e0","Type":"ContainerStarted","Data":"d3d1753930af1984eb9afa7c55be9393239cf99cc4ad49087ca1798e4e29db53"} Nov 22 03:31:15 crc kubenswrapper[4922]: I1122 03:31:15.908174 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hnnvm" Nov 22 03:31:15 crc kubenswrapper[4922]: I1122 03:31:15.972550 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6n62z\" (UniqueName: \"kubernetes.io/projected/85670af6-7929-40f9-8cb6-ef764c147917-kube-api-access-6n62z\") pod \"85670af6-7929-40f9-8cb6-ef764c147917\" (UID: \"85670af6-7929-40f9-8cb6-ef764c147917\") " Nov 22 03:31:15 crc kubenswrapper[4922]: I1122 03:31:15.972631 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85670af6-7929-40f9-8cb6-ef764c147917-inventory\") pod \"85670af6-7929-40f9-8cb6-ef764c147917\" (UID: \"85670af6-7929-40f9-8cb6-ef764c147917\") " Nov 22 03:31:15 crc kubenswrapper[4922]: I1122 03:31:15.972803 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/85670af6-7929-40f9-8cb6-ef764c147917-ceph\") pod \"85670af6-7929-40f9-8cb6-ef764c147917\" (UID: \"85670af6-7929-40f9-8cb6-ef764c147917\") " Nov 22 03:31:15 crc kubenswrapper[4922]: I1122 03:31:15.972939 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/85670af6-7929-40f9-8cb6-ef764c147917-ssh-key\") pod \"85670af6-7929-40f9-8cb6-ef764c147917\" (UID: \"85670af6-7929-40f9-8cb6-ef764c147917\") " Nov 22 03:31:15 crc kubenswrapper[4922]: I1122 03:31:15.984379 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85670af6-7929-40f9-8cb6-ef764c147917-ceph" (OuterVolumeSpecName: "ceph") pod "85670af6-7929-40f9-8cb6-ef764c147917" (UID: "85670af6-7929-40f9-8cb6-ef764c147917"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:31:15 crc kubenswrapper[4922]: I1122 03:31:15.984647 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85670af6-7929-40f9-8cb6-ef764c147917-kube-api-access-6n62z" (OuterVolumeSpecName: "kube-api-access-6n62z") pod "85670af6-7929-40f9-8cb6-ef764c147917" (UID: "85670af6-7929-40f9-8cb6-ef764c147917"). InnerVolumeSpecName "kube-api-access-6n62z". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:31:16 crc kubenswrapper[4922]: I1122 03:31:16.015591 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85670af6-7929-40f9-8cb6-ef764c147917-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "85670af6-7929-40f9-8cb6-ef764c147917" (UID: "85670af6-7929-40f9-8cb6-ef764c147917"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:31:16 crc kubenswrapper[4922]: I1122 03:31:16.020340 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85670af6-7929-40f9-8cb6-ef764c147917-inventory" (OuterVolumeSpecName: "inventory") pod "85670af6-7929-40f9-8cb6-ef764c147917" (UID: "85670af6-7929-40f9-8cb6-ef764c147917"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:31:16 crc kubenswrapper[4922]: I1122 03:31:16.076292 4922 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/85670af6-7929-40f9-8cb6-ef764c147917-ceph\") on node \"crc\" DevicePath \"\"" Nov 22 03:31:16 crc kubenswrapper[4922]: I1122 03:31:16.076330 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/85670af6-7929-40f9-8cb6-ef764c147917-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 03:31:16 crc kubenswrapper[4922]: I1122 03:31:16.076344 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6n62z\" (UniqueName: \"kubernetes.io/projected/85670af6-7929-40f9-8cb6-ef764c147917-kube-api-access-6n62z\") on node \"crc\" DevicePath \"\"" Nov 22 03:31:16 crc kubenswrapper[4922]: I1122 03:31:16.076354 4922 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/85670af6-7929-40f9-8cb6-ef764c147917-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 03:31:16 crc kubenswrapper[4922]: I1122 03:31:16.394654 4922 generic.go:334] "Generic (PLEG): container finished" podID="5588a3f7-4625-45c9-bf9c-3cb0732d44e0" containerID="d3d1753930af1984eb9afa7c55be9393239cf99cc4ad49087ca1798e4e29db53" exitCode=0 Nov 22 03:31:16 crc kubenswrapper[4922]: I1122 03:31:16.394718 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m57t7" event={"ID":"5588a3f7-4625-45c9-bf9c-3cb0732d44e0","Type":"ContainerDied","Data":"d3d1753930af1984eb9afa7c55be9393239cf99cc4ad49087ca1798e4e29db53"} Nov 22 03:31:16 crc kubenswrapper[4922]: I1122 03:31:16.397523 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hnnvm" event={"ID":"85670af6-7929-40f9-8cb6-ef764c147917","Type":"ContainerDied","Data":"ec98f311bf19a0fb50d5a8b478e00ebb938f27b767465e021ba0c16d07bd4728"} Nov 22 03:31:16 crc kubenswrapper[4922]: I1122 03:31:16.397567 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec98f311bf19a0fb50d5a8b478e00ebb938f27b767465e021ba0c16d07bd4728" Nov 22 03:31:16 crc kubenswrapper[4922]: I1122 03:31:16.397614 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-hnnvm" Nov 22 03:31:16 crc kubenswrapper[4922]: I1122 03:31:16.508995 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-npqqk"] Nov 22 03:31:16 crc kubenswrapper[4922]: E1122 03:31:16.509402 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85670af6-7929-40f9-8cb6-ef764c147917" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Nov 22 03:31:16 crc kubenswrapper[4922]: I1122 03:31:16.509418 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="85670af6-7929-40f9-8cb6-ef764c147917" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Nov 22 03:31:16 crc kubenswrapper[4922]: I1122 03:31:16.509760 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="85670af6-7929-40f9-8cb6-ef764c147917" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Nov 22 03:31:16 crc kubenswrapper[4922]: I1122 03:31:16.510558 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-npqqk" Nov 22 03:31:16 crc kubenswrapper[4922]: I1122 03:31:16.516589 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Nov 22 03:31:16 crc kubenswrapper[4922]: I1122 03:31:16.516884 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 22 03:31:16 crc kubenswrapper[4922]: I1122 03:31:16.516907 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 22 03:31:16 crc kubenswrapper[4922]: I1122 03:31:16.517298 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 03:31:16 crc kubenswrapper[4922]: I1122 03:31:16.517322 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wzhgr" Nov 22 03:31:16 crc kubenswrapper[4922]: I1122 03:31:16.543790 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-npqqk"] Nov 22 03:31:16 crc kubenswrapper[4922]: I1122 03:31:16.587864 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b00860c0-4679-4cee-9185-94524381a6da-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-npqqk\" (UID: \"b00860c0-4679-4cee-9185-94524381a6da\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-npqqk" Nov 22 03:31:16 crc kubenswrapper[4922]: I1122 03:31:16.587935 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b00860c0-4679-4cee-9185-94524381a6da-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-npqqk\" (UID: \"b00860c0-4679-4cee-9185-94524381a6da\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-npqqk" Nov 22 03:31:16 crc kubenswrapper[4922]: I1122 03:31:16.587991 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b00860c0-4679-4cee-9185-94524381a6da-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-npqqk\" (UID: \"b00860c0-4679-4cee-9185-94524381a6da\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-npqqk" Nov 22 03:31:16 crc kubenswrapper[4922]: I1122 03:31:16.588222 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqrjq\" (UniqueName: \"kubernetes.io/projected/b00860c0-4679-4cee-9185-94524381a6da-kube-api-access-lqrjq\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-npqqk\" (UID: \"b00860c0-4679-4cee-9185-94524381a6da\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-npqqk" Nov 22 03:31:16 crc kubenswrapper[4922]: I1122 03:31:16.689955 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b00860c0-4679-4cee-9185-94524381a6da-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-npqqk\" (UID: \"b00860c0-4679-4cee-9185-94524381a6da\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-npqqk" Nov 22 03:31:16 crc kubenswrapper[4922]: I1122 03:31:16.690027 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/b00860c0-4679-4cee-9185-94524381a6da-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-npqqk\" (UID: \"b00860c0-4679-4cee-9185-94524381a6da\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-npqqk" Nov 22 03:31:16 crc kubenswrapper[4922]: I1122 03:31:16.690079 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b00860c0-4679-4cee-9185-94524381a6da-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-npqqk\" (UID: \"b00860c0-4679-4cee-9185-94524381a6da\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-npqqk" Nov 22 03:31:16 crc kubenswrapper[4922]: I1122 03:31:16.690135 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqrjq\" (UniqueName: \"kubernetes.io/projected/b00860c0-4679-4cee-9185-94524381a6da-kube-api-access-lqrjq\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-npqqk\" (UID: \"b00860c0-4679-4cee-9185-94524381a6da\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-npqqk" Nov 22 03:31:16 crc kubenswrapper[4922]: I1122 03:31:16.695621 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b00860c0-4679-4cee-9185-94524381a6da-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-npqqk\" (UID: \"b00860c0-4679-4cee-9185-94524381a6da\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-npqqk" Nov 22 03:31:16 crc kubenswrapper[4922]: I1122 03:31:16.697834 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b00860c0-4679-4cee-9185-94524381a6da-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-npqqk\" (UID: \"b00860c0-4679-4cee-9185-94524381a6da\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-npqqk" Nov 22 03:31:16 crc kubenswrapper[4922]: I1122 03:31:16.699366 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b00860c0-4679-4cee-9185-94524381a6da-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-npqqk\" (UID: \"b00860c0-4679-4cee-9185-94524381a6da\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-npqqk" Nov 22 03:31:16 crc kubenswrapper[4922]: I1122 03:31:16.717654 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqrjq\" (UniqueName: \"kubernetes.io/projected/b00860c0-4679-4cee-9185-94524381a6da-kube-api-access-lqrjq\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-npqqk\" (UID: \"b00860c0-4679-4cee-9185-94524381a6da\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-npqqk" Nov 22 03:31:16 crc kubenswrapper[4922]: I1122 03:31:16.846620 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-npqqk" Nov 22 03:31:17 crc kubenswrapper[4922]: I1122 03:31:17.149288 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-npqqk"] Nov 22 03:31:17 crc kubenswrapper[4922]: W1122 03:31:17.157281 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb00860c0_4679_4cee_9185_94524381a6da.slice/crio-d15fef1fd4ce1100a97c0ecc067f54812dd661cc8c58a1ab420b131d830740b1 WatchSource:0}: Error finding container d15fef1fd4ce1100a97c0ecc067f54812dd661cc8c58a1ab420b131d830740b1: Status 404 returned error can't find the container with id d15fef1fd4ce1100a97c0ecc067f54812dd661cc8c58a1ab420b131d830740b1 Nov 22 03:31:17 crc kubenswrapper[4922]: I1122 03:31:17.414768 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m57t7" event={"ID":"5588a3f7-4625-45c9-bf9c-3cb0732d44e0","Type":"ContainerStarted","Data":"931bf7fac520fda79519bd2dc1a10ee1a4eef1b55a4fc22c9649ec9974bf2ae5"} Nov 22 03:31:17 crc kubenswrapper[4922]: I1122 03:31:17.418662 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-npqqk" event={"ID":"b00860c0-4679-4cee-9185-94524381a6da","Type":"ContainerStarted","Data":"d15fef1fd4ce1100a97c0ecc067f54812dd661cc8c58a1ab420b131d830740b1"} Nov 22 03:31:17 crc kubenswrapper[4922]: I1122 03:31:17.446890 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-m57t7" podStartSLOduration=3.014089349 podStartE2EDuration="5.446835025s" podCreationTimestamp="2025-11-22 03:31:12 +0000 UTC" firstStartedPulling="2025-11-22 03:31:14.35489049 +0000 UTC m=+2310.393412422" lastFinishedPulling="2025-11-22 03:31:16.787636206 +0000 UTC m=+2312.826158098" observedRunningTime="2025-11-22 03:31:17.436002331 +0000 UTC m=+2313.474524223" watchObservedRunningTime="2025-11-22 03:31:17.446835025 +0000 UTC m=+2313.485356917" Nov 22 03:31:18 crc kubenswrapper[4922]: I1122 03:31:18.430319 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-npqqk" event={"ID":"b00860c0-4679-4cee-9185-94524381a6da","Type":"ContainerStarted","Data":"92cbf569a07f469be0871bc5723f410ff9ef35a076c2a20c2541da8ed82d5944"} Nov 22 03:31:18 crc kubenswrapper[4922]: I1122 03:31:18.451736 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-npqqk" podStartSLOduration=2.007124923 podStartE2EDuration="2.451720065s" podCreationTimestamp="2025-11-22 03:31:16 +0000 UTC" firstStartedPulling="2025-11-22 03:31:17.159571516 +0000 UTC m=+2313.198093418" lastFinishedPulling="2025-11-22 03:31:17.604166668 +0000 UTC m=+2313.642688560" observedRunningTime="2025-11-22 03:31:18.449673916 +0000 UTC m=+2314.488195798" watchObservedRunningTime="2025-11-22 03:31:18.451720065 +0000 UTC m=+2314.490241957" Nov 22 03:31:22 crc kubenswrapper[4922]: I1122 03:31:22.672507 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-m57t7" Nov 22 03:31:22 crc kubenswrapper[4922]: I1122 03:31:22.673252 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-m57t7" Nov 22 03:31:22 crc kubenswrapper[4922]: I1122 03:31:22.759677 4922 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-m57t7" Nov 22 03:31:23 crc kubenswrapper[4922]: I1122 03:31:23.566458 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-m57t7" Nov 22 03:31:23 crc kubenswrapper[4922]: I1122 03:31:23.633505 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m57t7"] Nov 22 03:31:24 crc kubenswrapper[4922]: I1122 03:31:24.301143 4922 scope.go:117] "RemoveContainer" containerID="9a3eb4d79c66bae819bf80ebc7311991f8b0b1ea20e5bf891e55d01a3bffaa78" Nov 22 03:31:24 crc kubenswrapper[4922]: E1122 03:31:24.302099 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:31:25 crc kubenswrapper[4922]: I1122 03:31:25.506547 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-m57t7" podUID="5588a3f7-4625-45c9-bf9c-3cb0732d44e0" containerName="registry-server" containerID="cri-o://931bf7fac520fda79519bd2dc1a10ee1a4eef1b55a4fc22c9649ec9974bf2ae5" gracePeriod=2 Nov 22 03:31:25 crc kubenswrapper[4922]: I1122 03:31:25.995633 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m57t7" Nov 22 03:31:26 crc kubenswrapper[4922]: I1122 03:31:26.090474 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5588a3f7-4625-45c9-bf9c-3cb0732d44e0-utilities\") pod \"5588a3f7-4625-45c9-bf9c-3cb0732d44e0\" (UID: \"5588a3f7-4625-45c9-bf9c-3cb0732d44e0\") " Nov 22 03:31:26 crc kubenswrapper[4922]: I1122 03:31:26.092482 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2tz9\" (UniqueName: \"kubernetes.io/projected/5588a3f7-4625-45c9-bf9c-3cb0732d44e0-kube-api-access-q2tz9\") pod \"5588a3f7-4625-45c9-bf9c-3cb0732d44e0\" (UID: \"5588a3f7-4625-45c9-bf9c-3cb0732d44e0\") " Nov 22 03:31:26 crc kubenswrapper[4922]: I1122 03:31:26.092553 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5588a3f7-4625-45c9-bf9c-3cb0732d44e0-catalog-content\") pod \"5588a3f7-4625-45c9-bf9c-3cb0732d44e0\" (UID: \"5588a3f7-4625-45c9-bf9c-3cb0732d44e0\") " Nov 22 03:31:26 crc kubenswrapper[4922]: I1122 03:31:26.092659 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5588a3f7-4625-45c9-bf9c-3cb0732d44e0-utilities" (OuterVolumeSpecName: "utilities") pod "5588a3f7-4625-45c9-bf9c-3cb0732d44e0" (UID: "5588a3f7-4625-45c9-bf9c-3cb0732d44e0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:31:26 crc kubenswrapper[4922]: I1122 03:31:26.093419 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5588a3f7-4625-45c9-bf9c-3cb0732d44e0-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 03:31:26 crc kubenswrapper[4922]: I1122 03:31:26.099107 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5588a3f7-4625-45c9-bf9c-3cb0732d44e0-kube-api-access-q2tz9" (OuterVolumeSpecName: "kube-api-access-q2tz9") pod "5588a3f7-4625-45c9-bf9c-3cb0732d44e0" (UID: "5588a3f7-4625-45c9-bf9c-3cb0732d44e0"). InnerVolumeSpecName "kube-api-access-q2tz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:31:26 crc kubenswrapper[4922]: I1122 03:31:26.195430 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2tz9\" (UniqueName: \"kubernetes.io/projected/5588a3f7-4625-45c9-bf9c-3cb0732d44e0-kube-api-access-q2tz9\") on node \"crc\" DevicePath \"\"" Nov 22 03:31:26 crc kubenswrapper[4922]: I1122 03:31:26.518296 4922 generic.go:334] "Generic (PLEG): container finished" podID="5588a3f7-4625-45c9-bf9c-3cb0732d44e0" containerID="931bf7fac520fda79519bd2dc1a10ee1a4eef1b55a4fc22c9649ec9974bf2ae5" exitCode=0 Nov 22 03:31:26 crc kubenswrapper[4922]: I1122 03:31:26.518354 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m57t7" event={"ID":"5588a3f7-4625-45c9-bf9c-3cb0732d44e0","Type":"ContainerDied","Data":"931bf7fac520fda79519bd2dc1a10ee1a4eef1b55a4fc22c9649ec9974bf2ae5"} Nov 22 03:31:26 crc kubenswrapper[4922]: I1122 03:31:26.518401 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m57t7" Nov 22 03:31:26 crc kubenswrapper[4922]: I1122 03:31:26.518434 4922 scope.go:117] "RemoveContainer" containerID="931bf7fac520fda79519bd2dc1a10ee1a4eef1b55a4fc22c9649ec9974bf2ae5" Nov 22 03:31:26 crc kubenswrapper[4922]: I1122 03:31:26.518414 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m57t7" event={"ID":"5588a3f7-4625-45c9-bf9c-3cb0732d44e0","Type":"ContainerDied","Data":"2ed3468ec09c5000317ee60f244cd55df2aeecd341e1fa9f8a5813c02cacacd7"} Nov 22 03:31:26 crc kubenswrapper[4922]: I1122 03:31:26.547726 4922 scope.go:117] "RemoveContainer" containerID="d3d1753930af1984eb9afa7c55be9393239cf99cc4ad49087ca1798e4e29db53" Nov 22 03:31:26 crc kubenswrapper[4922]: I1122 03:31:26.580719 4922 scope.go:117] "RemoveContainer" containerID="b33a911bc4b60e8b6a0871be1669ca416e8a530bd97544ff85a39548d8daf902" Nov 22 03:31:26 crc kubenswrapper[4922]: I1122 03:31:26.637773 4922 scope.go:117] "RemoveContainer" containerID="931bf7fac520fda79519bd2dc1a10ee1a4eef1b55a4fc22c9649ec9974bf2ae5" Nov 22 03:31:26 crc kubenswrapper[4922]: E1122 03:31:26.638543 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"931bf7fac520fda79519bd2dc1a10ee1a4eef1b55a4fc22c9649ec9974bf2ae5\": container with ID starting with 931bf7fac520fda79519bd2dc1a10ee1a4eef1b55a4fc22c9649ec9974bf2ae5 not found: ID does not exist" containerID="931bf7fac520fda79519bd2dc1a10ee1a4eef1b55a4fc22c9649ec9974bf2ae5" Nov 22 03:31:26 crc kubenswrapper[4922]: I1122 03:31:26.638613 4922 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"931bf7fac520fda79519bd2dc1a10ee1a4eef1b55a4fc22c9649ec9974bf2ae5"} err="failed to get container status \"931bf7fac520fda79519bd2dc1a10ee1a4eef1b55a4fc22c9649ec9974bf2ae5\": rpc error: code = NotFound desc = could not find container \"931bf7fac520fda79519bd2dc1a10ee1a4eef1b55a4fc22c9649ec9974bf2ae5\": container with ID starting with 931bf7fac520fda79519bd2dc1a10ee1a4eef1b55a4fc22c9649ec9974bf2ae5 not found: ID does not exist" Nov 22 03:31:26 crc kubenswrapper[4922]: I1122 03:31:26.638693 4922 scope.go:117] "RemoveContainer" containerID="d3d1753930af1984eb9afa7c55be9393239cf99cc4ad49087ca1798e4e29db53" Nov 22 03:31:26 crc kubenswrapper[4922]: E1122 03:31:26.639326 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3d1753930af1984eb9afa7c55be9393239cf99cc4ad49087ca1798e4e29db53\": container with ID starting with d3d1753930af1984eb9afa7c55be9393239cf99cc4ad49087ca1798e4e29db53 not found: ID does not exist" containerID="d3d1753930af1984eb9afa7c55be9393239cf99cc4ad49087ca1798e4e29db53" Nov 22 03:31:26 crc kubenswrapper[4922]: I1122 03:31:26.639388 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3d1753930af1984eb9afa7c55be9393239cf99cc4ad49087ca1798e4e29db53"} err="failed to get container status \"d3d1753930af1984eb9afa7c55be9393239cf99cc4ad49087ca1798e4e29db53\": rpc error: code = NotFound desc = could not find container \"d3d1753930af1984eb9afa7c55be9393239cf99cc4ad49087ca1798e4e29db53\": container with ID starting with d3d1753930af1984eb9afa7c55be9393239cf99cc4ad49087ca1798e4e29db53 not found: ID does not exist" Nov 22 03:31:26 crc kubenswrapper[4922]: I1122 03:31:26.639436 4922 scope.go:117] "RemoveContainer" containerID="b33a911bc4b60e8b6a0871be1669ca416e8a530bd97544ff85a39548d8daf902" Nov 22 03:31:26 crc kubenswrapper[4922]: E1122 03:31:26.639858 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b33a911bc4b60e8b6a0871be1669ca416e8a530bd97544ff85a39548d8daf902\": container with ID starting with b33a911bc4b60e8b6a0871be1669ca416e8a530bd97544ff85a39548d8daf902 not found: ID does not exist" containerID="b33a911bc4b60e8b6a0871be1669ca416e8a530bd97544ff85a39548d8daf902" Nov 22 03:31:26 crc kubenswrapper[4922]: I1122 03:31:26.639901 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b33a911bc4b60e8b6a0871be1669ca416e8a530bd97544ff85a39548d8daf902"} err="failed to get container status \"b33a911bc4b60e8b6a0871be1669ca416e8a530bd97544ff85a39548d8daf902\": rpc error: code = NotFound desc = could not find container \"b33a911bc4b60e8b6a0871be1669ca416e8a530bd97544ff85a39548d8daf902\": container with ID starting with b33a911bc4b60e8b6a0871be1669ca416e8a530bd97544ff85a39548d8daf902 not found: ID does not exist" Nov 22 03:31:26 crc kubenswrapper[4922]: I1122 03:31:26.758893 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5588a3f7-4625-45c9-bf9c-3cb0732d44e0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5588a3f7-4625-45c9-bf9c-3cb0732d44e0" (UID: "5588a3f7-4625-45c9-bf9c-3cb0732d44e0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:31:26 crc kubenswrapper[4922]: I1122 03:31:26.809304 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5588a3f7-4625-45c9-bf9c-3cb0732d44e0-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 03:31:26 crc kubenswrapper[4922]: I1122 03:31:26.873585 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m57t7"] Nov 22 03:31:26 crc kubenswrapper[4922]: I1122 03:31:26.883593 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-m57t7"] Nov 22 03:31:27 crc kubenswrapper[4922]: I1122 03:31:27.319559 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5588a3f7-4625-45c9-bf9c-3cb0732d44e0" path="/var/lib/kubelet/pods/5588a3f7-4625-45c9-bf9c-3cb0732d44e0/volumes" Nov 22 03:31:28 crc kubenswrapper[4922]: I1122 03:31:28.548059 4922 generic.go:334] "Generic (PLEG): container finished" podID="b00860c0-4679-4cee-9185-94524381a6da" containerID="92cbf569a07f469be0871bc5723f410ff9ef35a076c2a20c2541da8ed82d5944" exitCode=0 Nov 22 03:31:28 crc kubenswrapper[4922]: I1122 03:31:28.548169 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-npqqk" event={"ID":"b00860c0-4679-4cee-9185-94524381a6da","Type":"ContainerDied","Data":"92cbf569a07f469be0871bc5723f410ff9ef35a076c2a20c2541da8ed82d5944"} Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.059723 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-npqqk" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.086481 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b00860c0-4679-4cee-9185-94524381a6da-ssh-key\") pod \"b00860c0-4679-4cee-9185-94524381a6da\" (UID: \"b00860c0-4679-4cee-9185-94524381a6da\") " Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.086547 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqrjq\" (UniqueName: \"kubernetes.io/projected/b00860c0-4679-4cee-9185-94524381a6da-kube-api-access-lqrjq\") pod \"b00860c0-4679-4cee-9185-94524381a6da\" (UID: \"b00860c0-4679-4cee-9185-94524381a6da\") " Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.086593 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b00860c0-4679-4cee-9185-94524381a6da-inventory\") pod \"b00860c0-4679-4cee-9185-94524381a6da\" (UID: \"b00860c0-4679-4cee-9185-94524381a6da\") " Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.086805 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b00860c0-4679-4cee-9185-94524381a6da-ceph\") pod \"b00860c0-4679-4cee-9185-94524381a6da\" (UID: \"b00860c0-4679-4cee-9185-94524381a6da\") " Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.099089 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b00860c0-4679-4cee-9185-94524381a6da-ceph" (OuterVolumeSpecName: "ceph") pod "b00860c0-4679-4cee-9185-94524381a6da" (UID: "b00860c0-4679-4cee-9185-94524381a6da"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.103938 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b00860c0-4679-4cee-9185-94524381a6da-kube-api-access-lqrjq" (OuterVolumeSpecName: "kube-api-access-lqrjq") pod "b00860c0-4679-4cee-9185-94524381a6da" (UID: "b00860c0-4679-4cee-9185-94524381a6da"). InnerVolumeSpecName "kube-api-access-lqrjq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.132018 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b00860c0-4679-4cee-9185-94524381a6da-inventory" (OuterVolumeSpecName: "inventory") pod "b00860c0-4679-4cee-9185-94524381a6da" (UID: "b00860c0-4679-4cee-9185-94524381a6da"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.143898 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b00860c0-4679-4cee-9185-94524381a6da-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b00860c0-4679-4cee-9185-94524381a6da" (UID: "b00860c0-4679-4cee-9185-94524381a6da"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.189173 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqrjq\" (UniqueName: \"kubernetes.io/projected/b00860c0-4679-4cee-9185-94524381a6da-kube-api-access-lqrjq\") on node \"crc\" DevicePath \"\"" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.189222 4922 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b00860c0-4679-4cee-9185-94524381a6da-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.189240 4922 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b00860c0-4679-4cee-9185-94524381a6da-ceph\") on node \"crc\" DevicePath \"\"" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.189259 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b00860c0-4679-4cee-9185-94524381a6da-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.572053 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-npqqk" event={"ID":"b00860c0-4679-4cee-9185-94524381a6da","Type":"ContainerDied","Data":"d15fef1fd4ce1100a97c0ecc067f54812dd661cc8c58a1ab420b131d830740b1"} Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.572099 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-npqqk" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.572140 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d15fef1fd4ce1100a97c0ecc067f54812dd661cc8c58a1ab420b131d830740b1" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.698354 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j66gg"] Nov 22 03:31:30 crc kubenswrapper[4922]: E1122 03:31:30.699047 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5588a3f7-4625-45c9-bf9c-3cb0732d44e0" containerName="extract-utilities" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.699090 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="5588a3f7-4625-45c9-bf9c-3cb0732d44e0" containerName="extract-utilities" Nov 22 03:31:30 crc kubenswrapper[4922]: E1122 03:31:30.699107 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5588a3f7-4625-45c9-bf9c-3cb0732d44e0" containerName="extract-content" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.699117 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="5588a3f7-4625-45c9-bf9c-3cb0732d44e0" containerName="extract-content" Nov 22 03:31:30 crc kubenswrapper[4922]: E1122 03:31:30.699134 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b00860c0-4679-4cee-9185-94524381a6da" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.699144 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="b00860c0-4679-4cee-9185-94524381a6da" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Nov 22 03:31:30 crc kubenswrapper[4922]: E1122 03:31:30.699159 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5588a3f7-4625-45c9-bf9c-3cb0732d44e0" containerName="registry-server" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.699167 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="5588a3f7-4625-45c9-bf9c-3cb0732d44e0" containerName="registry-server" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.699435 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="b00860c0-4679-4cee-9185-94524381a6da" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.699469 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="5588a3f7-4625-45c9-bf9c-3cb0732d44e0" containerName="registry-server" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.700305 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j66gg" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.709368 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.710087 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.710272 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.710496 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.710758 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.711383 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.711697 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wzhgr" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.713240 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.719599 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j66gg"] Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.801106 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j66gg\" (UID: \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j66gg" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.801158 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j66gg\" (UID: \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j66gg" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.801191 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j66gg\" (UID: \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j66gg" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.801241 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-libvirt-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-j66gg\" (UID: \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j66gg" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.801300 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzq7x\" (UniqueName: \"kubernetes.io/projected/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-kube-api-access-kzq7x\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j66gg\" (UID: \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j66gg" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.801329 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j66gg\" (UID: \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j66gg" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.801373 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j66gg\" (UID: \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j66gg" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.801399 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j66gg\" (UID: \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j66gg" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.801580 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j66gg\" (UID: \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j66gg" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.801670 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j66gg\" (UID: \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j66gg" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.801728 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j66gg\" (UID: \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j66gg" 
Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.801778 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j66gg\" (UID: \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j66gg" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.801869 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j66gg\" (UID: \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j66gg" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.904806 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j66gg\" (UID: \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j66gg" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.904972 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j66gg\" (UID: \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j66gg" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.905084 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j66gg\" (UID: \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j66gg" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.905147 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j66gg\" (UID: \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j66gg" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.905206 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j66gg\" (UID: \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j66gg" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.905272 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-libvirt-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-j66gg\" (UID: \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j66gg" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.905386 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzq7x\" (UniqueName: \"kubernetes.io/projected/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-kube-api-access-kzq7x\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j66gg\" (UID: \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j66gg" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.905449 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j66gg\" (UID: \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j66gg" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.905505 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j66gg\" (UID: \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j66gg" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.905550 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j66gg\" (UID: \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j66gg" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.905714 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j66gg\" (UID: \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j66gg" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.905790 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j66gg\" (UID: \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j66gg" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.905909 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j66gg\" (UID: \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j66gg" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.912033 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j66gg\" (UID: \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j66gg" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.913045 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j66gg\" (UID: \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j66gg" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.913984 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j66gg\" (UID: \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j66gg" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.914280 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j66gg\" (UID: \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j66gg" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.914762 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j66gg\" (UID: \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j66gg" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.915119 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j66gg\" (UID: \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j66gg" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.915338 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j66gg\" (UID: \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j66gg" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.916225 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j66gg\" (UID: \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j66gg" Nov 22 03:31:30 crc 
kubenswrapper[4922]: I1122 03:31:30.916520 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j66gg\" (UID: \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j66gg" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.917188 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j66gg\" (UID: \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j66gg" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.917405 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j66gg\" (UID: \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j66gg" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.917922 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j66gg\" (UID: \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j66gg" Nov 22 03:31:30 crc kubenswrapper[4922]: I1122 03:31:30.936502 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzq7x\" (UniqueName: \"kubernetes.io/projected/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-kube-api-access-kzq7x\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j66gg\" (UID: \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j66gg" Nov 22 03:31:31 crc kubenswrapper[4922]: I1122 03:31:31.035221 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j66gg" Nov 22 03:31:31 crc kubenswrapper[4922]: I1122 03:31:31.670772 4922 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 03:31:31 crc kubenswrapper[4922]: I1122 03:31:31.680005 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j66gg"] Nov 22 03:31:32 crc kubenswrapper[4922]: I1122 03:31:32.593435 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j66gg" event={"ID":"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c","Type":"ContainerStarted","Data":"36c0281439eb71461e7f44c3970e094018e3e29a1c80ea311899cf1c91570ef9"} Nov 22 03:31:32 crc kubenswrapper[4922]: I1122 03:31:32.593951 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j66gg" event={"ID":"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c","Type":"ContainerStarted","Data":"c0bf859eeda2e662b107e2feb2572e317fc1a9701533308e05e79a50b4ad705e"} Nov 22 03:31:32 crc kubenswrapper[4922]: I1122 03:31:32.618539 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j66gg" podStartSLOduration=2.150979643 podStartE2EDuration="2.618516523s" podCreationTimestamp="2025-11-22 03:31:30 +0000 UTC" firstStartedPulling="2025-11-22 03:31:31.670492237 +0000 UTC m=+2327.709014139" lastFinishedPulling="2025-11-22 03:31:32.138029127 +0000 UTC m=+2328.176551019" observedRunningTime="2025-11-22 03:31:32.613699046 +0000 UTC m=+2328.652220968" watchObservedRunningTime="2025-11-22 03:31:32.618516523 +0000 UTC m=+2328.657038435" Nov 22 03:31:34 crc kubenswrapper[4922]: I1122 03:31:34.333909 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fsg9l"] Nov 22 03:31:34 crc kubenswrapper[4922]: I1122 03:31:34.338603 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fsg9l" Nov 22 03:31:34 crc kubenswrapper[4922]: I1122 03:31:34.381632 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fsg9l"] Nov 22 03:31:34 crc kubenswrapper[4922]: I1122 03:31:34.484430 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e172147e-cb2e-4b10-ae16-2d46ce465423-catalog-content\") pod \"redhat-marketplace-fsg9l\" (UID: \"e172147e-cb2e-4b10-ae16-2d46ce465423\") " pod="openshift-marketplace/redhat-marketplace-fsg9l" Nov 22 03:31:34 crc kubenswrapper[4922]: I1122 03:31:34.484764 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e172147e-cb2e-4b10-ae16-2d46ce465423-utilities\") pod \"redhat-marketplace-fsg9l\" (UID: \"e172147e-cb2e-4b10-ae16-2d46ce465423\") " pod="openshift-marketplace/redhat-marketplace-fsg9l" Nov 22 03:31:34 crc kubenswrapper[4922]: I1122 03:31:34.484800 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8s2bx\" (UniqueName: \"kubernetes.io/projected/e172147e-cb2e-4b10-ae16-2d46ce465423-kube-api-access-8s2bx\") pod \"redhat-marketplace-fsg9l\" (UID: \"e172147e-cb2e-4b10-ae16-2d46ce465423\") " pod="openshift-marketplace/redhat-marketplace-fsg9l" Nov 22 03:31:34 crc kubenswrapper[4922]: I1122 03:31:34.586543 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e172147e-cb2e-4b10-ae16-2d46ce465423-catalog-content\") pod \"redhat-marketplace-fsg9l\" (UID: \"e172147e-cb2e-4b10-ae16-2d46ce465423\") " pod="openshift-marketplace/redhat-marketplace-fsg9l" Nov 22 03:31:34 crc kubenswrapper[4922]: I1122 03:31:34.586732 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e172147e-cb2e-4b10-ae16-2d46ce465423-utilities\") pod \"redhat-marketplace-fsg9l\" (UID: \"e172147e-cb2e-4b10-ae16-2d46ce465423\") " pod="openshift-marketplace/redhat-marketplace-fsg9l" Nov 22 03:31:34 crc kubenswrapper[4922]: I1122 03:31:34.586756 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8s2bx\" (UniqueName: \"kubernetes.io/projected/e172147e-cb2e-4b10-ae16-2d46ce465423-kube-api-access-8s2bx\") pod \"redhat-marketplace-fsg9l\" (UID: \"e172147e-cb2e-4b10-ae16-2d46ce465423\") " pod="openshift-marketplace/redhat-marketplace-fsg9l" Nov 22 03:31:34 crc kubenswrapper[4922]: I1122 03:31:34.587133 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e172147e-cb2e-4b10-ae16-2d46ce465423-catalog-content\") pod \"redhat-marketplace-fsg9l\" (UID: \"e172147e-cb2e-4b10-ae16-2d46ce465423\") " pod="openshift-marketplace/redhat-marketplace-fsg9l" Nov 22 03:31:34 crc kubenswrapper[4922]: I1122 03:31:34.587550 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e172147e-cb2e-4b10-ae16-2d46ce465423-utilities\") pod \"redhat-marketplace-fsg9l\" (UID: \"e172147e-cb2e-4b10-ae16-2d46ce465423\") " pod="openshift-marketplace/redhat-marketplace-fsg9l" Nov 22 03:31:34 crc kubenswrapper[4922]: I1122 03:31:34.616027 4922 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-8s2bx\" (UniqueName: \"kubernetes.io/projected/e172147e-cb2e-4b10-ae16-2d46ce465423-kube-api-access-8s2bx\") pod \"redhat-marketplace-fsg9l\" (UID: \"e172147e-cb2e-4b10-ae16-2d46ce465423\") " pod="openshift-marketplace/redhat-marketplace-fsg9l" Nov 22 03:31:34 crc kubenswrapper[4922]: I1122 03:31:34.706101 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fsg9l" Nov 22 03:31:35 crc kubenswrapper[4922]: W1122 03:31:35.182466 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode172147e_cb2e_4b10_ae16_2d46ce465423.slice/crio-639e752d20cd8f6f336e56dfbec161a20f1d1045f9b1b3382e0fd303e2fcb8a5 WatchSource:0}: Error finding container 639e752d20cd8f6f336e56dfbec161a20f1d1045f9b1b3382e0fd303e2fcb8a5: Status 404 returned error can't find the container with id 639e752d20cd8f6f336e56dfbec161a20f1d1045f9b1b3382e0fd303e2fcb8a5 Nov 22 03:31:35 crc kubenswrapper[4922]: I1122 03:31:35.197907 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fsg9l"] Nov 22 03:31:35 crc kubenswrapper[4922]: I1122 03:31:35.305699 4922 scope.go:117] "RemoveContainer" containerID="9a3eb4d79c66bae819bf80ebc7311991f8b0b1ea20e5bf891e55d01a3bffaa78" Nov 22 03:31:35 crc kubenswrapper[4922]: E1122 03:31:35.305955 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:31:35 crc kubenswrapper[4922]: I1122 03:31:35.625216 4922 generic.go:334] "Generic (PLEG): container finished" podID="e172147e-cb2e-4b10-ae16-2d46ce465423" containerID="7318ea7234c71568b0dcb88ba1e0f2f7938f81bb7aff333fda2f4e67a9933cbe" exitCode=0 Nov 22 03:31:35 crc kubenswrapper[4922]: I1122 03:31:35.625291 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fsg9l" event={"ID":"e172147e-cb2e-4b10-ae16-2d46ce465423","Type":"ContainerDied","Data":"7318ea7234c71568b0dcb88ba1e0f2f7938f81bb7aff333fda2f4e67a9933cbe"} Nov 22 03:31:35 crc kubenswrapper[4922]: I1122 03:31:35.625591 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fsg9l" event={"ID":"e172147e-cb2e-4b10-ae16-2d46ce465423","Type":"ContainerStarted","Data":"639e752d20cd8f6f336e56dfbec161a20f1d1045f9b1b3382e0fd303e2fcb8a5"} Nov 22 03:31:37 crc kubenswrapper[4922]: I1122 03:31:37.914980 4922 generic.go:334] "Generic (PLEG): container finished" podID="e172147e-cb2e-4b10-ae16-2d46ce465423" containerID="e46a99354184699beb0a334c0a37450ad583a51253caf3bbeedc01bf44dfee4e" exitCode=0 Nov 22 03:31:37 crc kubenswrapper[4922]: I1122 03:31:37.915018 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fsg9l" event={"ID":"e172147e-cb2e-4b10-ae16-2d46ce465423","Type":"ContainerDied","Data":"e46a99354184699beb0a334c0a37450ad583a51253caf3bbeedc01bf44dfee4e"} Nov 22 03:31:38 crc kubenswrapper[4922]: I1122 03:31:38.928321 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fsg9l" 
event={"ID":"e172147e-cb2e-4b10-ae16-2d46ce465423","Type":"ContainerStarted","Data":"3fe6f7ea6655ef40c5ad58e82f05975d471b1c8a59082950d6183300046350fb"} Nov 22 03:31:38 crc kubenswrapper[4922]: I1122 03:31:38.955902 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fsg9l" podStartSLOduration=2.22157372 podStartE2EDuration="4.955879132s" podCreationTimestamp="2025-11-22 03:31:34 +0000 UTC" firstStartedPulling="2025-11-22 03:31:35.629192018 +0000 UTC m=+2331.667713910" lastFinishedPulling="2025-11-22 03:31:38.3634974 +0000 UTC m=+2334.402019322" observedRunningTime="2025-11-22 03:31:38.953099835 +0000 UTC m=+2334.991621747" watchObservedRunningTime="2025-11-22 03:31:38.955879132 +0000 UTC m=+2334.994401034" Nov 22 03:31:44 crc kubenswrapper[4922]: I1122 03:31:44.706567 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fsg9l" Nov 22 03:31:44 crc kubenswrapper[4922]: I1122 03:31:44.707349 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fsg9l" Nov 22 03:31:44 crc kubenswrapper[4922]: I1122 03:31:44.786459 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fsg9l" Nov 22 03:31:45 crc kubenswrapper[4922]: I1122 03:31:45.063006 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fsg9l" Nov 22 03:31:45 crc kubenswrapper[4922]: I1122 03:31:45.135447 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fsg9l"] Nov 22 03:31:47 crc kubenswrapper[4922]: I1122 03:31:47.019717 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fsg9l" podUID="e172147e-cb2e-4b10-ae16-2d46ce465423" containerName="registry-server" containerID="cri-o://3fe6f7ea6655ef40c5ad58e82f05975d471b1c8a59082950d6183300046350fb" gracePeriod=2 Nov 22 03:31:47 crc kubenswrapper[4922]: I1122 03:31:47.300280 4922 scope.go:117] "RemoveContainer" containerID="9a3eb4d79c66bae819bf80ebc7311991f8b0b1ea20e5bf891e55d01a3bffaa78" Nov 22 03:31:47 crc kubenswrapper[4922]: E1122 03:31:47.301005 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:31:47 crc kubenswrapper[4922]: I1122 03:31:47.498173 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fsg9l" Nov 22 03:31:47 crc kubenswrapper[4922]: I1122 03:31:47.616177 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8s2bx\" (UniqueName: \"kubernetes.io/projected/e172147e-cb2e-4b10-ae16-2d46ce465423-kube-api-access-8s2bx\") pod \"e172147e-cb2e-4b10-ae16-2d46ce465423\" (UID: \"e172147e-cb2e-4b10-ae16-2d46ce465423\") " Nov 22 03:31:47 crc kubenswrapper[4922]: I1122 03:31:47.616382 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e172147e-cb2e-4b10-ae16-2d46ce465423-catalog-content\") pod \"e172147e-cb2e-4b10-ae16-2d46ce465423\" (UID: \"e172147e-cb2e-4b10-ae16-2d46ce465423\") " Nov 22 03:31:47 crc kubenswrapper[4922]: I1122 03:31:47.616623 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e172147e-cb2e-4b10-ae16-2d46ce465423-utilities\") pod \"e172147e-cb2e-4b10-ae16-2d46ce465423\" (UID: \"e172147e-cb2e-4b10-ae16-2d46ce465423\") " Nov 22 03:31:47 crc kubenswrapper[4922]: I1122 03:31:47.620941 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e172147e-cb2e-4b10-ae16-2d46ce465423-utilities" (OuterVolumeSpecName: "utilities") pod "e172147e-cb2e-4b10-ae16-2d46ce465423" (UID: "e172147e-cb2e-4b10-ae16-2d46ce465423"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:31:47 crc kubenswrapper[4922]: I1122 03:31:47.624765 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e172147e-cb2e-4b10-ae16-2d46ce465423-kube-api-access-8s2bx" (OuterVolumeSpecName: "kube-api-access-8s2bx") pod "e172147e-cb2e-4b10-ae16-2d46ce465423" (UID: "e172147e-cb2e-4b10-ae16-2d46ce465423"). InnerVolumeSpecName "kube-api-access-8s2bx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:31:47 crc kubenswrapper[4922]: I1122 03:31:47.645868 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e172147e-cb2e-4b10-ae16-2d46ce465423-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e172147e-cb2e-4b10-ae16-2d46ce465423" (UID: "e172147e-cb2e-4b10-ae16-2d46ce465423"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:31:47 crc kubenswrapper[4922]: I1122 03:31:47.719968 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8s2bx\" (UniqueName: \"kubernetes.io/projected/e172147e-cb2e-4b10-ae16-2d46ce465423-kube-api-access-8s2bx\") on node \"crc\" DevicePath \"\"" Nov 22 03:31:47 crc kubenswrapper[4922]: I1122 03:31:47.720035 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e172147e-cb2e-4b10-ae16-2d46ce465423-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 03:31:47 crc kubenswrapper[4922]: I1122 03:31:47.720063 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e172147e-cb2e-4b10-ae16-2d46ce465423-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 03:31:48 crc kubenswrapper[4922]: I1122 03:31:48.030895 4922 generic.go:334] "Generic (PLEG): container finished" podID="e172147e-cb2e-4b10-ae16-2d46ce465423" containerID="3fe6f7ea6655ef40c5ad58e82f05975d471b1c8a59082950d6183300046350fb" exitCode=0 Nov 22 03:31:48 crc kubenswrapper[4922]: I1122 03:31:48.031021 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fsg9l" Nov 22 03:31:48 crc kubenswrapper[4922]: I1122 03:31:48.031045 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fsg9l" event={"ID":"e172147e-cb2e-4b10-ae16-2d46ce465423","Type":"ContainerDied","Data":"3fe6f7ea6655ef40c5ad58e82f05975d471b1c8a59082950d6183300046350fb"} Nov 22 03:31:48 crc kubenswrapper[4922]: I1122 03:31:48.031472 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fsg9l" event={"ID":"e172147e-cb2e-4b10-ae16-2d46ce465423","Type":"ContainerDied","Data":"639e752d20cd8f6f336e56dfbec161a20f1d1045f9b1b3382e0fd303e2fcb8a5"} Nov 22 03:31:48 crc kubenswrapper[4922]: I1122 03:31:48.031515 4922 scope.go:117] "RemoveContainer" containerID="3fe6f7ea6655ef40c5ad58e82f05975d471b1c8a59082950d6183300046350fb" Nov 22 03:31:48 crc kubenswrapper[4922]: I1122 03:31:48.071378 4922 scope.go:117] "RemoveContainer" containerID="e46a99354184699beb0a334c0a37450ad583a51253caf3bbeedc01bf44dfee4e" Nov 22 03:31:48 crc kubenswrapper[4922]: I1122 03:31:48.088658 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fsg9l"] Nov 22 03:31:48 crc kubenswrapper[4922]: I1122 03:31:48.096823 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fsg9l"] Nov 22 03:31:48 crc kubenswrapper[4922]: I1122 03:31:48.100169 4922 scope.go:117] "RemoveContainer" containerID="7318ea7234c71568b0dcb88ba1e0f2f7938f81bb7aff333fda2f4e67a9933cbe" Nov 22 03:31:48 crc kubenswrapper[4922]: I1122 03:31:48.160185 4922 scope.go:117] "RemoveContainer" containerID="3fe6f7ea6655ef40c5ad58e82f05975d471b1c8a59082950d6183300046350fb" Nov 22 03:31:48 crc kubenswrapper[4922]: E1122 03:31:48.182250 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fe6f7ea6655ef40c5ad58e82f05975d471b1c8a59082950d6183300046350fb\": container with ID starting with 3fe6f7ea6655ef40c5ad58e82f05975d471b1c8a59082950d6183300046350fb not found: ID does not exist" containerID="3fe6f7ea6655ef40c5ad58e82f05975d471b1c8a59082950d6183300046350fb" Nov 22 03:31:48 crc kubenswrapper[4922]: I1122 03:31:48.182294 4922 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fe6f7ea6655ef40c5ad58e82f05975d471b1c8a59082950d6183300046350fb"} err="failed to get container status \"3fe6f7ea6655ef40c5ad58e82f05975d471b1c8a59082950d6183300046350fb\": rpc error: code = NotFound desc = could not find container \"3fe6f7ea6655ef40c5ad58e82f05975d471b1c8a59082950d6183300046350fb\": container with ID starting with 3fe6f7ea6655ef40c5ad58e82f05975d471b1c8a59082950d6183300046350fb not found: ID does not exist" Nov 22 03:31:48 crc kubenswrapper[4922]: I1122 03:31:48.182324 4922 scope.go:117] "RemoveContainer" containerID="e46a99354184699beb0a334c0a37450ad583a51253caf3bbeedc01bf44dfee4e" Nov 22 03:31:48 crc kubenswrapper[4922]: E1122 03:31:48.182737 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e46a99354184699beb0a334c0a37450ad583a51253caf3bbeedc01bf44dfee4e\": container with ID starting with e46a99354184699beb0a334c0a37450ad583a51253caf3bbeedc01bf44dfee4e not found: ID does not exist" containerID="e46a99354184699beb0a334c0a37450ad583a51253caf3bbeedc01bf44dfee4e" Nov 22 03:31:48 crc kubenswrapper[4922]: I1122 03:31:48.182768 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e46a99354184699beb0a334c0a37450ad583a51253caf3bbeedc01bf44dfee4e"} err="failed to get container status \"e46a99354184699beb0a334c0a37450ad583a51253caf3bbeedc01bf44dfee4e\": rpc error: code = NotFound desc = could not find container \"e46a99354184699beb0a334c0a37450ad583a51253caf3bbeedc01bf44dfee4e\": container with ID starting with e46a99354184699beb0a334c0a37450ad583a51253caf3bbeedc01bf44dfee4e not found: ID does not exist" Nov 22 03:31:48 crc kubenswrapper[4922]: I1122 03:31:48.182788 4922 scope.go:117] "RemoveContainer" containerID="7318ea7234c71568b0dcb88ba1e0f2f7938f81bb7aff333fda2f4e67a9933cbe" Nov 22 03:31:48 crc kubenswrapper[4922]: E1122 03:31:48.183223 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7318ea7234c71568b0dcb88ba1e0f2f7938f81bb7aff333fda2f4e67a9933cbe\": container with ID starting with 7318ea7234c71568b0dcb88ba1e0f2f7938f81bb7aff333fda2f4e67a9933cbe not found: ID does not exist" containerID="7318ea7234c71568b0dcb88ba1e0f2f7938f81bb7aff333fda2f4e67a9933cbe" Nov 22 03:31:48 crc kubenswrapper[4922]: I1122 03:31:48.183274 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7318ea7234c71568b0dcb88ba1e0f2f7938f81bb7aff333fda2f4e67a9933cbe"} err="failed to get container status \"7318ea7234c71568b0dcb88ba1e0f2f7938f81bb7aff333fda2f4e67a9933cbe\": rpc error: code = NotFound desc = could not find container \"7318ea7234c71568b0dcb88ba1e0f2f7938f81bb7aff333fda2f4e67a9933cbe\": container with ID starting with 7318ea7234c71568b0dcb88ba1e0f2f7938f81bb7aff333fda2f4e67a9933cbe not found: ID does not exist" Nov 22 03:31:49 crc kubenswrapper[4922]: I1122 03:31:49.318389 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e172147e-cb2e-4b10-ae16-2d46ce465423" path="/var/lib/kubelet/pods/e172147e-cb2e-4b10-ae16-2d46ce465423/volumes" Nov 22 03:31:58 crc kubenswrapper[4922]: I1122 03:31:58.300970 4922 scope.go:117] "RemoveContainer" containerID="9a3eb4d79c66bae819bf80ebc7311991f8b0b1ea20e5bf891e55d01a3bffaa78" Nov 22 03:31:58 crc kubenswrapper[4922]: E1122 03:31:58.302072 4922 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:32:04 crc kubenswrapper[4922]: I1122 03:32:04.227056 4922 generic.go:334] "Generic (PLEG): container finished" podID="fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c" containerID="36c0281439eb71461e7f44c3970e094018e3e29a1c80ea311899cf1c91570ef9" exitCode=0 Nov 22 03:32:04 crc kubenswrapper[4922]: I1122 03:32:04.227169 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j66gg" event={"ID":"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c","Type":"ContainerDied","Data":"36c0281439eb71461e7f44c3970e094018e3e29a1c80ea311899cf1c91570ef9"} Nov 22 03:32:05 crc kubenswrapper[4922]: I1122 03:32:05.735688 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j66gg" Nov 22 03:32:05 crc kubenswrapper[4922]: I1122 03:32:05.802672 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-neutron-metadata-combined-ca-bundle\") pod \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\" (UID: \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\") " Nov 22 03:32:05 crc kubenswrapper[4922]: I1122 03:32:05.802721 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-inventory\") pod \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\" (UID: \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\") " Nov 22 03:32:05 crc kubenswrapper[4922]: I1122 03:32:05.802746 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-libvirt-combined-ca-bundle\") pod \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\" (UID: \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\") " Nov 22 03:32:05 crc kubenswrapper[4922]: I1122 03:32:05.802767 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-ovn-combined-ca-bundle\") pod \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\" (UID: \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\") " Nov 22 03:32:05 crc kubenswrapper[4922]: I1122 03:32:05.802795 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-ssh-key\") pod \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\" (UID: \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\") " Nov 22 03:32:05 crc kubenswrapper[4922]: I1122 03:32:05.802836 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\" (UID: \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\") " Nov 22 03:32:05 crc kubenswrapper[4922]: I1122 03:32:05.802869 4922 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-repo-setup-combined-ca-bundle\") pod \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\" (UID: \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\") " Nov 22 03:32:05 crc kubenswrapper[4922]: I1122 03:32:05.802890 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\" (UID: \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\") " Nov 22 03:32:05 crc kubenswrapper[4922]: I1122 03:32:05.802919 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-nova-combined-ca-bundle\") pod \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\" (UID: \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\") " Nov 22 03:32:05 crc kubenswrapper[4922]: I1122 03:32:05.802947 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-bootstrap-combined-ca-bundle\") pod \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\" (UID: \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\") " Nov 22 03:32:05 crc kubenswrapper[4922]: I1122 03:32:05.802963 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzq7x\" (UniqueName: \"kubernetes.io/projected/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-kube-api-access-kzq7x\") pod \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\" (UID: \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\") " Nov 22 03:32:05 crc kubenswrapper[4922]: I1122 03:32:05.802991 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-openstack-edpm-ipam-ovn-default-certs-0\") pod \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\" (UID: \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\") " Nov 22 03:32:05 crc kubenswrapper[4922]: I1122 03:32:05.803014 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-ceph\") pod \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\" (UID: \"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c\") " Nov 22 03:32:05 crc kubenswrapper[4922]: I1122 03:32:05.808996 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c" (UID: "fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:32:05 crc kubenswrapper[4922]: I1122 03:32:05.809499 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c" (UID: "fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:32:05 crc kubenswrapper[4922]: I1122 03:32:05.810331 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c" (UID: "fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:32:05 crc kubenswrapper[4922]: I1122 03:32:05.810414 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c" (UID: "fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:32:05 crc kubenswrapper[4922]: I1122 03:32:05.810524 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c" (UID: "fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:32:05 crc kubenswrapper[4922]: I1122 03:32:05.811381 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c" (UID: "fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:32:05 crc kubenswrapper[4922]: I1122 03:32:05.811502 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c" (UID: "fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:32:05 crc kubenswrapper[4922]: I1122 03:32:05.811717 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-kube-api-access-kzq7x" (OuterVolumeSpecName: "kube-api-access-kzq7x") pod "fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c" (UID: "fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c"). InnerVolumeSpecName "kube-api-access-kzq7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:32:05 crc kubenswrapper[4922]: I1122 03:32:05.812060 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c" (UID: "fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:32:05 crc kubenswrapper[4922]: I1122 03:32:05.814692 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-ceph" (OuterVolumeSpecName: "ceph") pod "fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c" (UID: "fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:32:05 crc kubenswrapper[4922]: I1122 03:32:05.825028 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c" (UID: "fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:32:05 crc kubenswrapper[4922]: I1122 03:32:05.832956 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c" (UID: "fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:32:05 crc kubenswrapper[4922]: I1122 03:32:05.841110 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-inventory" (OuterVolumeSpecName: "inventory") pod "fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c" (UID: "fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:32:05 crc kubenswrapper[4922]: I1122 03:32:05.910604 4922 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Nov 22 03:32:05 crc kubenswrapper[4922]: I1122 03:32:05.921700 4922 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:32:05 crc kubenswrapper[4922]: I1122 03:32:05.921883 4922 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Nov 22 03:32:05 crc kubenswrapper[4922]: I1122 03:32:05.922090 4922 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:32:05 crc kubenswrapper[4922]: I1122 03:32:05.922359 4922 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:32:05 crc kubenswrapper[4922]: I1122 03:32:05.922407 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzq7x\" (UniqueName: \"kubernetes.io/projected/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-kube-api-access-kzq7x\") on node \"crc\" DevicePath \"\"" Nov 22 03:32:05 crc kubenswrapper[4922]: I1122 03:32:05.922426 4922 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Nov 22 03:32:05 crc kubenswrapper[4922]: I1122 03:32:05.922800 4922 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-ceph\") on node \"crc\" DevicePath \"\"" Nov 22 03:32:05 crc kubenswrapper[4922]: I1122 03:32:05.922963 4922 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:32:05 crc kubenswrapper[4922]: I1122 03:32:05.923029 4922 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 03:32:05 crc kubenswrapper[4922]: I1122 03:32:05.923120 4922 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:32:05 crc kubenswrapper[4922]: I1122 03:32:05.923218 4922 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-ovn-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Nov 22 03:32:05 crc kubenswrapper[4922]: I1122 03:32:05.923298 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 03:32:06 crc kubenswrapper[4922]: I1122 03:32:06.252114 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j66gg" event={"ID":"fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c","Type":"ContainerDied","Data":"c0bf859eeda2e662b107e2feb2572e317fc1a9701533308e05e79a50b4ad705e"} Nov 22 03:32:06 crc kubenswrapper[4922]: I1122 03:32:06.252418 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0bf859eeda2e662b107e2feb2572e317fc1a9701533308e05e79a50b4ad705e" Nov 22 03:32:06 crc kubenswrapper[4922]: I1122 03:32:06.252285 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j66gg" Nov 22 03:32:06 crc kubenswrapper[4922]: I1122 03:32:06.394314 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-k2jtw"] Nov 22 03:32:06 crc kubenswrapper[4922]: E1122 03:32:06.394640 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e172147e-cb2e-4b10-ae16-2d46ce465423" containerName="registry-server" Nov 22 03:32:06 crc kubenswrapper[4922]: I1122 03:32:06.394658 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="e172147e-cb2e-4b10-ae16-2d46ce465423" containerName="registry-server" Nov 22 03:32:06 crc kubenswrapper[4922]: E1122 03:32:06.394666 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e172147e-cb2e-4b10-ae16-2d46ce465423" containerName="extract-content" Nov 22 03:32:06 crc kubenswrapper[4922]: I1122 03:32:06.394672 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="e172147e-cb2e-4b10-ae16-2d46ce465423" containerName="extract-content" Nov 22 03:32:06 crc kubenswrapper[4922]: E1122 03:32:06.394706 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Nov 22 03:32:06 crc kubenswrapper[4922]: I1122 03:32:06.394714 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Nov 22 03:32:06 crc kubenswrapper[4922]: E1122 03:32:06.394726 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e172147e-cb2e-4b10-ae16-2d46ce465423" containerName="extract-utilities" Nov 22 03:32:06 crc kubenswrapper[4922]: I1122 03:32:06.394732 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="e172147e-cb2e-4b10-ae16-2d46ce465423" containerName="extract-utilities" Nov 22 03:32:06 crc kubenswrapper[4922]: I1122 03:32:06.394934 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="e172147e-cb2e-4b10-ae16-2d46ce465423" containerName="registry-server" Nov 22 03:32:06 crc kubenswrapper[4922]: I1122 03:32:06.394948 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Nov 22 03:32:06 crc kubenswrapper[4922]: I1122 03:32:06.395622 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-k2jtw" Nov 22 03:32:06 crc kubenswrapper[4922]: I1122 03:32:06.397929 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wzhgr" Nov 22 03:32:06 crc kubenswrapper[4922]: I1122 03:32:06.399449 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Nov 22 03:32:06 crc kubenswrapper[4922]: I1122 03:32:06.399734 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 22 03:32:06 crc kubenswrapper[4922]: I1122 03:32:06.399943 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 03:32:06 crc kubenswrapper[4922]: I1122 03:32:06.400596 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 22 03:32:06 crc kubenswrapper[4922]: I1122 03:32:06.415023 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-k2jtw"] Nov 22 03:32:06 crc kubenswrapper[4922]: I1122 03:32:06.442423 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6844c188-29eb-4010-a96a-427689e010e9-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-k2jtw\" (UID: \"6844c188-29eb-4010-a96a-427689e010e9\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-k2jtw" Nov 22 03:32:06 crc kubenswrapper[4922]: I1122 03:32:06.442596 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6844c188-29eb-4010-a96a-427689e010e9-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-k2jtw\" (UID: \"6844c188-29eb-4010-a96a-427689e010e9\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-k2jtw" Nov 22 03:32:06 crc kubenswrapper[4922]: I1122 03:32:06.442637 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6844c188-29eb-4010-a96a-427689e010e9-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-k2jtw\" (UID: \"6844c188-29eb-4010-a96a-427689e010e9\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-k2jtw" Nov 22 03:32:06 crc kubenswrapper[4922]: I1122 03:32:06.442719 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4bwk\" (UniqueName: \"kubernetes.io/projected/6844c188-29eb-4010-a96a-427689e010e9-kube-api-access-d4bwk\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-k2jtw\" (UID: \"6844c188-29eb-4010-a96a-427689e010e9\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-k2jtw" Nov 22 03:32:06 crc kubenswrapper[4922]: I1122 03:32:06.544548 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6844c188-29eb-4010-a96a-427689e010e9-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-k2jtw\" (UID: \"6844c188-29eb-4010-a96a-427689e010e9\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-k2jtw" Nov 22 03:32:06 crc kubenswrapper[4922]: I1122 03:32:06.544697 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/6844c188-29eb-4010-a96a-427689e010e9-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-k2jtw\" (UID: \"6844c188-29eb-4010-a96a-427689e010e9\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-k2jtw" Nov 22 03:32:06 crc kubenswrapper[4922]: I1122 03:32:06.544769 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6844c188-29eb-4010-a96a-427689e010e9-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-k2jtw\" (UID: \"6844c188-29eb-4010-a96a-427689e010e9\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-k2jtw" Nov 22 03:32:06 crc kubenswrapper[4922]: I1122 03:32:06.544897 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4bwk\" (UniqueName: \"kubernetes.io/projected/6844c188-29eb-4010-a96a-427689e010e9-kube-api-access-d4bwk\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-k2jtw\" (UID: \"6844c188-29eb-4010-a96a-427689e010e9\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-k2jtw" Nov 22 03:32:06 crc kubenswrapper[4922]: I1122 03:32:06.550465 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6844c188-29eb-4010-a96a-427689e010e9-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-k2jtw\" (UID: \"6844c188-29eb-4010-a96a-427689e010e9\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-k2jtw" Nov 22 03:32:06 crc kubenswrapper[4922]: I1122 03:32:06.550572 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6844c188-29eb-4010-a96a-427689e010e9-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-k2jtw\" (UID: \"6844c188-29eb-4010-a96a-427689e010e9\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-k2jtw" Nov 22 03:32:06 crc kubenswrapper[4922]: I1122 03:32:06.552044 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6844c188-29eb-4010-a96a-427689e010e9-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-k2jtw\" (UID: \"6844c188-29eb-4010-a96a-427689e010e9\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-k2jtw" Nov 22 03:32:06 crc kubenswrapper[4922]: I1122 03:32:06.567778 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4bwk\" (UniqueName: \"kubernetes.io/projected/6844c188-29eb-4010-a96a-427689e010e9-kube-api-access-d4bwk\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-k2jtw\" (UID: \"6844c188-29eb-4010-a96a-427689e010e9\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-k2jtw" Nov 22 03:32:06 crc kubenswrapper[4922]: I1122 03:32:06.725341 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-k2jtw" Nov 22 03:32:07 crc kubenswrapper[4922]: I1122 03:32:07.125779 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-k2jtw"] Nov 22 03:32:07 crc kubenswrapper[4922]: I1122 03:32:07.264364 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-k2jtw" event={"ID":"6844c188-29eb-4010-a96a-427689e010e9","Type":"ContainerStarted","Data":"02667c78e2ab75c52ed60c36c605d0124f6e306a6274c0a191ac440d882cac44"} Nov 22 03:32:08 crc kubenswrapper[4922]: I1122 03:32:08.275652 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-k2jtw" event={"ID":"6844c188-29eb-4010-a96a-427689e010e9","Type":"ContainerStarted","Data":"71584078f5bfa207c3553f43b5640b12a502b3543c3e726f893c6dd686c625ad"} Nov 22 03:32:08 crc kubenswrapper[4922]: I1122 03:32:08.296071 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-k2jtw" podStartSLOduration=1.87741095 podStartE2EDuration="2.296052149s" podCreationTimestamp="2025-11-22 03:32:06 +0000 UTC" firstStartedPulling="2025-11-22 03:32:07.138894808 +0000 UTC m=+2363.177416740" lastFinishedPulling="2025-11-22 03:32:07.557536027 +0000 UTC m=+2363.596057939" observedRunningTime="2025-11-22 03:32:08.288609417 +0000 UTC m=+2364.327131319" watchObservedRunningTime="2025-11-22 03:32:08.296052149 +0000 UTC m=+2364.334574041" Nov 22 03:32:10 crc kubenswrapper[4922]: I1122 03:32:10.301008 4922 scope.go:117] "RemoveContainer" containerID="9a3eb4d79c66bae819bf80ebc7311991f8b0b1ea20e5bf891e55d01a3bffaa78" Nov 22 03:32:10 crc kubenswrapper[4922]: E1122 03:32:10.302281 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:32:13 crc kubenswrapper[4922]: I1122 03:32:13.338054 4922 generic.go:334] "Generic (PLEG): container finished" podID="6844c188-29eb-4010-a96a-427689e010e9" containerID="71584078f5bfa207c3553f43b5640b12a502b3543c3e726f893c6dd686c625ad" exitCode=0 Nov 22 03:32:13 crc kubenswrapper[4922]: I1122 03:32:13.343675 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-k2jtw" event={"ID":"6844c188-29eb-4010-a96a-427689e010e9","Type":"ContainerDied","Data":"71584078f5bfa207c3553f43b5640b12a502b3543c3e726f893c6dd686c625ad"} Nov 22 03:32:14 crc kubenswrapper[4922]: I1122 03:32:14.834593 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-k2jtw" Nov 22 03:32:14 crc kubenswrapper[4922]: I1122 03:32:14.900501 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6844c188-29eb-4010-a96a-427689e010e9-ssh-key\") pod \"6844c188-29eb-4010-a96a-427689e010e9\" (UID: \"6844c188-29eb-4010-a96a-427689e010e9\") " Nov 22 03:32:14 crc kubenswrapper[4922]: I1122 03:32:14.900648 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4bwk\" (UniqueName: \"kubernetes.io/projected/6844c188-29eb-4010-a96a-427689e010e9-kube-api-access-d4bwk\") pod \"6844c188-29eb-4010-a96a-427689e010e9\" (UID: \"6844c188-29eb-4010-a96a-427689e010e9\") " Nov 22 03:32:14 crc kubenswrapper[4922]: I1122 03:32:14.901214 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6844c188-29eb-4010-a96a-427689e010e9-inventory\") pod \"6844c188-29eb-4010-a96a-427689e010e9\" (UID: \"6844c188-29eb-4010-a96a-427689e010e9\") " Nov 22 03:32:14 crc kubenswrapper[4922]: I1122 03:32:14.901280 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6844c188-29eb-4010-a96a-427689e010e9-ceph\") pod \"6844c188-29eb-4010-a96a-427689e010e9\" (UID: \"6844c188-29eb-4010-a96a-427689e010e9\") " Nov 22 03:32:14 crc kubenswrapper[4922]: I1122 03:32:14.920927 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6844c188-29eb-4010-a96a-427689e010e9-ceph" (OuterVolumeSpecName: "ceph") pod "6844c188-29eb-4010-a96a-427689e010e9" (UID: "6844c188-29eb-4010-a96a-427689e010e9"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:32:14 crc kubenswrapper[4922]: I1122 03:32:14.920947 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6844c188-29eb-4010-a96a-427689e010e9-kube-api-access-d4bwk" (OuterVolumeSpecName: "kube-api-access-d4bwk") pod "6844c188-29eb-4010-a96a-427689e010e9" (UID: "6844c188-29eb-4010-a96a-427689e010e9"). InnerVolumeSpecName "kube-api-access-d4bwk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:32:14 crc kubenswrapper[4922]: I1122 03:32:14.936119 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6844c188-29eb-4010-a96a-427689e010e9-inventory" (OuterVolumeSpecName: "inventory") pod "6844c188-29eb-4010-a96a-427689e010e9" (UID: "6844c188-29eb-4010-a96a-427689e010e9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:32:14 crc kubenswrapper[4922]: I1122 03:32:14.941800 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6844c188-29eb-4010-a96a-427689e010e9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6844c188-29eb-4010-a96a-427689e010e9" (UID: "6844c188-29eb-4010-a96a-427689e010e9"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:32:15 crc kubenswrapper[4922]: I1122 03:32:15.003653 4922 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6844c188-29eb-4010-a96a-427689e010e9-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 03:32:15 crc kubenswrapper[4922]: I1122 03:32:15.003699 4922 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6844c188-29eb-4010-a96a-427689e010e9-ceph\") on node \"crc\" DevicePath \"\"" Nov 22 03:32:15 crc kubenswrapper[4922]: I1122 03:32:15.003711 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6844c188-29eb-4010-a96a-427689e010e9-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 03:32:15 crc kubenswrapper[4922]: I1122 03:32:15.003724 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4bwk\" (UniqueName: \"kubernetes.io/projected/6844c188-29eb-4010-a96a-427689e010e9-kube-api-access-d4bwk\") on node \"crc\" DevicePath \"\"" Nov 22 03:32:15 crc kubenswrapper[4922]: I1122 03:32:15.360377 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-k2jtw" event={"ID":"6844c188-29eb-4010-a96a-427689e010e9","Type":"ContainerDied","Data":"02667c78e2ab75c52ed60c36c605d0124f6e306a6274c0a191ac440d882cac44"} Nov 22 03:32:15 crc kubenswrapper[4922]: I1122 03:32:15.360443 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02667c78e2ab75c52ed60c36c605d0124f6e306a6274c0a191ac440d882cac44" Nov 22 03:32:15 crc kubenswrapper[4922]: I1122 03:32:15.360518 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-k2jtw" Nov 22 03:32:15 crc kubenswrapper[4922]: I1122 03:32:15.458257 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-6l4jc"] Nov 22 03:32:15 crc kubenswrapper[4922]: E1122 03:32:15.458715 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6844c188-29eb-4010-a96a-427689e010e9" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Nov 22 03:32:15 crc kubenswrapper[4922]: I1122 03:32:15.458760 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="6844c188-29eb-4010-a96a-427689e010e9" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Nov 22 03:32:15 crc kubenswrapper[4922]: I1122 03:32:15.459001 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="6844c188-29eb-4010-a96a-427689e010e9" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Nov 22 03:32:15 crc kubenswrapper[4922]: I1122 03:32:15.465604 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6l4jc" Nov 22 03:32:15 crc kubenswrapper[4922]: I1122 03:32:15.475477 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Nov 22 03:32:15 crc kubenswrapper[4922]: I1122 03:32:15.475529 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 03:32:15 crc kubenswrapper[4922]: I1122 03:32:15.475749 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Nov 22 03:32:15 crc kubenswrapper[4922]: I1122 03:32:15.475932 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wzhgr" Nov 22 03:32:15 crc kubenswrapper[4922]: I1122 03:32:15.476017 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 22 03:32:15 crc kubenswrapper[4922]: I1122 03:32:15.476087 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 22 03:32:15 crc kubenswrapper[4922]: I1122 03:32:15.483418 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-6l4jc"] Nov 22 03:32:15 crc kubenswrapper[4922]: I1122 03:32:15.514104 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5159e8a-3369-45c5-b6e1-a45e3e49a228-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6l4jc\" (UID: \"c5159e8a-3369-45c5-b6e1-a45e3e49a228\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6l4jc" Nov 22 03:32:15 crc kubenswrapper[4922]: I1122 03:32:15.514209 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/c5159e8a-3369-45c5-b6e1-a45e3e49a228-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6l4jc\" (UID: \"c5159e8a-3369-45c5-b6e1-a45e3e49a228\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6l4jc" Nov 22 03:32:15 crc kubenswrapper[4922]: I1122 03:32:15.514287 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c5159e8a-3369-45c5-b6e1-a45e3e49a228-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6l4jc\" (UID: \"c5159e8a-3369-45c5-b6e1-a45e3e49a228\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6l4jc" Nov 22 03:32:15 crc kubenswrapper[4922]: I1122 03:32:15.514354 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5159e8a-3369-45c5-b6e1-a45e3e49a228-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6l4jc\" (UID: \"c5159e8a-3369-45c5-b6e1-a45e3e49a228\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6l4jc" Nov 22 03:32:15 crc kubenswrapper[4922]: I1122 03:32:15.514390 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgvvj\" (UniqueName: \"kubernetes.io/projected/c5159e8a-3369-45c5-b6e1-a45e3e49a228-kube-api-access-pgvvj\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6l4jc\" (UID: \"c5159e8a-3369-45c5-b6e1-a45e3e49a228\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6l4jc" Nov 22 03:32:15 crc 
kubenswrapper[4922]: I1122 03:32:15.514526 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c5159e8a-3369-45c5-b6e1-a45e3e49a228-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6l4jc\" (UID: \"c5159e8a-3369-45c5-b6e1-a45e3e49a228\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6l4jc" Nov 22 03:32:15 crc kubenswrapper[4922]: I1122 03:32:15.615169 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5159e8a-3369-45c5-b6e1-a45e3e49a228-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6l4jc\" (UID: \"c5159e8a-3369-45c5-b6e1-a45e3e49a228\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6l4jc" Nov 22 03:32:15 crc kubenswrapper[4922]: I1122 03:32:15.615230 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgvvj\" (UniqueName: \"kubernetes.io/projected/c5159e8a-3369-45c5-b6e1-a45e3e49a228-kube-api-access-pgvvj\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6l4jc\" (UID: \"c5159e8a-3369-45c5-b6e1-a45e3e49a228\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6l4jc" Nov 22 03:32:15 crc kubenswrapper[4922]: I1122 03:32:15.615287 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c5159e8a-3369-45c5-b6e1-a45e3e49a228-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6l4jc\" (UID: \"c5159e8a-3369-45c5-b6e1-a45e3e49a228\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6l4jc" Nov 22 03:32:15 crc kubenswrapper[4922]: I1122 03:32:15.615327 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5159e8a-3369-45c5-b6e1-a45e3e49a228-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6l4jc\" (UID: \"c5159e8a-3369-45c5-b6e1-a45e3e49a228\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6l4jc" Nov 22 03:32:15 crc kubenswrapper[4922]: I1122 03:32:15.615364 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/c5159e8a-3369-45c5-b6e1-a45e3e49a228-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6l4jc\" (UID: \"c5159e8a-3369-45c5-b6e1-a45e3e49a228\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6l4jc" Nov 22 03:32:15 crc kubenswrapper[4922]: I1122 03:32:15.615408 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c5159e8a-3369-45c5-b6e1-a45e3e49a228-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6l4jc\" (UID: \"c5159e8a-3369-45c5-b6e1-a45e3e49a228\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6l4jc" Nov 22 03:32:15 crc kubenswrapper[4922]: I1122 03:32:15.617045 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/c5159e8a-3369-45c5-b6e1-a45e3e49a228-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6l4jc\" (UID: \"c5159e8a-3369-45c5-b6e1-a45e3e49a228\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6l4jc" Nov 22 03:32:15 crc kubenswrapper[4922]: I1122 03:32:15.618932 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c5159e8a-3369-45c5-b6e1-a45e3e49a228-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6l4jc\" (UID: \"c5159e8a-3369-45c5-b6e1-a45e3e49a228\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6l4jc" Nov 22 03:32:15 crc kubenswrapper[4922]: I1122 03:32:15.619670 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5159e8a-3369-45c5-b6e1-a45e3e49a228-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6l4jc\" (UID: \"c5159e8a-3369-45c5-b6e1-a45e3e49a228\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6l4jc" Nov 22 03:32:15 crc kubenswrapper[4922]: I1122 03:32:15.620037 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c5159e8a-3369-45c5-b6e1-a45e3e49a228-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6l4jc\" (UID: \"c5159e8a-3369-45c5-b6e1-a45e3e49a228\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6l4jc" Nov 22 03:32:15 crc kubenswrapper[4922]: I1122 03:32:15.620275 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c5159e8a-3369-45c5-b6e1-a45e3e49a228-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6l4jc\" (UID: \"c5159e8a-3369-45c5-b6e1-a45e3e49a228\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6l4jc" Nov 22 03:32:15 crc kubenswrapper[4922]: I1122 03:32:15.634516 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgvvj\" (UniqueName: \"kubernetes.io/projected/c5159e8a-3369-45c5-b6e1-a45e3e49a228-kube-api-access-pgvvj\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6l4jc\" (UID: \"c5159e8a-3369-45c5-b6e1-a45e3e49a228\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6l4jc" Nov 22 03:32:15 crc kubenswrapper[4922]: I1122 03:32:15.795077 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6l4jc" Nov 22 03:32:16 crc kubenswrapper[4922]: I1122 03:32:16.320523 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-6l4jc"] Nov 22 03:32:16 crc kubenswrapper[4922]: I1122 03:32:16.373779 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6l4jc" event={"ID":"c5159e8a-3369-45c5-b6e1-a45e3e49a228","Type":"ContainerStarted","Data":"471fc4aa708ee5952c5ef99feeca4ba6be22a62bc731e3ddbc3b45d519b83ba8"} Nov 22 03:32:17 crc kubenswrapper[4922]: I1122 03:32:17.391135 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6l4jc" event={"ID":"c5159e8a-3369-45c5-b6e1-a45e3e49a228","Type":"ContainerStarted","Data":"7c70da69d6ad24d85ec658df21859f7f9c10cae83faf05e1a3cd8bcc596b662e"} Nov 22 03:32:17 crc kubenswrapper[4922]: I1122 03:32:17.418244 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6l4jc" podStartSLOduration=1.980238231 podStartE2EDuration="2.418222281s" podCreationTimestamp="2025-11-22 03:32:15 +0000 UTC" firstStartedPulling="2025-11-22 03:32:16.324999478 +0000 UTC m=+2372.363521370" lastFinishedPulling="2025-11-22 03:32:16.762983488 +0000 UTC m=+2372.801505420" observedRunningTime="2025-11-22 03:32:17.413205098 +0000 UTC m=+2373.451727030" watchObservedRunningTime="2025-11-22 03:32:17.418222281 +0000 UTC m=+2373.456744193" Nov 22 03:32:25 crc kubenswrapper[4922]: I1122 03:32:25.311412 4922 scope.go:117] "RemoveContainer" containerID="9a3eb4d79c66bae819bf80ebc7311991f8b0b1ea20e5bf891e55d01a3bffaa78" Nov 22 03:32:25 crc kubenswrapper[4922]: E1122 03:32:25.312612 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:32:36 crc kubenswrapper[4922]: I1122 03:32:36.301391 4922 scope.go:117] "RemoveContainer" containerID="9a3eb4d79c66bae819bf80ebc7311991f8b0b1ea20e5bf891e55d01a3bffaa78" Nov 22 03:32:36 crc kubenswrapper[4922]: E1122 03:32:36.302677 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:32:51 crc kubenswrapper[4922]: I1122 03:32:51.303346 4922 scope.go:117] "RemoveContainer" containerID="9a3eb4d79c66bae819bf80ebc7311991f8b0b1ea20e5bf891e55d01a3bffaa78" Nov 22 03:32:51 crc kubenswrapper[4922]: E1122 03:32:51.307706 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" 
podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:32:52 crc kubenswrapper[4922]: I1122 03:32:52.334885 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7rkcc"] Nov 22 03:32:52 crc kubenswrapper[4922]: I1122 03:32:52.338158 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7rkcc" Nov 22 03:32:52 crc kubenswrapper[4922]: I1122 03:32:52.344126 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7rkcc"] Nov 22 03:32:52 crc kubenswrapper[4922]: I1122 03:32:52.506407 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbfzq\" (UniqueName: \"kubernetes.io/projected/077252d4-d95a-45b0-908d-c086ad707981-kube-api-access-jbfzq\") pod \"certified-operators-7rkcc\" (UID: \"077252d4-d95a-45b0-908d-c086ad707981\") " pod="openshift-marketplace/certified-operators-7rkcc" Nov 22 03:32:52 crc kubenswrapper[4922]: I1122 03:32:52.506481 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/077252d4-d95a-45b0-908d-c086ad707981-utilities\") pod \"certified-operators-7rkcc\" (UID: \"077252d4-d95a-45b0-908d-c086ad707981\") " pod="openshift-marketplace/certified-operators-7rkcc" Nov 22 03:32:52 crc kubenswrapper[4922]: I1122 03:32:52.506600 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/077252d4-d95a-45b0-908d-c086ad707981-catalog-content\") pod \"certified-operators-7rkcc\" (UID: \"077252d4-d95a-45b0-908d-c086ad707981\") " pod="openshift-marketplace/certified-operators-7rkcc" Nov 22 03:32:52 crc kubenswrapper[4922]: I1122 03:32:52.608123 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbfzq\" (UniqueName: \"kubernetes.io/projected/077252d4-d95a-45b0-908d-c086ad707981-kube-api-access-jbfzq\") pod \"certified-operators-7rkcc\" (UID: \"077252d4-d95a-45b0-908d-c086ad707981\") " pod="openshift-marketplace/certified-operators-7rkcc" Nov 22 03:32:52 crc kubenswrapper[4922]: I1122 03:32:52.608191 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/077252d4-d95a-45b0-908d-c086ad707981-utilities\") pod \"certified-operators-7rkcc\" (UID: \"077252d4-d95a-45b0-908d-c086ad707981\") " pod="openshift-marketplace/certified-operators-7rkcc" Nov 22 03:32:52 crc kubenswrapper[4922]: I1122 03:32:52.608287 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/077252d4-d95a-45b0-908d-c086ad707981-catalog-content\") pod \"certified-operators-7rkcc\" (UID: \"077252d4-d95a-45b0-908d-c086ad707981\") " pod="openshift-marketplace/certified-operators-7rkcc" Nov 22 03:32:52 crc kubenswrapper[4922]: I1122 03:32:52.608949 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/077252d4-d95a-45b0-908d-c086ad707981-catalog-content\") pod \"certified-operators-7rkcc\" (UID: \"077252d4-d95a-45b0-908d-c086ad707981\") " pod="openshift-marketplace/certified-operators-7rkcc" Nov 22 03:32:52 crc kubenswrapper[4922]: I1122 03:32:52.609023 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/077252d4-d95a-45b0-908d-c086ad707981-utilities\") pod \"certified-operators-7rkcc\" (UID: \"077252d4-d95a-45b0-908d-c086ad707981\") " pod="openshift-marketplace/certified-operators-7rkcc" Nov 22 03:32:52 crc kubenswrapper[4922]: I1122 03:32:52.635287 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbfzq\" (UniqueName: \"kubernetes.io/projected/077252d4-d95a-45b0-908d-c086ad707981-kube-api-access-jbfzq\") pod \"certified-operators-7rkcc\" (UID: \"077252d4-d95a-45b0-908d-c086ad707981\") " pod="openshift-marketplace/certified-operators-7rkcc" Nov 22 03:32:52 crc kubenswrapper[4922]: I1122 03:32:52.664322 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7rkcc" Nov 22 03:32:53 crc kubenswrapper[4922]: I1122 03:32:53.156817 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7rkcc"] Nov 22 03:32:53 crc kubenswrapper[4922]: I1122 03:32:53.787028 4922 generic.go:334] "Generic (PLEG): container finished" podID="077252d4-d95a-45b0-908d-c086ad707981" containerID="02286140bcf619632b877abb2532f395ad2ab6cfe42aaa8802da703e6a97d59b" exitCode=0 Nov 22 03:32:53 crc kubenswrapper[4922]: I1122 03:32:53.787131 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7rkcc" event={"ID":"077252d4-d95a-45b0-908d-c086ad707981","Type":"ContainerDied","Data":"02286140bcf619632b877abb2532f395ad2ab6cfe42aaa8802da703e6a97d59b"} Nov 22 03:32:53 crc kubenswrapper[4922]: I1122 03:32:53.787476 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7rkcc" event={"ID":"077252d4-d95a-45b0-908d-c086ad707981","Type":"ContainerStarted","Data":"76fabcc248aab9f7f4b44d45f1be52d2aeb83e8a09ac899b63bebfa39305d454"} Nov 22 03:32:54 crc kubenswrapper[4922]: I1122 03:32:54.801784 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7rkcc" event={"ID":"077252d4-d95a-45b0-908d-c086ad707981","Type":"ContainerStarted","Data":"403141772ff2e17fa6f86a2ec0a109e75e86b9a893377a534aa510a9943f142d"} Nov 22 03:32:55 crc kubenswrapper[4922]: I1122 03:32:55.816303 4922 generic.go:334] "Generic (PLEG): container finished" podID="077252d4-d95a-45b0-908d-c086ad707981" containerID="403141772ff2e17fa6f86a2ec0a109e75e86b9a893377a534aa510a9943f142d" exitCode=0 Nov 22 03:32:55 crc kubenswrapper[4922]: I1122 03:32:55.816368 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7rkcc" event={"ID":"077252d4-d95a-45b0-908d-c086ad707981","Type":"ContainerDied","Data":"403141772ff2e17fa6f86a2ec0a109e75e86b9a893377a534aa510a9943f142d"} Nov 22 03:32:56 crc kubenswrapper[4922]: I1122 03:32:56.829396 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7rkcc" event={"ID":"077252d4-d95a-45b0-908d-c086ad707981","Type":"ContainerStarted","Data":"54f145097e7ff3da7c621bbf94cf9d130677103b00a644a736370459f59ccced"} Nov 22 03:32:56 crc kubenswrapper[4922]: I1122 03:32:56.869716 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7rkcc" podStartSLOduration=2.420919992 podStartE2EDuration="4.869684928s" podCreationTimestamp="2025-11-22 03:32:52 +0000 UTC" firstStartedPulling="2025-11-22 03:32:53.789475069 +0000 UTC m=+2409.827996981" 
lastFinishedPulling="2025-11-22 03:32:56.238239985 +0000 UTC m=+2412.276761917" observedRunningTime="2025-11-22 03:32:56.853986215 +0000 UTC m=+2412.892508177" watchObservedRunningTime="2025-11-22 03:32:56.869684928 +0000 UTC m=+2412.908206880" Nov 22 03:33:02 crc kubenswrapper[4922]: I1122 03:33:02.666701 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7rkcc" Nov 22 03:33:02 crc kubenswrapper[4922]: I1122 03:33:02.667275 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7rkcc" Nov 22 03:33:02 crc kubenswrapper[4922]: I1122 03:33:02.739906 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7rkcc" Nov 22 03:33:02 crc kubenswrapper[4922]: I1122 03:33:02.956163 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7rkcc" Nov 22 03:33:03 crc kubenswrapper[4922]: I1122 03:33:03.005227 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7rkcc"] Nov 22 03:33:03 crc kubenswrapper[4922]: I1122 03:33:03.300540 4922 scope.go:117] "RemoveContainer" containerID="9a3eb4d79c66bae819bf80ebc7311991f8b0b1ea20e5bf891e55d01a3bffaa78" Nov 22 03:33:03 crc kubenswrapper[4922]: E1122 03:33:03.301003 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:33:04 crc kubenswrapper[4922]: I1122 03:33:04.913166 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7rkcc" podUID="077252d4-d95a-45b0-908d-c086ad707981" containerName="registry-server" containerID="cri-o://54f145097e7ff3da7c621bbf94cf9d130677103b00a644a736370459f59ccced" gracePeriod=2 Nov 22 03:33:05 crc kubenswrapper[4922]: I1122 03:33:05.443463 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7rkcc" Nov 22 03:33:05 crc kubenswrapper[4922]: I1122 03:33:05.588599 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbfzq\" (UniqueName: \"kubernetes.io/projected/077252d4-d95a-45b0-908d-c086ad707981-kube-api-access-jbfzq\") pod \"077252d4-d95a-45b0-908d-c086ad707981\" (UID: \"077252d4-d95a-45b0-908d-c086ad707981\") " Nov 22 03:33:05 crc kubenswrapper[4922]: I1122 03:33:05.588785 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/077252d4-d95a-45b0-908d-c086ad707981-catalog-content\") pod \"077252d4-d95a-45b0-908d-c086ad707981\" (UID: \"077252d4-d95a-45b0-908d-c086ad707981\") " Nov 22 03:33:05 crc kubenswrapper[4922]: I1122 03:33:05.588882 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/077252d4-d95a-45b0-908d-c086ad707981-utilities\") pod \"077252d4-d95a-45b0-908d-c086ad707981\" (UID: \"077252d4-d95a-45b0-908d-c086ad707981\") " Nov 22 03:33:05 crc kubenswrapper[4922]: I1122 03:33:05.589803 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/077252d4-d95a-45b0-908d-c086ad707981-utilities" (OuterVolumeSpecName: "utilities") pod "077252d4-d95a-45b0-908d-c086ad707981" (UID: "077252d4-d95a-45b0-908d-c086ad707981"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:33:05 crc kubenswrapper[4922]: I1122 03:33:05.598152 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/077252d4-d95a-45b0-908d-c086ad707981-kube-api-access-jbfzq" (OuterVolumeSpecName: "kube-api-access-jbfzq") pod "077252d4-d95a-45b0-908d-c086ad707981" (UID: "077252d4-d95a-45b0-908d-c086ad707981"). InnerVolumeSpecName "kube-api-access-jbfzq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:33:05 crc kubenswrapper[4922]: I1122 03:33:05.641365 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/077252d4-d95a-45b0-908d-c086ad707981-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "077252d4-d95a-45b0-908d-c086ad707981" (UID: "077252d4-d95a-45b0-908d-c086ad707981"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:33:05 crc kubenswrapper[4922]: I1122 03:33:05.692311 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbfzq\" (UniqueName: \"kubernetes.io/projected/077252d4-d95a-45b0-908d-c086ad707981-kube-api-access-jbfzq\") on node \"crc\" DevicePath \"\"" Nov 22 03:33:05 crc kubenswrapper[4922]: I1122 03:33:05.692384 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/077252d4-d95a-45b0-908d-c086ad707981-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 03:33:05 crc kubenswrapper[4922]: I1122 03:33:05.692413 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/077252d4-d95a-45b0-908d-c086ad707981-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 03:33:05 crc kubenswrapper[4922]: I1122 03:33:05.929449 4922 generic.go:334] "Generic (PLEG): container finished" podID="077252d4-d95a-45b0-908d-c086ad707981" containerID="54f145097e7ff3da7c621bbf94cf9d130677103b00a644a736370459f59ccced" exitCode=0 Nov 22 03:33:05 crc kubenswrapper[4922]: I1122 03:33:05.929509 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7rkcc" event={"ID":"077252d4-d95a-45b0-908d-c086ad707981","Type":"ContainerDied","Data":"54f145097e7ff3da7c621bbf94cf9d130677103b00a644a736370459f59ccced"} Nov 22 03:33:05 crc kubenswrapper[4922]: I1122 03:33:05.929555 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7rkcc" Nov 22 03:33:05 crc kubenswrapper[4922]: I1122 03:33:05.929580 4922 scope.go:117] "RemoveContainer" containerID="54f145097e7ff3da7c621bbf94cf9d130677103b00a644a736370459f59ccced" Nov 22 03:33:05 crc kubenswrapper[4922]: I1122 03:33:05.929561 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7rkcc" event={"ID":"077252d4-d95a-45b0-908d-c086ad707981","Type":"ContainerDied","Data":"76fabcc248aab9f7f4b44d45f1be52d2aeb83e8a09ac899b63bebfa39305d454"} Nov 22 03:33:05 crc kubenswrapper[4922]: I1122 03:33:05.963251 4922 scope.go:117] "RemoveContainer" containerID="403141772ff2e17fa6f86a2ec0a109e75e86b9a893377a534aa510a9943f142d" Nov 22 03:33:06 crc kubenswrapper[4922]: I1122 03:33:06.000452 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7rkcc"] Nov 22 03:33:06 crc kubenswrapper[4922]: I1122 03:33:06.013257 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7rkcc"] Nov 22 03:33:06 crc kubenswrapper[4922]: I1122 03:33:06.014250 4922 scope.go:117] "RemoveContainer" containerID="02286140bcf619632b877abb2532f395ad2ab6cfe42aaa8802da703e6a97d59b" Nov 22 03:33:06 crc kubenswrapper[4922]: I1122 03:33:06.076116 4922 scope.go:117] "RemoveContainer" containerID="54f145097e7ff3da7c621bbf94cf9d130677103b00a644a736370459f59ccced" Nov 22 03:33:06 crc kubenswrapper[4922]: E1122 03:33:06.076754 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54f145097e7ff3da7c621bbf94cf9d130677103b00a644a736370459f59ccced\": container with ID starting with 54f145097e7ff3da7c621bbf94cf9d130677103b00a644a736370459f59ccced not found: ID does not exist" containerID="54f145097e7ff3da7c621bbf94cf9d130677103b00a644a736370459f59ccced" Nov 22 03:33:06 crc kubenswrapper[4922]: I1122 03:33:06.076813 
4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54f145097e7ff3da7c621bbf94cf9d130677103b00a644a736370459f59ccced"} err="failed to get container status \"54f145097e7ff3da7c621bbf94cf9d130677103b00a644a736370459f59ccced\": rpc error: code = NotFound desc = could not find container \"54f145097e7ff3da7c621bbf94cf9d130677103b00a644a736370459f59ccced\": container with ID starting with 54f145097e7ff3da7c621bbf94cf9d130677103b00a644a736370459f59ccced not found: ID does not exist" Nov 22 03:33:06 crc kubenswrapper[4922]: I1122 03:33:06.076864 4922 scope.go:117] "RemoveContainer" containerID="403141772ff2e17fa6f86a2ec0a109e75e86b9a893377a534aa510a9943f142d" Nov 22 03:33:06 crc kubenswrapper[4922]: E1122 03:33:06.077348 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"403141772ff2e17fa6f86a2ec0a109e75e86b9a893377a534aa510a9943f142d\": container with ID starting with 403141772ff2e17fa6f86a2ec0a109e75e86b9a893377a534aa510a9943f142d not found: ID does not exist" containerID="403141772ff2e17fa6f86a2ec0a109e75e86b9a893377a534aa510a9943f142d" Nov 22 03:33:06 crc kubenswrapper[4922]: I1122 03:33:06.077401 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"403141772ff2e17fa6f86a2ec0a109e75e86b9a893377a534aa510a9943f142d"} err="failed to get container status \"403141772ff2e17fa6f86a2ec0a109e75e86b9a893377a534aa510a9943f142d\": rpc error: code = NotFound desc = could not find container \"403141772ff2e17fa6f86a2ec0a109e75e86b9a893377a534aa510a9943f142d\": container with ID starting with 403141772ff2e17fa6f86a2ec0a109e75e86b9a893377a534aa510a9943f142d not found: ID does not exist" Nov 22 03:33:06 crc kubenswrapper[4922]: I1122 03:33:06.077438 4922 scope.go:117] "RemoveContainer" containerID="02286140bcf619632b877abb2532f395ad2ab6cfe42aaa8802da703e6a97d59b" Nov 22 03:33:06 crc kubenswrapper[4922]: E1122 03:33:06.077820 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02286140bcf619632b877abb2532f395ad2ab6cfe42aaa8802da703e6a97d59b\": container with ID starting with 02286140bcf619632b877abb2532f395ad2ab6cfe42aaa8802da703e6a97d59b not found: ID does not exist" containerID="02286140bcf619632b877abb2532f395ad2ab6cfe42aaa8802da703e6a97d59b" Nov 22 03:33:06 crc kubenswrapper[4922]: I1122 03:33:06.077865 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02286140bcf619632b877abb2532f395ad2ab6cfe42aaa8802da703e6a97d59b"} err="failed to get container status \"02286140bcf619632b877abb2532f395ad2ab6cfe42aaa8802da703e6a97d59b\": rpc error: code = NotFound desc = could not find container \"02286140bcf619632b877abb2532f395ad2ab6cfe42aaa8802da703e6a97d59b\": container with ID starting with 02286140bcf619632b877abb2532f395ad2ab6cfe42aaa8802da703e6a97d59b not found: ID does not exist" Nov 22 03:33:07 crc kubenswrapper[4922]: I1122 03:33:07.314956 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="077252d4-d95a-45b0-908d-c086ad707981" path="/var/lib/kubelet/pods/077252d4-d95a-45b0-908d-c086ad707981/volumes" Nov 22 03:33:17 crc kubenswrapper[4922]: I1122 03:33:17.310384 4922 scope.go:117] "RemoveContainer" containerID="9a3eb4d79c66bae819bf80ebc7311991f8b0b1ea20e5bf891e55d01a3bffaa78" Nov 22 03:33:17 crc kubenswrapper[4922]: E1122 03:33:17.311054 4922 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:33:30 crc kubenswrapper[4922]: I1122 03:33:30.301200 4922 scope.go:117] "RemoveContainer" containerID="9a3eb4d79c66bae819bf80ebc7311991f8b0b1ea20e5bf891e55d01a3bffaa78" Nov 22 03:33:30 crc kubenswrapper[4922]: E1122 03:33:30.302457 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:33:32 crc kubenswrapper[4922]: I1122 03:33:32.218892 4922 generic.go:334] "Generic (PLEG): container finished" podID="c5159e8a-3369-45c5-b6e1-a45e3e49a228" containerID="7c70da69d6ad24d85ec658df21859f7f9c10cae83faf05e1a3cd8bcc596b662e" exitCode=0 Nov 22 03:33:32 crc kubenswrapper[4922]: I1122 03:33:32.219137 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6l4jc" event={"ID":"c5159e8a-3369-45c5-b6e1-a45e3e49a228","Type":"ContainerDied","Data":"7c70da69d6ad24d85ec658df21859f7f9c10cae83faf05e1a3cd8bcc596b662e"} Nov 22 03:33:33 crc kubenswrapper[4922]: I1122 03:33:33.790250 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6l4jc" Nov 22 03:33:33 crc kubenswrapper[4922]: I1122 03:33:33.937246 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5159e8a-3369-45c5-b6e1-a45e3e49a228-inventory\") pod \"c5159e8a-3369-45c5-b6e1-a45e3e49a228\" (UID: \"c5159e8a-3369-45c5-b6e1-a45e3e49a228\") " Nov 22 03:33:33 crc kubenswrapper[4922]: I1122 03:33:33.937745 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c5159e8a-3369-45c5-b6e1-a45e3e49a228-ceph\") pod \"c5159e8a-3369-45c5-b6e1-a45e3e49a228\" (UID: \"c5159e8a-3369-45c5-b6e1-a45e3e49a228\") " Nov 22 03:33:33 crc kubenswrapper[4922]: I1122 03:33:33.937822 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgvvj\" (UniqueName: \"kubernetes.io/projected/c5159e8a-3369-45c5-b6e1-a45e3e49a228-kube-api-access-pgvvj\") pod \"c5159e8a-3369-45c5-b6e1-a45e3e49a228\" (UID: \"c5159e8a-3369-45c5-b6e1-a45e3e49a228\") " Nov 22 03:33:33 crc kubenswrapper[4922]: I1122 03:33:33.937957 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/c5159e8a-3369-45c5-b6e1-a45e3e49a228-ovncontroller-config-0\") pod \"c5159e8a-3369-45c5-b6e1-a45e3e49a228\" (UID: \"c5159e8a-3369-45c5-b6e1-a45e3e49a228\") " Nov 22 03:33:33 crc kubenswrapper[4922]: I1122 03:33:33.938130 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c5159e8a-3369-45c5-b6e1-a45e3e49a228-ssh-key\") pod \"c5159e8a-3369-45c5-b6e1-a45e3e49a228\" (UID: 
\"c5159e8a-3369-45c5-b6e1-a45e3e49a228\") " Nov 22 03:33:33 crc kubenswrapper[4922]: I1122 03:33:33.938175 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5159e8a-3369-45c5-b6e1-a45e3e49a228-ovn-combined-ca-bundle\") pod \"c5159e8a-3369-45c5-b6e1-a45e3e49a228\" (UID: \"c5159e8a-3369-45c5-b6e1-a45e3e49a228\") " Nov 22 03:33:33 crc kubenswrapper[4922]: I1122 03:33:33.945079 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5159e8a-3369-45c5-b6e1-a45e3e49a228-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "c5159e8a-3369-45c5-b6e1-a45e3e49a228" (UID: "c5159e8a-3369-45c5-b6e1-a45e3e49a228"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:33:33 crc kubenswrapper[4922]: I1122 03:33:33.951917 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5159e8a-3369-45c5-b6e1-a45e3e49a228-kube-api-access-pgvvj" (OuterVolumeSpecName: "kube-api-access-pgvvj") pod "c5159e8a-3369-45c5-b6e1-a45e3e49a228" (UID: "c5159e8a-3369-45c5-b6e1-a45e3e49a228"). InnerVolumeSpecName "kube-api-access-pgvvj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:33:33 crc kubenswrapper[4922]: I1122 03:33:33.952031 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5159e8a-3369-45c5-b6e1-a45e3e49a228-ceph" (OuterVolumeSpecName: "ceph") pod "c5159e8a-3369-45c5-b6e1-a45e3e49a228" (UID: "c5159e8a-3369-45c5-b6e1-a45e3e49a228"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:33:33 crc kubenswrapper[4922]: I1122 03:33:33.975804 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5159e8a-3369-45c5-b6e1-a45e3e49a228-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "c5159e8a-3369-45c5-b6e1-a45e3e49a228" (UID: "c5159e8a-3369-45c5-b6e1-a45e3e49a228"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:33:33 crc kubenswrapper[4922]: I1122 03:33:33.988633 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5159e8a-3369-45c5-b6e1-a45e3e49a228-inventory" (OuterVolumeSpecName: "inventory") pod "c5159e8a-3369-45c5-b6e1-a45e3e49a228" (UID: "c5159e8a-3369-45c5-b6e1-a45e3e49a228"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:33:33 crc kubenswrapper[4922]: I1122 03:33:33.998419 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5159e8a-3369-45c5-b6e1-a45e3e49a228-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c5159e8a-3369-45c5-b6e1-a45e3e49a228" (UID: "c5159e8a-3369-45c5-b6e1-a45e3e49a228"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:33:34 crc kubenswrapper[4922]: I1122 03:33:34.042180 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c5159e8a-3369-45c5-b6e1-a45e3e49a228-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 03:33:34 crc kubenswrapper[4922]: I1122 03:33:34.042524 4922 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5159e8a-3369-45c5-b6e1-a45e3e49a228-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:33:34 crc kubenswrapper[4922]: I1122 03:33:34.042786 4922 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5159e8a-3369-45c5-b6e1-a45e3e49a228-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 03:33:34 crc kubenswrapper[4922]: I1122 03:33:34.042968 4922 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c5159e8a-3369-45c5-b6e1-a45e3e49a228-ceph\") on node \"crc\" DevicePath \"\"" Nov 22 03:33:34 crc kubenswrapper[4922]: I1122 03:33:34.043096 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgvvj\" (UniqueName: \"kubernetes.io/projected/c5159e8a-3369-45c5-b6e1-a45e3e49a228-kube-api-access-pgvvj\") on node \"crc\" DevicePath \"\"" Nov 22 03:33:34 crc kubenswrapper[4922]: I1122 03:33:34.043218 4922 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/c5159e8a-3369-45c5-b6e1-a45e3e49a228-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Nov 22 03:33:34 crc kubenswrapper[4922]: I1122 03:33:34.243514 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6l4jc" event={"ID":"c5159e8a-3369-45c5-b6e1-a45e3e49a228","Type":"ContainerDied","Data":"471fc4aa708ee5952c5ef99feeca4ba6be22a62bc731e3ddbc3b45d519b83ba8"} Nov 22 03:33:34 crc kubenswrapper[4922]: I1122 03:33:34.243571 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="471fc4aa708ee5952c5ef99feeca4ba6be22a62bc731e3ddbc3b45d519b83ba8" Nov 22 03:33:34 crc kubenswrapper[4922]: I1122 03:33:34.244028 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6l4jc" Nov 22 03:33:34 crc kubenswrapper[4922]: I1122 03:33:34.389804 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gx2zt"] Nov 22 03:33:34 crc kubenswrapper[4922]: E1122 03:33:34.391316 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="077252d4-d95a-45b0-908d-c086ad707981" containerName="extract-content" Nov 22 03:33:34 crc kubenswrapper[4922]: I1122 03:33:34.391503 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="077252d4-d95a-45b0-908d-c086ad707981" containerName="extract-content" Nov 22 03:33:34 crc kubenswrapper[4922]: E1122 03:33:34.391656 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="077252d4-d95a-45b0-908d-c086ad707981" containerName="extract-utilities" Nov 22 03:33:34 crc kubenswrapper[4922]: I1122 03:33:34.391811 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="077252d4-d95a-45b0-908d-c086ad707981" containerName="extract-utilities" Nov 22 03:33:34 crc kubenswrapper[4922]: E1122 03:33:34.392000 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="077252d4-d95a-45b0-908d-c086ad707981" containerName="registry-server" Nov 22 03:33:34 crc kubenswrapper[4922]: I1122 03:33:34.392138 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="077252d4-d95a-45b0-908d-c086ad707981" containerName="registry-server" Nov 22 03:33:34 crc kubenswrapper[4922]: E1122 03:33:34.392301 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5159e8a-3369-45c5-b6e1-a45e3e49a228" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Nov 22 03:33:34 crc kubenswrapper[4922]: I1122 03:33:34.392468 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5159e8a-3369-45c5-b6e1-a45e3e49a228" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Nov 22 03:33:34 crc kubenswrapper[4922]: I1122 03:33:34.392999 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="077252d4-d95a-45b0-908d-c086ad707981" containerName="registry-server" Nov 22 03:33:34 crc kubenswrapper[4922]: I1122 03:33:34.396122 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5159e8a-3369-45c5-b6e1-a45e3e49a228" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Nov 22 03:33:34 crc kubenswrapper[4922]: I1122 03:33:34.397255 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gx2zt" Nov 22 03:33:34 crc kubenswrapper[4922]: I1122 03:33:34.400166 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Nov 22 03:33:34 crc kubenswrapper[4922]: I1122 03:33:34.400631 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 22 03:33:34 crc kubenswrapper[4922]: I1122 03:33:34.401336 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Nov 22 03:33:34 crc kubenswrapper[4922]: I1122 03:33:34.401426 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 22 03:33:34 crc kubenswrapper[4922]: I1122 03:33:34.401827 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Nov 22 03:33:34 crc kubenswrapper[4922]: I1122 03:33:34.402457 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wzhgr" Nov 22 03:33:34 crc kubenswrapper[4922]: I1122 03:33:34.402813 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 03:33:34 crc kubenswrapper[4922]: I1122 03:33:34.410478 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gx2zt"] Nov 22 03:33:34 crc kubenswrapper[4922]: I1122 03:33:34.451758 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1054bc07-486a-48ac-9199-49bae8794a90-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gx2zt\" (UID: \"1054bc07-486a-48ac-9199-49bae8794a90\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gx2zt" Nov 22 03:33:34 crc kubenswrapper[4922]: I1122 03:33:34.451831 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1054bc07-486a-48ac-9199-49bae8794a90-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gx2zt\" (UID: \"1054bc07-486a-48ac-9199-49bae8794a90\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gx2zt" Nov 22 03:33:34 crc kubenswrapper[4922]: I1122 03:33:34.452193 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1054bc07-486a-48ac-9199-49bae8794a90-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gx2zt\" (UID: \"1054bc07-486a-48ac-9199-49bae8794a90\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gx2zt" Nov 22 03:33:34 crc kubenswrapper[4922]: I1122 03:33:34.452308 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1054bc07-486a-48ac-9199-49bae8794a90-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gx2zt\" (UID: \"1054bc07-486a-48ac-9199-49bae8794a90\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gx2zt" Nov 22 03:33:34 crc kubenswrapper[4922]: I1122 03:33:34.452408 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldkjh\" (UniqueName: \"kubernetes.io/projected/1054bc07-486a-48ac-9199-49bae8794a90-kube-api-access-ldkjh\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gx2zt\" (UID: \"1054bc07-486a-48ac-9199-49bae8794a90\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gx2zt" Nov 22 03:33:34 crc kubenswrapper[4922]: I1122 03:33:34.452708 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1054bc07-486a-48ac-9199-49bae8794a90-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gx2zt\" (UID: \"1054bc07-486a-48ac-9199-49bae8794a90\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gx2zt" Nov 22 03:33:34 crc kubenswrapper[4922]: I1122 03:33:34.453064 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1054bc07-486a-48ac-9199-49bae8794a90-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gx2zt\" (UID: \"1054bc07-486a-48ac-9199-49bae8794a90\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gx2zt" Nov 22 03:33:34 crc kubenswrapper[4922]: I1122 03:33:34.555197 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1054bc07-486a-48ac-9199-49bae8794a90-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gx2zt\" (UID: \"1054bc07-486a-48ac-9199-49bae8794a90\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gx2zt" Nov 22 03:33:34 crc kubenswrapper[4922]: I1122 03:33:34.555284 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1054bc07-486a-48ac-9199-49bae8794a90-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gx2zt\" (UID: \"1054bc07-486a-48ac-9199-49bae8794a90\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gx2zt" Nov 22 03:33:34 crc kubenswrapper[4922]: I1122 03:33:34.555331 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1054bc07-486a-48ac-9199-49bae8794a90-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gx2zt\" (UID: \"1054bc07-486a-48ac-9199-49bae8794a90\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gx2zt" Nov 22 03:33:34 crc kubenswrapper[4922]: I1122 03:33:34.555458 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1054bc07-486a-48ac-9199-49bae8794a90-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gx2zt\" (UID: \"1054bc07-486a-48ac-9199-49bae8794a90\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gx2zt" Nov 22 03:33:34 crc kubenswrapper[4922]: I1122 03:33:34.555516 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1054bc07-486a-48ac-9199-49bae8794a90-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gx2zt\" (UID: \"1054bc07-486a-48ac-9199-49bae8794a90\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gx2zt" Nov 22 03:33:34 crc kubenswrapper[4922]: I1122 03:33:34.555572 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldkjh\" (UniqueName: \"kubernetes.io/projected/1054bc07-486a-48ac-9199-49bae8794a90-kube-api-access-ldkjh\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gx2zt\" (UID: \"1054bc07-486a-48ac-9199-49bae8794a90\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gx2zt" Nov 22 03:33:34 crc kubenswrapper[4922]: I1122 03:33:34.555646 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1054bc07-486a-48ac-9199-49bae8794a90-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gx2zt\" (UID: \"1054bc07-486a-48ac-9199-49bae8794a90\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gx2zt" Nov 22 03:33:34 crc kubenswrapper[4922]: I1122 03:33:34.560661 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1054bc07-486a-48ac-9199-49bae8794a90-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gx2zt\" (UID: \"1054bc07-486a-48ac-9199-49bae8794a90\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gx2zt" Nov 22 03:33:34 crc kubenswrapper[4922]: I1122 03:33:34.562593 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1054bc07-486a-48ac-9199-49bae8794a90-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gx2zt\" (UID: \"1054bc07-486a-48ac-9199-49bae8794a90\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gx2zt" Nov 22 03:33:34 crc kubenswrapper[4922]: I1122 03:33:34.568321 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1054bc07-486a-48ac-9199-49bae8794a90-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gx2zt\" (UID: \"1054bc07-486a-48ac-9199-49bae8794a90\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gx2zt" Nov 22 03:33:34 crc kubenswrapper[4922]: I1122 03:33:34.569880 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1054bc07-486a-48ac-9199-49bae8794a90-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gx2zt\" (UID: \"1054bc07-486a-48ac-9199-49bae8794a90\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gx2zt" Nov 22 03:33:34 crc kubenswrapper[4922]: I1122 03:33:34.570532 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1054bc07-486a-48ac-9199-49bae8794a90-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gx2zt\" (UID: \"1054bc07-486a-48ac-9199-49bae8794a90\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gx2zt" Nov 22 03:33:34 crc kubenswrapper[4922]: I1122 03:33:34.570970 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/1054bc07-486a-48ac-9199-49bae8794a90-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gx2zt\" (UID: \"1054bc07-486a-48ac-9199-49bae8794a90\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gx2zt" Nov 22 03:33:34 crc kubenswrapper[4922]: I1122 03:33:34.588756 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldkjh\" (UniqueName: \"kubernetes.io/projected/1054bc07-486a-48ac-9199-49bae8794a90-kube-api-access-ldkjh\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gx2zt\" (UID: \"1054bc07-486a-48ac-9199-49bae8794a90\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gx2zt" Nov 22 03:33:34 crc kubenswrapper[4922]: I1122 03:33:34.738328 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gx2zt" Nov 22 03:33:35 crc kubenswrapper[4922]: I1122 03:33:35.344468 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gx2zt"] Nov 22 03:33:35 crc kubenswrapper[4922]: W1122 03:33:35.345841 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1054bc07_486a_48ac_9199_49bae8794a90.slice/crio-6adc1dac19e44b7e9637648a0d1069ead8037cf46a5d93dd0be3087720992627 WatchSource:0}: Error finding container 6adc1dac19e44b7e9637648a0d1069ead8037cf46a5d93dd0be3087720992627: Status 404 returned error can't find the container with id 6adc1dac19e44b7e9637648a0d1069ead8037cf46a5d93dd0be3087720992627 Nov 22 03:33:36 crc kubenswrapper[4922]: I1122 03:33:36.270530 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gx2zt" event={"ID":"1054bc07-486a-48ac-9199-49bae8794a90","Type":"ContainerStarted","Data":"4be21c142d9c8df51ee1c5f3792431418041b130e91ec388a77dcc21c7ea6782"} Nov 22 03:33:36 crc kubenswrapper[4922]: I1122 03:33:36.271043 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gx2zt" event={"ID":"1054bc07-486a-48ac-9199-49bae8794a90","Type":"ContainerStarted","Data":"6adc1dac19e44b7e9637648a0d1069ead8037cf46a5d93dd0be3087720992627"} Nov 22 03:33:36 crc kubenswrapper[4922]: I1122 03:33:36.298082 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gx2zt" podStartSLOduration=1.878382721 podStartE2EDuration="2.298056765s" podCreationTimestamp="2025-11-22 03:33:34 +0000 UTC" firstStartedPulling="2025-11-22 03:33:35.350485119 +0000 UTC m=+2451.389007011" lastFinishedPulling="2025-11-22 03:33:35.770159133 +0000 UTC m=+2451.808681055" observedRunningTime="2025-11-22 03:33:36.291630778 +0000 UTC m=+2452.330152700" watchObservedRunningTime="2025-11-22 03:33:36.298056765 +0000 UTC m=+2452.336578697" Nov 22 03:33:41 crc kubenswrapper[4922]: I1122 03:33:41.302972 4922 scope.go:117] "RemoveContainer" containerID="9a3eb4d79c66bae819bf80ebc7311991f8b0b1ea20e5bf891e55d01a3bffaa78" Nov 22 03:33:41 crc kubenswrapper[4922]: E1122 03:33:41.305134 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:33:56 crc kubenswrapper[4922]: I1122 03:33:56.300990 4922 scope.go:117] "RemoveContainer" containerID="9a3eb4d79c66bae819bf80ebc7311991f8b0b1ea20e5bf891e55d01a3bffaa78" Nov 22 03:33:56 crc kubenswrapper[4922]: E1122 03:33:56.301864 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:34:08 crc kubenswrapper[4922]: I1122 03:34:08.301127 4922 scope.go:117] "RemoveContainer" containerID="9a3eb4d79c66bae819bf80ebc7311991f8b0b1ea20e5bf891e55d01a3bffaa78" Nov 22 03:34:08 crc kubenswrapper[4922]: E1122 03:34:08.302161 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:34:20 crc kubenswrapper[4922]: I1122 03:34:20.300883 4922 scope.go:117] "RemoveContainer" containerID="9a3eb4d79c66bae819bf80ebc7311991f8b0b1ea20e5bf891e55d01a3bffaa78" Nov 22 03:34:20 crc kubenswrapper[4922]: I1122 03:34:20.768421 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" event={"ID":"402683b1-a29f-4a79-a36c-daf6e8068d0d","Type":"ContainerStarted","Data":"338fcd43d92290b41bef050a341da63f101acb3dd1fb99c79edfa548fb79bd0b"} Nov 22 03:34:38 crc kubenswrapper[4922]: I1122 03:34:38.978081 4922 generic.go:334] "Generic (PLEG): container finished" podID="1054bc07-486a-48ac-9199-49bae8794a90" containerID="4be21c142d9c8df51ee1c5f3792431418041b130e91ec388a77dcc21c7ea6782" exitCode=0 Nov 22 03:34:38 crc kubenswrapper[4922]: I1122 03:34:38.978156 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gx2zt" event={"ID":"1054bc07-486a-48ac-9199-49bae8794a90","Type":"ContainerDied","Data":"4be21c142d9c8df51ee1c5f3792431418041b130e91ec388a77dcc21c7ea6782"} Nov 22 03:34:40 crc kubenswrapper[4922]: I1122 03:34:40.421668 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gx2zt" Nov 22 03:34:40 crc kubenswrapper[4922]: I1122 03:34:40.441692 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1054bc07-486a-48ac-9199-49bae8794a90-neutron-metadata-combined-ca-bundle\") pod \"1054bc07-486a-48ac-9199-49bae8794a90\" (UID: \"1054bc07-486a-48ac-9199-49bae8794a90\") " Nov 22 03:34:40 crc kubenswrapper[4922]: I1122 03:34:40.441929 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1054bc07-486a-48ac-9199-49bae8794a90-ceph\") pod \"1054bc07-486a-48ac-9199-49bae8794a90\" (UID: \"1054bc07-486a-48ac-9199-49bae8794a90\") " Nov 22 03:34:40 crc kubenswrapper[4922]: I1122 03:34:40.441965 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1054bc07-486a-48ac-9199-49bae8794a90-neutron-ovn-metadata-agent-neutron-config-0\") pod \"1054bc07-486a-48ac-9199-49bae8794a90\" (UID: \"1054bc07-486a-48ac-9199-49bae8794a90\") " Nov 22 03:34:40 crc kubenswrapper[4922]: I1122 03:34:40.441992 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1054bc07-486a-48ac-9199-49bae8794a90-ssh-key\") pod \"1054bc07-486a-48ac-9199-49bae8794a90\" (UID: \"1054bc07-486a-48ac-9199-49bae8794a90\") " Nov 22 03:34:40 crc kubenswrapper[4922]: I1122 03:34:40.442021 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1054bc07-486a-48ac-9199-49bae8794a90-inventory\") pod \"1054bc07-486a-48ac-9199-49bae8794a90\" (UID: \"1054bc07-486a-48ac-9199-49bae8794a90\") " Nov 22 03:34:40 crc kubenswrapper[4922]: I1122 03:34:40.442090 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldkjh\" (UniqueName: \"kubernetes.io/projected/1054bc07-486a-48ac-9199-49bae8794a90-kube-api-access-ldkjh\") pod \"1054bc07-486a-48ac-9199-49bae8794a90\" (UID: \"1054bc07-486a-48ac-9199-49bae8794a90\") " Nov 22 03:34:40 crc kubenswrapper[4922]: I1122 03:34:40.442117 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1054bc07-486a-48ac-9199-49bae8794a90-nova-metadata-neutron-config-0\") pod \"1054bc07-486a-48ac-9199-49bae8794a90\" (UID: \"1054bc07-486a-48ac-9199-49bae8794a90\") " Nov 22 03:34:40 crc kubenswrapper[4922]: I1122 03:34:40.481599 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1054bc07-486a-48ac-9199-49bae8794a90-kube-api-access-ldkjh" (OuterVolumeSpecName: "kube-api-access-ldkjh") pod "1054bc07-486a-48ac-9199-49bae8794a90" (UID: "1054bc07-486a-48ac-9199-49bae8794a90"). InnerVolumeSpecName "kube-api-access-ldkjh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:34:40 crc kubenswrapper[4922]: I1122 03:34:40.482549 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1054bc07-486a-48ac-9199-49bae8794a90-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "1054bc07-486a-48ac-9199-49bae8794a90" (UID: "1054bc07-486a-48ac-9199-49bae8794a90"). 
InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:34:40 crc kubenswrapper[4922]: I1122 03:34:40.483258 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1054bc07-486a-48ac-9199-49bae8794a90-ceph" (OuterVolumeSpecName: "ceph") pod "1054bc07-486a-48ac-9199-49bae8794a90" (UID: "1054bc07-486a-48ac-9199-49bae8794a90"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:34:40 crc kubenswrapper[4922]: I1122 03:34:40.503778 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1054bc07-486a-48ac-9199-49bae8794a90-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1054bc07-486a-48ac-9199-49bae8794a90" (UID: "1054bc07-486a-48ac-9199-49bae8794a90"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:34:40 crc kubenswrapper[4922]: I1122 03:34:40.509273 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1054bc07-486a-48ac-9199-49bae8794a90-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "1054bc07-486a-48ac-9199-49bae8794a90" (UID: "1054bc07-486a-48ac-9199-49bae8794a90"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:34:40 crc kubenswrapper[4922]: I1122 03:34:40.518166 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1054bc07-486a-48ac-9199-49bae8794a90-inventory" (OuterVolumeSpecName: "inventory") pod "1054bc07-486a-48ac-9199-49bae8794a90" (UID: "1054bc07-486a-48ac-9199-49bae8794a90"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:34:40 crc kubenswrapper[4922]: I1122 03:34:40.536106 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1054bc07-486a-48ac-9199-49bae8794a90-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "1054bc07-486a-48ac-9199-49bae8794a90" (UID: "1054bc07-486a-48ac-9199-49bae8794a90"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:34:40 crc kubenswrapper[4922]: I1122 03:34:40.544123 4922 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1054bc07-486a-48ac-9199-49bae8794a90-ceph\") on node \"crc\" DevicePath \"\"" Nov 22 03:34:40 crc kubenswrapper[4922]: I1122 03:34:40.544158 4922 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1054bc07-486a-48ac-9199-49bae8794a90-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Nov 22 03:34:40 crc kubenswrapper[4922]: I1122 03:34:40.544175 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1054bc07-486a-48ac-9199-49bae8794a90-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 03:34:40 crc kubenswrapper[4922]: I1122 03:34:40.544187 4922 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1054bc07-486a-48ac-9199-49bae8794a90-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 03:34:40 crc kubenswrapper[4922]: I1122 03:34:40.544198 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldkjh\" (UniqueName: \"kubernetes.io/projected/1054bc07-486a-48ac-9199-49bae8794a90-kube-api-access-ldkjh\") on node \"crc\" DevicePath \"\"" Nov 22 03:34:40 crc kubenswrapper[4922]: I1122 03:34:40.544212 4922 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1054bc07-486a-48ac-9199-49bae8794a90-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Nov 22 03:34:40 crc kubenswrapper[4922]: I1122 03:34:40.544223 4922 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1054bc07-486a-48ac-9199-49bae8794a90-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:34:41 crc kubenswrapper[4922]: I1122 03:34:41.003242 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gx2zt" event={"ID":"1054bc07-486a-48ac-9199-49bae8794a90","Type":"ContainerDied","Data":"6adc1dac19e44b7e9637648a0d1069ead8037cf46a5d93dd0be3087720992627"} Nov 22 03:34:41 crc kubenswrapper[4922]: I1122 03:34:41.003297 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6adc1dac19e44b7e9637648a0d1069ead8037cf46a5d93dd0be3087720992627" Nov 22 03:34:41 crc kubenswrapper[4922]: I1122 03:34:41.003946 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gx2zt" Nov 22 03:34:41 crc kubenswrapper[4922]: I1122 03:34:41.128426 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7lcvc"] Nov 22 03:34:41 crc kubenswrapper[4922]: E1122 03:34:41.130134 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1054bc07-486a-48ac-9199-49bae8794a90" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Nov 22 03:34:41 crc kubenswrapper[4922]: I1122 03:34:41.130172 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="1054bc07-486a-48ac-9199-49bae8794a90" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Nov 22 03:34:41 crc kubenswrapper[4922]: I1122 03:34:41.130534 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="1054bc07-486a-48ac-9199-49bae8794a90" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Nov 22 03:34:41 crc kubenswrapper[4922]: I1122 03:34:41.131555 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7lcvc" Nov 22 03:34:41 crc kubenswrapper[4922]: I1122 03:34:41.133880 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Nov 22 03:34:41 crc kubenswrapper[4922]: I1122 03:34:41.134240 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Nov 22 03:34:41 crc kubenswrapper[4922]: I1122 03:34:41.134402 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 22 03:34:41 crc kubenswrapper[4922]: I1122 03:34:41.134411 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 22 03:34:41 crc kubenswrapper[4922]: I1122 03:34:41.134576 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 03:34:41 crc kubenswrapper[4922]: I1122 03:34:41.134932 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wzhgr" Nov 22 03:34:41 crc kubenswrapper[4922]: I1122 03:34:41.151579 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7lcvc"] Nov 22 03:34:41 crc kubenswrapper[4922]: I1122 03:34:41.153901 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f3e6467e-b9e0-4e3f-a718-244e44628def-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7lcvc\" (UID: \"f3e6467e-b9e0-4e3f-a718-244e44628def\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7lcvc" Nov 22 03:34:41 crc kubenswrapper[4922]: I1122 03:34:41.154028 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3e6467e-b9e0-4e3f-a718-244e44628def-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7lcvc\" (UID: \"f3e6467e-b9e0-4e3f-a718-244e44628def\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7lcvc" Nov 22 03:34:41 crc kubenswrapper[4922]: I1122 03:34:41.154588 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/f3e6467e-b9e0-4e3f-a718-244e44628def-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7lcvc\" (UID: \"f3e6467e-b9e0-4e3f-a718-244e44628def\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7lcvc" Nov 22 03:34:41 crc kubenswrapper[4922]: I1122 03:34:41.154662 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/f3e6467e-b9e0-4e3f-a718-244e44628def-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7lcvc\" (UID: \"f3e6467e-b9e0-4e3f-a718-244e44628def\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7lcvc" Nov 22 03:34:41 crc kubenswrapper[4922]: I1122 03:34:41.154802 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f3e6467e-b9e0-4e3f-a718-244e44628def-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7lcvc\" (UID: \"f3e6467e-b9e0-4e3f-a718-244e44628def\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7lcvc" Nov 22 03:34:41 crc kubenswrapper[4922]: I1122 03:34:41.154877 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc665\" (UniqueName: \"kubernetes.io/projected/f3e6467e-b9e0-4e3f-a718-244e44628def-kube-api-access-hc665\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7lcvc\" (UID: \"f3e6467e-b9e0-4e3f-a718-244e44628def\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7lcvc" Nov 22 03:34:41 crc kubenswrapper[4922]: I1122 03:34:41.256723 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f3e6467e-b9e0-4e3f-a718-244e44628def-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7lcvc\" (UID: \"f3e6467e-b9e0-4e3f-a718-244e44628def\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7lcvc" Nov 22 03:34:41 crc kubenswrapper[4922]: I1122 03:34:41.256786 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hc665\" (UniqueName: \"kubernetes.io/projected/f3e6467e-b9e0-4e3f-a718-244e44628def-kube-api-access-hc665\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7lcvc\" (UID: \"f3e6467e-b9e0-4e3f-a718-244e44628def\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7lcvc" Nov 22 03:34:41 crc kubenswrapper[4922]: I1122 03:34:41.256819 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f3e6467e-b9e0-4e3f-a718-244e44628def-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7lcvc\" (UID: \"f3e6467e-b9e0-4e3f-a718-244e44628def\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7lcvc" Nov 22 03:34:41 crc kubenswrapper[4922]: I1122 03:34:41.256910 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3e6467e-b9e0-4e3f-a718-244e44628def-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7lcvc\" (UID: \"f3e6467e-b9e0-4e3f-a718-244e44628def\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7lcvc" Nov 22 03:34:41 crc kubenswrapper[4922]: I1122 03:34:41.257566 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/f3e6467e-b9e0-4e3f-a718-244e44628def-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7lcvc\" (UID: \"f3e6467e-b9e0-4e3f-a718-244e44628def\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7lcvc" Nov 22 03:34:41 crc kubenswrapper[4922]: I1122 03:34:41.257596 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/f3e6467e-b9e0-4e3f-a718-244e44628def-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7lcvc\" (UID: \"f3e6467e-b9e0-4e3f-a718-244e44628def\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7lcvc" Nov 22 03:34:41 crc kubenswrapper[4922]: I1122 03:34:41.262837 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3e6467e-b9e0-4e3f-a718-244e44628def-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7lcvc\" (UID: \"f3e6467e-b9e0-4e3f-a718-244e44628def\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7lcvc" Nov 22 03:34:41 crc kubenswrapper[4922]: I1122 03:34:41.264181 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/f3e6467e-b9e0-4e3f-a718-244e44628def-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7lcvc\" (UID: \"f3e6467e-b9e0-4e3f-a718-244e44628def\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7lcvc" Nov 22 03:34:41 crc kubenswrapper[4922]: I1122 03:34:41.264449 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f3e6467e-b9e0-4e3f-a718-244e44628def-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7lcvc\" (UID: \"f3e6467e-b9e0-4e3f-a718-244e44628def\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7lcvc" Nov 22 03:34:41 crc kubenswrapper[4922]: I1122 03:34:41.264730 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3e6467e-b9e0-4e3f-a718-244e44628def-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7lcvc\" (UID: \"f3e6467e-b9e0-4e3f-a718-244e44628def\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7lcvc" Nov 22 03:34:41 crc kubenswrapper[4922]: I1122 03:34:41.265612 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f3e6467e-b9e0-4e3f-a718-244e44628def-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7lcvc\" (UID: \"f3e6467e-b9e0-4e3f-a718-244e44628def\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7lcvc" Nov 22 03:34:41 crc kubenswrapper[4922]: I1122 03:34:41.292084 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc665\" (UniqueName: \"kubernetes.io/projected/f3e6467e-b9e0-4e3f-a718-244e44628def-kube-api-access-hc665\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-7lcvc\" (UID: \"f3e6467e-b9e0-4e3f-a718-244e44628def\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7lcvc" Nov 22 03:34:41 crc kubenswrapper[4922]: I1122 03:34:41.454878 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7lcvc" Nov 22 03:34:42 crc kubenswrapper[4922]: I1122 03:34:42.103994 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7lcvc"] Nov 22 03:34:43 crc kubenswrapper[4922]: I1122 03:34:43.028196 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7lcvc" event={"ID":"f3e6467e-b9e0-4e3f-a718-244e44628def","Type":"ContainerStarted","Data":"4303924e410777b4083b778317a335ea1997951910529c2bfa804782864bee1b"} Nov 22 03:34:43 crc kubenswrapper[4922]: I1122 03:34:43.028641 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7lcvc" event={"ID":"f3e6467e-b9e0-4e3f-a718-244e44628def","Type":"ContainerStarted","Data":"57db923d26344416120b6cc00f5d20f63163bb50d4a8874c92d3bba62215496f"} Nov 22 03:34:43 crc kubenswrapper[4922]: I1122 03:34:43.062443 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7lcvc" podStartSLOduration=1.6498677019999999 podStartE2EDuration="2.062418576s" podCreationTimestamp="2025-11-22 03:34:41 +0000 UTC" firstStartedPulling="2025-11-22 03:34:42.109620221 +0000 UTC m=+2518.148142123" lastFinishedPulling="2025-11-22 03:34:42.522171075 +0000 UTC m=+2518.560692997" observedRunningTime="2025-11-22 03:34:43.053461229 +0000 UTC m=+2519.091983151" watchObservedRunningTime="2025-11-22 03:34:43.062418576 +0000 UTC m=+2519.100940478" Nov 22 03:36:41 crc kubenswrapper[4922]: I1122 03:36:41.110604 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 03:36:41 crc kubenswrapper[4922]: I1122 03:36:41.111369 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 03:37:11 crc kubenswrapper[4922]: I1122 03:37:11.109260 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 03:37:11 crc kubenswrapper[4922]: I1122 03:37:11.109965 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 03:37:41 crc kubenswrapper[4922]: I1122 03:37:41.110047 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 03:37:41 crc kubenswrapper[4922]: I1122 03:37:41.110745 4922 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 03:37:41 crc kubenswrapper[4922]: I1122 03:37:41.110813 4922 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" Nov 22 03:37:41 crc kubenswrapper[4922]: I1122 03:37:41.111785 4922 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"338fcd43d92290b41bef050a341da63f101acb3dd1fb99c79edfa548fb79bd0b"} pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 03:37:41 crc kubenswrapper[4922]: I1122 03:37:41.111994 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" containerID="cri-o://338fcd43d92290b41bef050a341da63f101acb3dd1fb99c79edfa548fb79bd0b" gracePeriod=600 Nov 22 03:37:42 crc kubenswrapper[4922]: I1122 03:37:42.218146 4922 generic.go:334] "Generic (PLEG): container finished" podID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerID="338fcd43d92290b41bef050a341da63f101acb3dd1fb99c79edfa548fb79bd0b" exitCode=0 Nov 22 03:37:42 crc kubenswrapper[4922]: I1122 03:37:42.218229 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" event={"ID":"402683b1-a29f-4a79-a36c-daf6e8068d0d","Type":"ContainerDied","Data":"338fcd43d92290b41bef050a341da63f101acb3dd1fb99c79edfa548fb79bd0b"} Nov 22 03:37:42 crc kubenswrapper[4922]: I1122 03:37:42.218509 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" event={"ID":"402683b1-a29f-4a79-a36c-daf6e8068d0d","Type":"ContainerStarted","Data":"3902a53e42446835819551f58e9b8e408324dfa52a1818359fa0616ad95080dc"} Nov 22 03:37:42 crc kubenswrapper[4922]: I1122 03:37:42.218539 4922 scope.go:117] "RemoveContainer" containerID="9a3eb4d79c66bae819bf80ebc7311991f8b0b1ea20e5bf891e55d01a3bffaa78" Nov 22 03:39:15 crc kubenswrapper[4922]: I1122 03:39:15.203958 4922 generic.go:334] "Generic (PLEG): container finished" podID="f3e6467e-b9e0-4e3f-a718-244e44628def" containerID="4303924e410777b4083b778317a335ea1997951910529c2bfa804782864bee1b" exitCode=0 Nov 22 03:39:15 crc kubenswrapper[4922]: I1122 03:39:15.204050 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7lcvc" event={"ID":"f3e6467e-b9e0-4e3f-a718-244e44628def","Type":"ContainerDied","Data":"4303924e410777b4083b778317a335ea1997951910529c2bfa804782864bee1b"} Nov 22 03:39:16 crc kubenswrapper[4922]: I1122 03:39:16.660122 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7lcvc" Nov 22 03:39:16 crc kubenswrapper[4922]: I1122 03:39:16.834443 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3e6467e-b9e0-4e3f-a718-244e44628def-libvirt-combined-ca-bundle\") pod \"f3e6467e-b9e0-4e3f-a718-244e44628def\" (UID: \"f3e6467e-b9e0-4e3f-a718-244e44628def\") " Nov 22 03:39:16 crc kubenswrapper[4922]: I1122 03:39:16.834904 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f3e6467e-b9e0-4e3f-a718-244e44628def-ceph\") pod \"f3e6467e-b9e0-4e3f-a718-244e44628def\" (UID: \"f3e6467e-b9e0-4e3f-a718-244e44628def\") " Nov 22 03:39:16 crc kubenswrapper[4922]: I1122 03:39:16.834963 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3e6467e-b9e0-4e3f-a718-244e44628def-inventory\") pod \"f3e6467e-b9e0-4e3f-a718-244e44628def\" (UID: \"f3e6467e-b9e0-4e3f-a718-244e44628def\") " Nov 22 03:39:16 crc kubenswrapper[4922]: I1122 03:39:16.835033 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f3e6467e-b9e0-4e3f-a718-244e44628def-ssh-key\") pod \"f3e6467e-b9e0-4e3f-a718-244e44628def\" (UID: \"f3e6467e-b9e0-4e3f-a718-244e44628def\") " Nov 22 03:39:16 crc kubenswrapper[4922]: I1122 03:39:16.835087 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/f3e6467e-b9e0-4e3f-a718-244e44628def-libvirt-secret-0\") pod \"f3e6467e-b9e0-4e3f-a718-244e44628def\" (UID: \"f3e6467e-b9e0-4e3f-a718-244e44628def\") " Nov 22 03:39:16 crc kubenswrapper[4922]: I1122 03:39:16.835736 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hc665\" (UniqueName: \"kubernetes.io/projected/f3e6467e-b9e0-4e3f-a718-244e44628def-kube-api-access-hc665\") pod \"f3e6467e-b9e0-4e3f-a718-244e44628def\" (UID: \"f3e6467e-b9e0-4e3f-a718-244e44628def\") " Nov 22 03:39:16 crc kubenswrapper[4922]: I1122 03:39:16.842451 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3e6467e-b9e0-4e3f-a718-244e44628def-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "f3e6467e-b9e0-4e3f-a718-244e44628def" (UID: "f3e6467e-b9e0-4e3f-a718-244e44628def"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:39:16 crc kubenswrapper[4922]: I1122 03:39:16.844061 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3e6467e-b9e0-4e3f-a718-244e44628def-ceph" (OuterVolumeSpecName: "ceph") pod "f3e6467e-b9e0-4e3f-a718-244e44628def" (UID: "f3e6467e-b9e0-4e3f-a718-244e44628def"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:39:16 crc kubenswrapper[4922]: I1122 03:39:16.844821 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3e6467e-b9e0-4e3f-a718-244e44628def-kube-api-access-hc665" (OuterVolumeSpecName: "kube-api-access-hc665") pod "f3e6467e-b9e0-4e3f-a718-244e44628def" (UID: "f3e6467e-b9e0-4e3f-a718-244e44628def"). InnerVolumeSpecName "kube-api-access-hc665". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:39:16 crc kubenswrapper[4922]: I1122 03:39:16.865860 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3e6467e-b9e0-4e3f-a718-244e44628def-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f3e6467e-b9e0-4e3f-a718-244e44628def" (UID: "f3e6467e-b9e0-4e3f-a718-244e44628def"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:39:16 crc kubenswrapper[4922]: I1122 03:39:16.885582 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3e6467e-b9e0-4e3f-a718-244e44628def-inventory" (OuterVolumeSpecName: "inventory") pod "f3e6467e-b9e0-4e3f-a718-244e44628def" (UID: "f3e6467e-b9e0-4e3f-a718-244e44628def"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:39:16 crc kubenswrapper[4922]: I1122 03:39:16.887185 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3e6467e-b9e0-4e3f-a718-244e44628def-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "f3e6467e-b9e0-4e3f-a718-244e44628def" (UID: "f3e6467e-b9e0-4e3f-a718-244e44628def"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:39:16 crc kubenswrapper[4922]: I1122 03:39:16.938240 4922 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f3e6467e-b9e0-4e3f-a718-244e44628def-ceph\") on node \"crc\" DevicePath \"\"" Nov 22 03:39:16 crc kubenswrapper[4922]: I1122 03:39:16.938274 4922 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3e6467e-b9e0-4e3f-a718-244e44628def-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 03:39:16 crc kubenswrapper[4922]: I1122 03:39:16.938289 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f3e6467e-b9e0-4e3f-a718-244e44628def-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 03:39:16 crc kubenswrapper[4922]: I1122 03:39:16.938298 4922 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/f3e6467e-b9e0-4e3f-a718-244e44628def-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Nov 22 03:39:16 crc kubenswrapper[4922]: I1122 03:39:16.938312 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hc665\" (UniqueName: \"kubernetes.io/projected/f3e6467e-b9e0-4e3f-a718-244e44628def-kube-api-access-hc665\") on node \"crc\" DevicePath \"\"" Nov 22 03:39:16 crc kubenswrapper[4922]: I1122 03:39:16.938325 4922 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3e6467e-b9e0-4e3f-a718-244e44628def-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:39:17 crc kubenswrapper[4922]: I1122 03:39:17.227059 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7lcvc" event={"ID":"f3e6467e-b9e0-4e3f-a718-244e44628def","Type":"ContainerDied","Data":"57db923d26344416120b6cc00f5d20f63163bb50d4a8874c92d3bba62215496f"} Nov 22 03:39:17 crc kubenswrapper[4922]: I1122 03:39:17.227105 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57db923d26344416120b6cc00f5d20f63163bb50d4a8874c92d3bba62215496f" Nov 22 03:39:17 crc kubenswrapper[4922]: I1122 03:39:17.227167 4922 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-7lcvc" Nov 22 03:39:17 crc kubenswrapper[4922]: I1122 03:39:17.331452 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg"] Nov 22 03:39:17 crc kubenswrapper[4922]: E1122 03:39:17.331920 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3e6467e-b9e0-4e3f-a718-244e44628def" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Nov 22 03:39:17 crc kubenswrapper[4922]: I1122 03:39:17.331941 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3e6467e-b9e0-4e3f-a718-244e44628def" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Nov 22 03:39:17 crc kubenswrapper[4922]: I1122 03:39:17.332181 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3e6467e-b9e0-4e3f-a718-244e44628def" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Nov 22 03:39:17 crc kubenswrapper[4922]: I1122 03:39:17.332829 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg" Nov 22 03:39:17 crc kubenswrapper[4922]: I1122 03:39:17.334818 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Nov 22 03:39:17 crc kubenswrapper[4922]: I1122 03:39:17.338136 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ceph-nova" Nov 22 03:39:17 crc kubenswrapper[4922]: I1122 03:39:17.338524 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 03:39:17 crc kubenswrapper[4922]: I1122 03:39:17.339048 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Nov 22 03:39:17 crc kubenswrapper[4922]: I1122 03:39:17.339308 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 22 03:39:17 crc kubenswrapper[4922]: I1122 03:39:17.339577 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Nov 22 03:39:17 crc kubenswrapper[4922]: I1122 03:39:17.339813 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wzhgr" Nov 22 03:39:17 crc kubenswrapper[4922]: I1122 03:39:17.341957 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Nov 22 03:39:17 crc kubenswrapper[4922]: I1122 03:39:17.347932 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 22 03:39:17 crc kubenswrapper[4922]: I1122 03:39:17.357041 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg"] Nov 22 03:39:17 crc kubenswrapper[4922]: E1122 03:39:17.423590 4922 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3e6467e_b9e0_4e3f_a718_244e44628def.slice\": RecentStats: unable to find data in memory cache]" Nov 22 03:39:17 crc kubenswrapper[4922]: I1122 03:39:17.449244 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2kgg\" (UniqueName: 
\"kubernetes.io/projected/451dc20b-5cce-4f72-821b-f08403bed351-kube-api-access-m2kgg\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg\" (UID: \"451dc20b-5cce-4f72-821b-f08403bed351\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg" Nov 22 03:39:17 crc kubenswrapper[4922]: I1122 03:39:17.449770 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/451dc20b-5cce-4f72-821b-f08403bed351-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg\" (UID: \"451dc20b-5cce-4f72-821b-f08403bed351\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg" Nov 22 03:39:17 crc kubenswrapper[4922]: I1122 03:39:17.449817 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/451dc20b-5cce-4f72-821b-f08403bed351-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg\" (UID: \"451dc20b-5cce-4f72-821b-f08403bed351\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg" Nov 22 03:39:17 crc kubenswrapper[4922]: I1122 03:39:17.449873 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/451dc20b-5cce-4f72-821b-f08403bed351-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg\" (UID: \"451dc20b-5cce-4f72-821b-f08403bed351\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg" Nov 22 03:39:17 crc kubenswrapper[4922]: I1122 03:39:17.449908 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/451dc20b-5cce-4f72-821b-f08403bed351-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg\" (UID: \"451dc20b-5cce-4f72-821b-f08403bed351\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg" Nov 22 03:39:17 crc kubenswrapper[4922]: I1122 03:39:17.449952 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/451dc20b-5cce-4f72-821b-f08403bed351-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg\" (UID: \"451dc20b-5cce-4f72-821b-f08403bed351\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg" Nov 22 03:39:17 crc kubenswrapper[4922]: I1122 03:39:17.449977 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/451dc20b-5cce-4f72-821b-f08403bed351-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg\" (UID: \"451dc20b-5cce-4f72-821b-f08403bed351\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg" Nov 22 03:39:17 crc kubenswrapper[4922]: I1122 03:39:17.450009 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/451dc20b-5cce-4f72-821b-f08403bed351-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg\" (UID: \"451dc20b-5cce-4f72-821b-f08403bed351\") " 
pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg" Nov 22 03:39:17 crc kubenswrapper[4922]: I1122 03:39:17.450051 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/451dc20b-5cce-4f72-821b-f08403bed351-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg\" (UID: \"451dc20b-5cce-4f72-821b-f08403bed351\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg" Nov 22 03:39:17 crc kubenswrapper[4922]: I1122 03:39:17.450100 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/451dc20b-5cce-4f72-821b-f08403bed351-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg\" (UID: \"451dc20b-5cce-4f72-821b-f08403bed351\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg" Nov 22 03:39:17 crc kubenswrapper[4922]: I1122 03:39:17.450145 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/451dc20b-5cce-4f72-821b-f08403bed351-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg\" (UID: \"451dc20b-5cce-4f72-821b-f08403bed351\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg" Nov 22 03:39:17 crc kubenswrapper[4922]: I1122 03:39:17.551442 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/451dc20b-5cce-4f72-821b-f08403bed351-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg\" (UID: \"451dc20b-5cce-4f72-821b-f08403bed351\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg" Nov 22 03:39:17 crc kubenswrapper[4922]: I1122 03:39:17.551507 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/451dc20b-5cce-4f72-821b-f08403bed351-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg\" (UID: \"451dc20b-5cce-4f72-821b-f08403bed351\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg" Nov 22 03:39:17 crc kubenswrapper[4922]: I1122 03:39:17.551542 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/451dc20b-5cce-4f72-821b-f08403bed351-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg\" (UID: \"451dc20b-5cce-4f72-821b-f08403bed351\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg" Nov 22 03:39:17 crc kubenswrapper[4922]: I1122 03:39:17.551578 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/451dc20b-5cce-4f72-821b-f08403bed351-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg\" (UID: \"451dc20b-5cce-4f72-821b-f08403bed351\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg" Nov 22 03:39:17 crc kubenswrapper[4922]: I1122 03:39:17.551606 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: 
\"kubernetes.io/secret/451dc20b-5cce-4f72-821b-f08403bed351-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg\" (UID: \"451dc20b-5cce-4f72-821b-f08403bed351\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg" Nov 22 03:39:17 crc kubenswrapper[4922]: I1122 03:39:17.551641 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/451dc20b-5cce-4f72-821b-f08403bed351-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg\" (UID: \"451dc20b-5cce-4f72-821b-f08403bed351\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg" Nov 22 03:39:17 crc kubenswrapper[4922]: I1122 03:39:17.551668 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/451dc20b-5cce-4f72-821b-f08403bed351-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg\" (UID: \"451dc20b-5cce-4f72-821b-f08403bed351\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg" Nov 22 03:39:17 crc kubenswrapper[4922]: I1122 03:39:17.551694 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/451dc20b-5cce-4f72-821b-f08403bed351-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg\" (UID: \"451dc20b-5cce-4f72-821b-f08403bed351\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg" Nov 22 03:39:17 crc kubenswrapper[4922]: I1122 03:39:17.551723 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/451dc20b-5cce-4f72-821b-f08403bed351-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg\" (UID: \"451dc20b-5cce-4f72-821b-f08403bed351\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg" Nov 22 03:39:17 crc kubenswrapper[4922]: I1122 03:39:17.551768 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2kgg\" (UniqueName: \"kubernetes.io/projected/451dc20b-5cce-4f72-821b-f08403bed351-kube-api-access-m2kgg\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg\" (UID: \"451dc20b-5cce-4f72-821b-f08403bed351\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg" Nov 22 03:39:17 crc kubenswrapper[4922]: I1122 03:39:17.551874 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/451dc20b-5cce-4f72-821b-f08403bed351-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg\" (UID: \"451dc20b-5cce-4f72-821b-f08403bed351\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg" Nov 22 03:39:17 crc kubenswrapper[4922]: I1122 03:39:17.552910 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/451dc20b-5cce-4f72-821b-f08403bed351-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg\" (UID: \"451dc20b-5cce-4f72-821b-f08403bed351\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg" Nov 22 03:39:17 crc kubenswrapper[4922]: I1122 03:39:17.553022 4922 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/451dc20b-5cce-4f72-821b-f08403bed351-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg\" (UID: \"451dc20b-5cce-4f72-821b-f08403bed351\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg" Nov 22 03:39:17 crc kubenswrapper[4922]: I1122 03:39:17.558083 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/451dc20b-5cce-4f72-821b-f08403bed351-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg\" (UID: \"451dc20b-5cce-4f72-821b-f08403bed351\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg" Nov 22 03:39:17 crc kubenswrapper[4922]: I1122 03:39:17.558309 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/451dc20b-5cce-4f72-821b-f08403bed351-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg\" (UID: \"451dc20b-5cce-4f72-821b-f08403bed351\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg" Nov 22 03:39:17 crc kubenswrapper[4922]: I1122 03:39:17.558769 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/451dc20b-5cce-4f72-821b-f08403bed351-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg\" (UID: \"451dc20b-5cce-4f72-821b-f08403bed351\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg" Nov 22 03:39:17 crc kubenswrapper[4922]: I1122 03:39:17.559272 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/451dc20b-5cce-4f72-821b-f08403bed351-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg\" (UID: \"451dc20b-5cce-4f72-821b-f08403bed351\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg" Nov 22 03:39:17 crc kubenswrapper[4922]: I1122 03:39:17.559675 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/451dc20b-5cce-4f72-821b-f08403bed351-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg\" (UID: \"451dc20b-5cce-4f72-821b-f08403bed351\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg" Nov 22 03:39:17 crc kubenswrapper[4922]: I1122 03:39:17.561548 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/451dc20b-5cce-4f72-821b-f08403bed351-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg\" (UID: \"451dc20b-5cce-4f72-821b-f08403bed351\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg" Nov 22 03:39:17 crc kubenswrapper[4922]: I1122 03:39:17.564820 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/451dc20b-5cce-4f72-821b-f08403bed351-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg\" (UID: \"451dc20b-5cce-4f72-821b-f08403bed351\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg" Nov 22 03:39:17 crc kubenswrapper[4922]: 
I1122 03:39:17.566270 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/451dc20b-5cce-4f72-821b-f08403bed351-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg\" (UID: \"451dc20b-5cce-4f72-821b-f08403bed351\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg"
Nov 22 03:39:17 crc kubenswrapper[4922]: I1122 03:39:17.572726 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2kgg\" (UniqueName: \"kubernetes.io/projected/451dc20b-5cce-4f72-821b-f08403bed351-kube-api-access-m2kgg\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg\" (UID: \"451dc20b-5cce-4f72-821b-f08403bed351\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg"
Nov 22 03:39:17 crc kubenswrapper[4922]: I1122 03:39:17.708064 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg"
Nov 22 03:39:18 crc kubenswrapper[4922]: W1122 03:39:18.237079 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod451dc20b_5cce_4f72_821b_f08403bed351.slice/crio-9121746db3850ce2fd015ae5ae08f72a4f1fc6c3c9555f97a8613e627e0ad36e WatchSource:0}: Error finding container 9121746db3850ce2fd015ae5ae08f72a4f1fc6c3c9555f97a8613e627e0ad36e: Status 404 returned error can't find the container with id 9121746db3850ce2fd015ae5ae08f72a4f1fc6c3c9555f97a8613e627e0ad36e
Nov 22 03:39:18 crc kubenswrapper[4922]: I1122 03:39:18.241557 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg"]
Nov 22 03:39:18 crc kubenswrapper[4922]: I1122 03:39:18.244183 4922 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Nov 22 03:39:19 crc kubenswrapper[4922]: I1122 03:39:19.251983 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg" event={"ID":"451dc20b-5cce-4f72-821b-f08403bed351","Type":"ContainerStarted","Data":"a88f7c7d031db3f6b2fa4e66d41191dc8609c7e6461a475edd5bd61e7cd8cf0c"}
Nov 22 03:39:19 crc kubenswrapper[4922]: I1122 03:39:19.252328 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg" event={"ID":"451dc20b-5cce-4f72-821b-f08403bed351","Type":"ContainerStarted","Data":"9121746db3850ce2fd015ae5ae08f72a4f1fc6c3c9555f97a8613e627e0ad36e"}
Nov 22 03:39:19 crc kubenswrapper[4922]: I1122 03:39:19.282644 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg" podStartSLOduration=1.643777385 podStartE2EDuration="2.282621317s" podCreationTimestamp="2025-11-22 03:39:17 +0000 UTC" firstStartedPulling="2025-11-22 03:39:18.243729594 +0000 UTC m=+2794.282251486" lastFinishedPulling="2025-11-22 03:39:18.882573526 +0000 UTC m=+2794.921095418" observedRunningTime="2025-11-22 03:39:19.274772577 +0000 UTC m=+2795.313294479" watchObservedRunningTime="2025-11-22 03:39:19.282621317 +0000 UTC m=+2795.321143229"
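The "Observed pod startup duration" entry above carries the kubelet's startup-latency accounting: podStartSLOduration is computed net of image pulls, podStartE2EDuration includes them, and the difference here (2.282621317 - 1.643777385 = 0.638843932 s) matches the firstStartedPulling to lastFinishedPulling window exactly. A minimal stdlib-only sketch for pulling these fields out of such a line (function and key names are mine, not the kubelet's):

```python
# Sketch (stdlib only; names are illustrative, not the kubelet's):
# extract the startup-latency fields from one tracker entry.
import re

def parse_startup(line: str) -> dict:
    """Extract pod, SLO duration, and e2e duration from a tracker line."""
    pod = re.search(r'pod="([^"]+)"', line)
    slo = re.search(r'podStartSLOduration=([0-9.]+)', line)
    e2e = re.search(r'podStartE2EDuration="([0-9.]+)s"', line)
    if not (pod and slo and e2e):
        return {}
    out = {"pod": pod.group(1),
           "slo_s": float(slo.group(1)),
           "e2e_s": float(e2e.group(1))}
    # e2e minus SLO approximates the image-pull window
    out["pull_s"] = round(out["e2e_s"] - out["slo_s"], 9)
    return out

line = ('pod_startup_latency_tracker.go:104] "Observed pod startup duration" '
        'pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg" '
        'podStartSLOduration=1.643777385 podStartE2EDuration="2.282621317s"')
print(parse_startup(line))  # pull_s == 0.638843932, the pull window above
```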
Nov 22 03:39:41 crc kubenswrapper[4922]: I1122 03:39:41.109367 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 03:39:41 crc kubenswrapper[4922]: I1122 03:39:41.111153 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 03:40:11 crc kubenswrapper[4922]: I1122 03:40:11.109386 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 03:40:11 crc kubenswrapper[4922]: I1122 03:40:11.110071 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 03:40:41 crc kubenswrapper[4922]: I1122 03:40:41.109989 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 03:40:41 crc kubenswrapper[4922]: I1122 03:40:41.110656 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 03:40:41 crc kubenswrapper[4922]: I1122 03:40:41.110747 4922 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n"
Nov 22 03:40:41 crc kubenswrapper[4922]: I1122 03:40:41.111795 4922 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3902a53e42446835819551f58e9b8e408324dfa52a1818359fa0616ad95080dc"} pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 22 03:40:41 crc kubenswrapper[4922]: I1122 03:40:41.111961 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" containerID="cri-o://3902a53e42446835819551f58e9b8e408324dfa52a1818359fa0616ad95080dc" gracePeriod=600
Nov 22 03:40:41 crc kubenswrapper[4922]: E1122 03:40:41.242157 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d"
Nov 22 03:40:42 crc kubenswrapper[4922]: I1122 03:40:42.135081 4922 generic.go:334] "Generic (PLEG): container finished" podID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerID="3902a53e42446835819551f58e9b8e408324dfa52a1818359fa0616ad95080dc" exitCode=0
Nov 22 03:40:42 crc kubenswrapper[4922]: I1122 03:40:42.135155 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" event={"ID":"402683b1-a29f-4a79-a36c-daf6e8068d0d","Type":"ContainerDied","Data":"3902a53e42446835819551f58e9b8e408324dfa52a1818359fa0616ad95080dc"}
Nov 22 03:40:42 crc kubenswrapper[4922]: I1122 03:40:42.135668 4922 scope.go:117] "RemoveContainer" containerID="338fcd43d92290b41bef050a341da63f101acb3dd1fb99c79edfa548fb79bd0b"
Nov 22 03:40:42 crc kubenswrapper[4922]: I1122 03:40:42.136996 4922 scope.go:117] "RemoveContainer" containerID="3902a53e42446835819551f58e9b8e408324dfa52a1818359fa0616ad95080dc"
Nov 22 03:40:42 crc kubenswrapper[4922]: E1122 03:40:42.137584 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d"
Nov 22 03:40:55 crc kubenswrapper[4922]: I1122 03:40:55.313747 4922 scope.go:117] "RemoveContainer" containerID="3902a53e42446835819551f58e9b8e408324dfa52a1818359fa0616ad95080dc"
Nov 22 03:40:55 crc kubenswrapper[4922]: E1122 03:40:55.315197 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d"
Nov 22 03:41:06 crc kubenswrapper[4922]: I1122 03:41:06.301580 4922 scope.go:117] "RemoveContainer" containerID="3902a53e42446835819551f58e9b8e408324dfa52a1818359fa0616ad95080dc"
Nov 22 03:41:06 crc kubenswrapper[4922]: E1122 03:41:06.304890 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d"
Nov 22 03:41:20 crc kubenswrapper[4922]: I1122 03:41:20.300474 4922 scope.go:117] "RemoveContainer" containerID="3902a53e42446835819551f58e9b8e408324dfa52a1818359fa0616ad95080dc"
Nov 22 03:41:20 crc kubenswrapper[4922]: E1122 03:41:20.301572 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d"
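The repeating "RemoveContainer" / "Error syncing pod" pairs above (and continuing below) are the pod worker re-evaluating machine-config-daemon-b9j6n while the restart backoff holds the container down; "back-off 5m0s" in each error indicates the backoff has reached its cap. A small sketch of the doubling-with-cap policy this implies (the 10 s base is the kubelet's documented default, not something read from this log):

```python
# Sketch of the doubling-with-cap restart backoff implied by the
# "back-off 5m0s restarting failed container" errors above. The 10 s
# base is the kubelet's documented default, not something in this log.
def backoff_delays(base_s=10.0, cap_s=300.0, n=7):
    """First n restart delays: base, 2*base, 4*base, ... capped at cap_s."""
    delay = base_s
    for _ in range(n):
        yield min(delay, cap_s)
        delay *= 2

print(list(backoff_delays()))  # [10.0, 20.0, 40.0, 80.0, 160.0, 300.0, 300.0]
```

The 11 to 14 second spacing of the logged pairs appears to be the sync loop being retriggered and logging the refusal; the actual restart stays gated behind the capped 5m0s timer.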
containerID="3902a53e42446835819551f58e9b8e408324dfa52a1818359fa0616ad95080dc" Nov 22 03:41:33 crc kubenswrapper[4922]: E1122 03:41:33.303386 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:41:45 crc kubenswrapper[4922]: I1122 03:41:45.311911 4922 scope.go:117] "RemoveContainer" containerID="3902a53e42446835819551f58e9b8e408324dfa52a1818359fa0616ad95080dc" Nov 22 03:41:45 crc kubenswrapper[4922]: E1122 03:41:45.312809 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:41:48 crc kubenswrapper[4922]: I1122 03:41:48.256943 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-b9s4h"] Nov 22 03:41:48 crc kubenswrapper[4922]: I1122 03:41:48.261422 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b9s4h" Nov 22 03:41:48 crc kubenswrapper[4922]: I1122 03:41:48.323549 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b9s4h"] Nov 22 03:41:48 crc kubenswrapper[4922]: I1122 03:41:48.413124 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/015080db-0f7e-4145-a70f-f479abafb674-catalog-content\") pod \"community-operators-b9s4h\" (UID: \"015080db-0f7e-4145-a70f-f479abafb674\") " pod="openshift-marketplace/community-operators-b9s4h" Nov 22 03:41:48 crc kubenswrapper[4922]: I1122 03:41:48.413265 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7q7f\" (UniqueName: \"kubernetes.io/projected/015080db-0f7e-4145-a70f-f479abafb674-kube-api-access-b7q7f\") pod \"community-operators-b9s4h\" (UID: \"015080db-0f7e-4145-a70f-f479abafb674\") " pod="openshift-marketplace/community-operators-b9s4h" Nov 22 03:41:48 crc kubenswrapper[4922]: I1122 03:41:48.413694 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/015080db-0f7e-4145-a70f-f479abafb674-utilities\") pod \"community-operators-b9s4h\" (UID: \"015080db-0f7e-4145-a70f-f479abafb674\") " pod="openshift-marketplace/community-operators-b9s4h" Nov 22 03:41:48 crc kubenswrapper[4922]: I1122 03:41:48.514815 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/015080db-0f7e-4145-a70f-f479abafb674-catalog-content\") pod \"community-operators-b9s4h\" (UID: \"015080db-0f7e-4145-a70f-f479abafb674\") " pod="openshift-marketplace/community-operators-b9s4h" Nov 22 03:41:48 crc kubenswrapper[4922]: I1122 03:41:48.514869 4922 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-b7q7f\" (UniqueName: \"kubernetes.io/projected/015080db-0f7e-4145-a70f-f479abafb674-kube-api-access-b7q7f\") pod \"community-operators-b9s4h\" (UID: \"015080db-0f7e-4145-a70f-f479abafb674\") " pod="openshift-marketplace/community-operators-b9s4h" Nov 22 03:41:48 crc kubenswrapper[4922]: I1122 03:41:48.514958 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/015080db-0f7e-4145-a70f-f479abafb674-utilities\") pod \"community-operators-b9s4h\" (UID: \"015080db-0f7e-4145-a70f-f479abafb674\") " pod="openshift-marketplace/community-operators-b9s4h" Nov 22 03:41:48 crc kubenswrapper[4922]: I1122 03:41:48.515568 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/015080db-0f7e-4145-a70f-f479abafb674-utilities\") pod \"community-operators-b9s4h\" (UID: \"015080db-0f7e-4145-a70f-f479abafb674\") " pod="openshift-marketplace/community-operators-b9s4h" Nov 22 03:41:48 crc kubenswrapper[4922]: I1122 03:41:48.515576 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/015080db-0f7e-4145-a70f-f479abafb674-catalog-content\") pod \"community-operators-b9s4h\" (UID: \"015080db-0f7e-4145-a70f-f479abafb674\") " pod="openshift-marketplace/community-operators-b9s4h" Nov 22 03:41:48 crc kubenswrapper[4922]: I1122 03:41:48.540012 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7q7f\" (UniqueName: \"kubernetes.io/projected/015080db-0f7e-4145-a70f-f479abafb674-kube-api-access-b7q7f\") pod \"community-operators-b9s4h\" (UID: \"015080db-0f7e-4145-a70f-f479abafb674\") " pod="openshift-marketplace/community-operators-b9s4h" Nov 22 03:41:48 crc kubenswrapper[4922]: I1122 03:41:48.647009 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b9s4h"
Nov 22 03:41:49 crc kubenswrapper[4922]: I1122 03:41:49.169864 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b9s4h"]
Nov 22 03:41:49 crc kubenswrapper[4922]: I1122 03:41:49.864313 4922 generic.go:334] "Generic (PLEG): container finished" podID="015080db-0f7e-4145-a70f-f479abafb674" containerID="b58ae2adeec2759ff9ac634d556f92e74c00074535cfac8e428037131b2fe970" exitCode=0
Nov 22 03:41:49 crc kubenswrapper[4922]: I1122 03:41:49.864376 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9s4h" event={"ID":"015080db-0f7e-4145-a70f-f479abafb674","Type":"ContainerDied","Data":"b58ae2adeec2759ff9ac634d556f92e74c00074535cfac8e428037131b2fe970"}
Nov 22 03:41:49 crc kubenswrapper[4922]: I1122 03:41:49.864597 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9s4h" event={"ID":"015080db-0f7e-4145-a70f-f479abafb674","Type":"ContainerStarted","Data":"11d71cf5ba60f6f2c61107e1660215a1cdd221e79bf2a0f330f3e3d25668b777"}
Nov 22 03:41:51 crc kubenswrapper[4922]: I1122 03:41:51.891148 4922 generic.go:334] "Generic (PLEG): container finished" podID="015080db-0f7e-4145-a70f-f479abafb674" containerID="d17a013062b65b71ca65905caeabdf9630b11f66837abd7ae7aa2f7ff4991f57" exitCode=0
Nov 22 03:41:51 crc kubenswrapper[4922]: I1122 03:41:51.891249 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9s4h" event={"ID":"015080db-0f7e-4145-a70f-f479abafb674","Type":"ContainerDied","Data":"d17a013062b65b71ca65905caeabdf9630b11f66837abd7ae7aa2f7ff4991f57"}
Nov 22 03:41:53 crc kubenswrapper[4922]: I1122 03:41:53.916561 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9s4h" event={"ID":"015080db-0f7e-4145-a70f-f479abafb674","Type":"ContainerStarted","Data":"e4172101e6df99d08c4768b3c771214ed1362d1a4e040d9a25cc6d5a6358e886"}
Nov 22 03:41:53 crc kubenswrapper[4922]: I1122 03:41:53.940436 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-b9s4h" podStartSLOduration=2.985818117 podStartE2EDuration="5.940417336s" podCreationTimestamp="2025-11-22 03:41:48 +0000 UTC" firstStartedPulling="2025-11-22 03:41:49.866542236 +0000 UTC m=+2945.905064148" lastFinishedPulling="2025-11-22 03:41:52.821141455 +0000 UTC m=+2948.859663367" observedRunningTime="2025-11-22 03:41:53.932572577 +0000 UTC m=+2949.971094489" watchObservedRunningTime="2025-11-22 03:41:53.940417336 +0000 UTC m=+2949.978939228"
Nov 22 03:41:58 crc kubenswrapper[4922]: I1122 03:41:58.647317 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-b9s4h"
Nov 22 03:41:58 crc kubenswrapper[4922]: I1122 03:41:58.647858 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-b9s4h"
Nov 22 03:41:58 crc kubenswrapper[4922]: I1122 03:41:58.731823 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-b9s4h"
Nov 22 03:41:59 crc kubenswrapper[4922]: I1122 03:41:59.014677 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-b9s4h"
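The PLEG sequence above is the usual catalog-pod lifecycle: two short-lived containers exit 0 in turn (extract-utilities and extract-content, per the RemoveStaleState entries further down in this log), what appears to be the pod sandbox 11d71cf5... comes up, and registry-server e4172101... starts and goes ready. A sketch that folds such lines into per-container timelines (the regex assumes only the event layout visible here, and the helper names are mine):

```python
# Sketch: fold PLEG lines like the ones above into per-container event
# timelines. The regex assumes only the layout visible in this log:
# 'I1122 <time> ... event={"ID":"<pod uid>","Type":"...","Data":"<id>"}'
import re
from collections import defaultdict

EVENT = re.compile(r'I\d{4} ([\d:.]+).*?event=\{"ID":"([0-9a-f-]+)",'
                   r'"Type":"(Container\w+)","Data":"([0-9a-f]+)"\}')

def timelines(lines):
    """Map (pod UID, container ID) -> [(time, ContainerStarted/Died), ...]."""
    out = defaultdict(list)
    for line in lines:
        for ts, pod_uid, etype, cid in EVENT.findall(line):
            out[(pod_uid, cid)].append((ts, etype))
    return out
```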
source="api" pods=["openshift-marketplace/community-operators-b9s4h"] Nov 22 03:41:59 crc kubenswrapper[4922]: I1122 03:41:59.300519 4922 scope.go:117] "RemoveContainer" containerID="3902a53e42446835819551f58e9b8e408324dfa52a1818359fa0616ad95080dc" Nov 22 03:41:59 crc kubenswrapper[4922]: E1122 03:41:59.300811 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:42:00 crc kubenswrapper[4922]: I1122 03:42:00.984447 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-b9s4h" podUID="015080db-0f7e-4145-a70f-f479abafb674" containerName="registry-server" containerID="cri-o://e4172101e6df99d08c4768b3c771214ed1362d1a4e040d9a25cc6d5a6358e886" gracePeriod=2 Nov 22 03:42:01 crc kubenswrapper[4922]: I1122 03:42:01.993974 4922 generic.go:334] "Generic (PLEG): container finished" podID="015080db-0f7e-4145-a70f-f479abafb674" containerID="e4172101e6df99d08c4768b3c771214ed1362d1a4e040d9a25cc6d5a6358e886" exitCode=0 Nov 22 03:42:01 crc kubenswrapper[4922]: I1122 03:42:01.994030 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9s4h" event={"ID":"015080db-0f7e-4145-a70f-f479abafb674","Type":"ContainerDied","Data":"e4172101e6df99d08c4768b3c771214ed1362d1a4e040d9a25cc6d5a6358e886"} Nov 22 03:42:01 crc kubenswrapper[4922]: I1122 03:42:01.994059 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b9s4h" event={"ID":"015080db-0f7e-4145-a70f-f479abafb674","Type":"ContainerDied","Data":"11d71cf5ba60f6f2c61107e1660215a1cdd221e79bf2a0f330f3e3d25668b777"} Nov 22 03:42:01 crc kubenswrapper[4922]: I1122 03:42:01.994072 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11d71cf5ba60f6f2c61107e1660215a1cdd221e79bf2a0f330f3e3d25668b777" Nov 22 03:42:02 crc kubenswrapper[4922]: I1122 03:42:02.050958 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b9s4h" Nov 22 03:42:02 crc kubenswrapper[4922]: I1122 03:42:02.123948 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7q7f\" (UniqueName: \"kubernetes.io/projected/015080db-0f7e-4145-a70f-f479abafb674-kube-api-access-b7q7f\") pod \"015080db-0f7e-4145-a70f-f479abafb674\" (UID: \"015080db-0f7e-4145-a70f-f479abafb674\") " Nov 22 03:42:02 crc kubenswrapper[4922]: I1122 03:42:02.124285 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/015080db-0f7e-4145-a70f-f479abafb674-catalog-content\") pod \"015080db-0f7e-4145-a70f-f479abafb674\" (UID: \"015080db-0f7e-4145-a70f-f479abafb674\") " Nov 22 03:42:02 crc kubenswrapper[4922]: I1122 03:42:02.124400 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/015080db-0f7e-4145-a70f-f479abafb674-utilities\") pod \"015080db-0f7e-4145-a70f-f479abafb674\" (UID: \"015080db-0f7e-4145-a70f-f479abafb674\") " Nov 22 03:42:02 crc kubenswrapper[4922]: I1122 03:42:02.125819 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/015080db-0f7e-4145-a70f-f479abafb674-utilities" (OuterVolumeSpecName: "utilities") pod "015080db-0f7e-4145-a70f-f479abafb674" (UID: "015080db-0f7e-4145-a70f-f479abafb674"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:42:02 crc kubenswrapper[4922]: I1122 03:42:02.129999 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/015080db-0f7e-4145-a70f-f479abafb674-kube-api-access-b7q7f" (OuterVolumeSpecName: "kube-api-access-b7q7f") pod "015080db-0f7e-4145-a70f-f479abafb674" (UID: "015080db-0f7e-4145-a70f-f479abafb674"). InnerVolumeSpecName "kube-api-access-b7q7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:42:02 crc kubenswrapper[4922]: I1122 03:42:02.226673 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7q7f\" (UniqueName: \"kubernetes.io/projected/015080db-0f7e-4145-a70f-f479abafb674-kube-api-access-b7q7f\") on node \"crc\" DevicePath \"\"" Nov 22 03:42:02 crc kubenswrapper[4922]: I1122 03:42:02.226724 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/015080db-0f7e-4145-a70f-f479abafb674-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 03:42:03 crc kubenswrapper[4922]: I1122 03:42:03.005984 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b9s4h" Nov 22 03:42:03 crc kubenswrapper[4922]: I1122 03:42:03.006787 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/015080db-0f7e-4145-a70f-f479abafb674-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "015080db-0f7e-4145-a70f-f479abafb674" (UID: "015080db-0f7e-4145-a70f-f479abafb674"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:42:03 crc kubenswrapper[4922]: I1122 03:42:03.043473 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/015080db-0f7e-4145-a70f-f479abafb674-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 03:42:03 crc kubenswrapper[4922]: I1122 03:42:03.359510 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b9s4h"] Nov 22 03:42:03 crc kubenswrapper[4922]: I1122 03:42:03.365392 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-b9s4h"] Nov 22 03:42:05 crc kubenswrapper[4922]: I1122 03:42:05.315510 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="015080db-0f7e-4145-a70f-f479abafb674" path="/var/lib/kubelet/pods/015080db-0f7e-4145-a70f-f479abafb674/volumes" Nov 22 03:42:10 crc kubenswrapper[4922]: I1122 03:42:10.302211 4922 scope.go:117] "RemoveContainer" containerID="3902a53e42446835819551f58e9b8e408324dfa52a1818359fa0616ad95080dc" Nov 22 03:42:10 crc kubenswrapper[4922]: E1122 03:42:10.303300 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:42:23 crc kubenswrapper[4922]: I1122 03:42:23.301049 4922 scope.go:117] "RemoveContainer" containerID="3902a53e42446835819551f58e9b8e408324dfa52a1818359fa0616ad95080dc" Nov 22 03:42:23 crc kubenswrapper[4922]: E1122 03:42:23.301912 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:42:32 crc kubenswrapper[4922]: I1122 03:42:32.332813 4922 generic.go:334] "Generic (PLEG): container finished" podID="451dc20b-5cce-4f72-821b-f08403bed351" containerID="a88f7c7d031db3f6b2fa4e66d41191dc8609c7e6461a475edd5bd61e7cd8cf0c" exitCode=0 Nov 22 03:42:32 crc kubenswrapper[4922]: I1122 03:42:32.332879 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg" event={"ID":"451dc20b-5cce-4f72-821b-f08403bed351","Type":"ContainerDied","Data":"a88f7c7d031db3f6b2fa4e66d41191dc8609c7e6461a475edd5bd61e7cd8cf0c"} Nov 22 03:42:33 crc kubenswrapper[4922]: I1122 03:42:33.822760 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg" Nov 22 03:42:33 crc kubenswrapper[4922]: I1122 03:42:33.968960 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/451dc20b-5cce-4f72-821b-f08403bed351-inventory\") pod \"451dc20b-5cce-4f72-821b-f08403bed351\" (UID: \"451dc20b-5cce-4f72-821b-f08403bed351\") " Nov 22 03:42:33 crc kubenswrapper[4922]: I1122 03:42:33.969305 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/451dc20b-5cce-4f72-821b-f08403bed351-nova-migration-ssh-key-0\") pod \"451dc20b-5cce-4f72-821b-f08403bed351\" (UID: \"451dc20b-5cce-4f72-821b-f08403bed351\") " Nov 22 03:42:33 crc kubenswrapper[4922]: I1122 03:42:33.969440 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/451dc20b-5cce-4f72-821b-f08403bed351-ceph-nova-0\") pod \"451dc20b-5cce-4f72-821b-f08403bed351\" (UID: \"451dc20b-5cce-4f72-821b-f08403bed351\") " Nov 22 03:42:33 crc kubenswrapper[4922]: I1122 03:42:33.969551 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/451dc20b-5cce-4f72-821b-f08403bed351-nova-extra-config-0\") pod \"451dc20b-5cce-4f72-821b-f08403bed351\" (UID: \"451dc20b-5cce-4f72-821b-f08403bed351\") " Nov 22 03:42:33 crc kubenswrapper[4922]: I1122 03:42:33.969676 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2kgg\" (UniqueName: \"kubernetes.io/projected/451dc20b-5cce-4f72-821b-f08403bed351-kube-api-access-m2kgg\") pod \"451dc20b-5cce-4f72-821b-f08403bed351\" (UID: \"451dc20b-5cce-4f72-821b-f08403bed351\") " Nov 22 03:42:33 crc kubenswrapper[4922]: I1122 03:42:33.969787 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/451dc20b-5cce-4f72-821b-f08403bed351-nova-custom-ceph-combined-ca-bundle\") pod \"451dc20b-5cce-4f72-821b-f08403bed351\" (UID: \"451dc20b-5cce-4f72-821b-f08403bed351\") " Nov 22 03:42:33 crc kubenswrapper[4922]: I1122 03:42:33.970054 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/451dc20b-5cce-4f72-821b-f08403bed351-nova-cell1-compute-config-1\") pod \"451dc20b-5cce-4f72-821b-f08403bed351\" (UID: \"451dc20b-5cce-4f72-821b-f08403bed351\") " Nov 22 03:42:33 crc kubenswrapper[4922]: I1122 03:42:33.970293 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/451dc20b-5cce-4f72-821b-f08403bed351-ceph\") pod \"451dc20b-5cce-4f72-821b-f08403bed351\" (UID: \"451dc20b-5cce-4f72-821b-f08403bed351\") " Nov 22 03:42:33 crc kubenswrapper[4922]: I1122 03:42:33.970428 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/451dc20b-5cce-4f72-821b-f08403bed351-nova-migration-ssh-key-1\") pod \"451dc20b-5cce-4f72-821b-f08403bed351\" (UID: \"451dc20b-5cce-4f72-821b-f08403bed351\") " Nov 22 03:42:33 crc kubenswrapper[4922]: I1122 03:42:33.970542 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/451dc20b-5cce-4f72-821b-f08403bed351-ssh-key\") pod \"451dc20b-5cce-4f72-821b-f08403bed351\" (UID: \"451dc20b-5cce-4f72-821b-f08403bed351\") " Nov 22 03:42:33 crc kubenswrapper[4922]: I1122 03:42:33.970668 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/451dc20b-5cce-4f72-821b-f08403bed351-nova-cell1-compute-config-0\") pod \"451dc20b-5cce-4f72-821b-f08403bed351\" (UID: \"451dc20b-5cce-4f72-821b-f08403bed351\") " Nov 22 03:42:33 crc kubenswrapper[4922]: I1122 03:42:33.975943 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/451dc20b-5cce-4f72-821b-f08403bed351-nova-custom-ceph-combined-ca-bundle" (OuterVolumeSpecName: "nova-custom-ceph-combined-ca-bundle") pod "451dc20b-5cce-4f72-821b-f08403bed351" (UID: "451dc20b-5cce-4f72-821b-f08403bed351"). InnerVolumeSpecName "nova-custom-ceph-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:42:33 crc kubenswrapper[4922]: I1122 03:42:33.976155 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/451dc20b-5cce-4f72-821b-f08403bed351-kube-api-access-m2kgg" (OuterVolumeSpecName: "kube-api-access-m2kgg") pod "451dc20b-5cce-4f72-821b-f08403bed351" (UID: "451dc20b-5cce-4f72-821b-f08403bed351"). InnerVolumeSpecName "kube-api-access-m2kgg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:42:33 crc kubenswrapper[4922]: I1122 03:42:33.980939 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/451dc20b-5cce-4f72-821b-f08403bed351-ceph" (OuterVolumeSpecName: "ceph") pod "451dc20b-5cce-4f72-821b-f08403bed351" (UID: "451dc20b-5cce-4f72-821b-f08403bed351"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:42:33 crc kubenswrapper[4922]: I1122 03:42:33.997105 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/451dc20b-5cce-4f72-821b-f08403bed351-ceph-nova-0" (OuterVolumeSpecName: "ceph-nova-0") pod "451dc20b-5cce-4f72-821b-f08403bed351" (UID: "451dc20b-5cce-4f72-821b-f08403bed351"). InnerVolumeSpecName "ceph-nova-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:42:34 crc kubenswrapper[4922]: I1122 03:42:34.001613 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/451dc20b-5cce-4f72-821b-f08403bed351-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "451dc20b-5cce-4f72-821b-f08403bed351" (UID: "451dc20b-5cce-4f72-821b-f08403bed351"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:42:34 crc kubenswrapper[4922]: I1122 03:42:34.002641 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/451dc20b-5cce-4f72-821b-f08403bed351-inventory" (OuterVolumeSpecName: "inventory") pod "451dc20b-5cce-4f72-821b-f08403bed351" (UID: "451dc20b-5cce-4f72-821b-f08403bed351"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:42:34 crc kubenswrapper[4922]: I1122 03:42:34.013142 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/451dc20b-5cce-4f72-821b-f08403bed351-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "451dc20b-5cce-4f72-821b-f08403bed351" (UID: "451dc20b-5cce-4f72-821b-f08403bed351"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:42:34 crc kubenswrapper[4922]: I1122 03:42:34.013928 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/451dc20b-5cce-4f72-821b-f08403bed351-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "451dc20b-5cce-4f72-821b-f08403bed351" (UID: "451dc20b-5cce-4f72-821b-f08403bed351"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:42:34 crc kubenswrapper[4922]: I1122 03:42:34.014517 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/451dc20b-5cce-4f72-821b-f08403bed351-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "451dc20b-5cce-4f72-821b-f08403bed351" (UID: "451dc20b-5cce-4f72-821b-f08403bed351"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:42:34 crc kubenswrapper[4922]: I1122 03:42:34.018185 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/451dc20b-5cce-4f72-821b-f08403bed351-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "451dc20b-5cce-4f72-821b-f08403bed351" (UID: "451dc20b-5cce-4f72-821b-f08403bed351"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:42:34 crc kubenswrapper[4922]: I1122 03:42:34.029813 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/451dc20b-5cce-4f72-821b-f08403bed351-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "451dc20b-5cce-4f72-821b-f08403bed351" (UID: "451dc20b-5cce-4f72-821b-f08403bed351"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:42:34 crc kubenswrapper[4922]: I1122 03:42:34.073587 4922 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/451dc20b-5cce-4f72-821b-f08403bed351-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 03:42:34 crc kubenswrapper[4922]: I1122 03:42:34.073623 4922 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/451dc20b-5cce-4f72-821b-f08403bed351-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Nov 22 03:42:34 crc kubenswrapper[4922]: I1122 03:42:34.073633 4922 reconciler_common.go:293] "Volume detached for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/451dc20b-5cce-4f72-821b-f08403bed351-ceph-nova-0\") on node \"crc\" DevicePath \"\"" Nov 22 03:42:34 crc kubenswrapper[4922]: I1122 03:42:34.073641 4922 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/451dc20b-5cce-4f72-821b-f08403bed351-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Nov 22 03:42:34 crc kubenswrapper[4922]: I1122 03:42:34.073650 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2kgg\" (UniqueName: \"kubernetes.io/projected/451dc20b-5cce-4f72-821b-f08403bed351-kube-api-access-m2kgg\") on node \"crc\" DevicePath \"\"" Nov 22 03:42:34 crc kubenswrapper[4922]: I1122 03:42:34.073661 4922 reconciler_common.go:293] "Volume detached for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/451dc20b-5cce-4f72-821b-f08403bed351-nova-custom-ceph-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:42:34 crc kubenswrapper[4922]: I1122 03:42:34.073672 4922 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/451dc20b-5cce-4f72-821b-f08403bed351-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Nov 22 03:42:34 crc kubenswrapper[4922]: I1122 03:42:34.073681 4922 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/451dc20b-5cce-4f72-821b-f08403bed351-ceph\") on node \"crc\" DevicePath \"\"" Nov 22 03:42:34 crc kubenswrapper[4922]: I1122 03:42:34.073689 4922 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/451dc20b-5cce-4f72-821b-f08403bed351-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Nov 22 03:42:34 crc kubenswrapper[4922]: I1122 03:42:34.073696 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/451dc20b-5cce-4f72-821b-f08403bed351-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 03:42:34 crc kubenswrapper[4922]: I1122 03:42:34.073704 4922 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/451dc20b-5cce-4f72-821b-f08403bed351-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Nov 22 03:42:34 crc kubenswrapper[4922]: I1122 03:42:34.354903 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg" event={"ID":"451dc20b-5cce-4f72-821b-f08403bed351","Type":"ContainerDied","Data":"9121746db3850ce2fd015ae5ae08f72a4f1fc6c3c9555f97a8613e627e0ad36e"} Nov 22 03:42:34 crc kubenswrapper[4922]: I1122 03:42:34.355500 4922 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="9121746db3850ce2fd015ae5ae08f72a4f1fc6c3c9555f97a8613e627e0ad36e" Nov 22 03:42:34 crc kubenswrapper[4922]: I1122 03:42:34.355008 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg" Nov 22 03:42:35 crc kubenswrapper[4922]: I1122 03:42:35.308594 4922 scope.go:117] "RemoveContainer" containerID="3902a53e42446835819551f58e9b8e408324dfa52a1818359fa0616ad95080dc" Nov 22 03:42:35 crc kubenswrapper[4922]: E1122 03:42:35.310479 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:42:46 crc kubenswrapper[4922]: I1122 03:42:46.301560 4922 scope.go:117] "RemoveContainer" containerID="3902a53e42446835819551f58e9b8e408324dfa52a1818359fa0616ad95080dc" Nov 22 03:42:46 crc kubenswrapper[4922]: E1122 03:42:46.302739 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.592158 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Nov 22 03:42:49 crc kubenswrapper[4922]: E1122 03:42:49.593004 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="015080db-0f7e-4145-a70f-f479abafb674" containerName="registry-server" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.593022 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="015080db-0f7e-4145-a70f-f479abafb674" containerName="registry-server" Nov 22 03:42:49 crc kubenswrapper[4922]: E1122 03:42:49.593051 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="451dc20b-5cce-4f72-821b-f08403bed351" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.593061 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="451dc20b-5cce-4f72-821b-f08403bed351" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Nov 22 03:42:49 crc kubenswrapper[4922]: E1122 03:42:49.593082 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="015080db-0f7e-4145-a70f-f479abafb674" containerName="extract-content" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.593090 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="015080db-0f7e-4145-a70f-f479abafb674" containerName="extract-content" Nov 22 03:42:49 crc kubenswrapper[4922]: E1122 03:42:49.593123 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="015080db-0f7e-4145-a70f-f479abafb674" containerName="extract-utilities" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.593130 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="015080db-0f7e-4145-a70f-f479abafb674" containerName="extract-utilities" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.593337 4922 
memory_manager.go:354] "RemoveStaleState removing state" podUID="451dc20b-5cce-4f72-821b-f08403bed351" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam"
Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.593370 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="015080db-0f7e-4145-a70f-f479abafb674" containerName="registry-server"
Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.594612 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0"
Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.596549 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data"
Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.598639 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"]
Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.600083 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0"
Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.602089 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.602093 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data"
Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.619625 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"]
Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.628249 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"]
Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.685151 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/743b2d95-20b8-4677-a1fd-6a5eb808628d-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"743b2d95-20b8-4677-a1fd-6a5eb808628d\") " pod="openstack/cinder-volume-volume1-0"
Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.685202 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/743b2d95-20b8-4677-a1fd-6a5eb808628d-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"743b2d95-20b8-4677-a1fd-6a5eb808628d\") " pod="openstack/cinder-volume-volume1-0"
Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.685223 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5\") " pod="openstack/cinder-backup-0"
Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.685267 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5\") " pod="openstack/cinder-backup-0"
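The cinder-backup-0 and cinder-volume-volume1-0 attach listing around this point mixes three volume sources: secrets (config-data, scripts, the CA bundles), projected volumes (the ceph bundle and the service-account tokens), and plain host paths (dev, sys, run, etc-iscsi, lib-modules, and friends). The plugin is encoded in each UniqueName as kubernetes.io/<plugin>/<pod-uid>-<volume-name>, so the listing can be summarized mechanically; a sketch under that assumption (helper names are mine):

```python
# Sketch: summarize an attach/mount listing like the one around this
# point by volume plugin, relying only on the UniqueName shape in these
# entries: "kubernetes.io/<plugin>/<pod-uid>-<volume-name>".
import re
from collections import Counter

UNIQUE = re.compile(r'UniqueName: \\?"(kubernetes\.io/[a-z-]+)/')

def plugin_histogram(lines):
    """Count volumes per plugin (secret, projected, host-path, ...)."""
    counts = Counter()
    for line in lines:
        counts.update(m.group(1) for m in UNIQUE.finditer(line))
    return counts
```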
(UID: \"c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5\") " pod="openstack/cinder-backup-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.685334 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5-ceph\") pod \"cinder-backup-0\" (UID: \"c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5\") " pod="openstack/cinder-backup-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.685358 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5-lib-modules\") pod \"cinder-backup-0\" (UID: \"c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5\") " pod="openstack/cinder-backup-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.685373 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/743b2d95-20b8-4677-a1fd-6a5eb808628d-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"743b2d95-20b8-4677-a1fd-6a5eb808628d\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.685393 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/743b2d95-20b8-4677-a1fd-6a5eb808628d-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"743b2d95-20b8-4677-a1fd-6a5eb808628d\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.685409 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/743b2d95-20b8-4677-a1fd-6a5eb808628d-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"743b2d95-20b8-4677-a1fd-6a5eb808628d\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.685539 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/743b2d95-20b8-4677-a1fd-6a5eb808628d-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"743b2d95-20b8-4677-a1fd-6a5eb808628d\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.685579 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5-run\") pod \"cinder-backup-0\" (UID: \"c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5\") " pod="openstack/cinder-backup-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.685723 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/743b2d95-20b8-4677-a1fd-6a5eb808628d-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"743b2d95-20b8-4677-a1fd-6a5eb808628d\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.685779 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/743b2d95-20b8-4677-a1fd-6a5eb808628d-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"743b2d95-20b8-4677-a1fd-6a5eb808628d\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:42:49 crc kubenswrapper[4922]: 
I1122 03:42:49.685870 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5\") " pod="openstack/cinder-backup-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.685916 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5-config-data-custom\") pod \"cinder-backup-0\" (UID: \"c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5\") " pod="openstack/cinder-backup-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.685935 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5\") " pod="openstack/cinder-backup-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.685960 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/743b2d95-20b8-4677-a1fd-6a5eb808628d-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"743b2d95-20b8-4677-a1fd-6a5eb808628d\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.686011 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5\") " pod="openstack/cinder-backup-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.686038 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5-sys\") pod \"cinder-backup-0\" (UID: \"c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5\") " pod="openstack/cinder-backup-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.686081 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5\") " pod="openstack/cinder-backup-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.686115 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/743b2d95-20b8-4677-a1fd-6a5eb808628d-run\") pod \"cinder-volume-volume1-0\" (UID: \"743b2d95-20b8-4677-a1fd-6a5eb808628d\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.686137 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vp2bc\" (UniqueName: \"kubernetes.io/projected/743b2d95-20b8-4677-a1fd-6a5eb808628d-kube-api-access-vp2bc\") pod \"cinder-volume-volume1-0\" (UID: \"743b2d95-20b8-4677-a1fd-6a5eb808628d\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.686160 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/743b2d95-20b8-4677-a1fd-6a5eb808628d-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"743b2d95-20b8-4677-a1fd-6a5eb808628d\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.686263 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5-etc-nvme\") pod \"cinder-backup-0\" (UID: \"c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5\") " pod="openstack/cinder-backup-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.686307 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/743b2d95-20b8-4677-a1fd-6a5eb808628d-sys\") pod \"cinder-volume-volume1-0\" (UID: \"743b2d95-20b8-4677-a1fd-6a5eb808628d\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.686355 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5-config-data\") pod \"cinder-backup-0\" (UID: \"c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5\") " pod="openstack/cinder-backup-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.686401 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/743b2d95-20b8-4677-a1fd-6a5eb808628d-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"743b2d95-20b8-4677-a1fd-6a5eb808628d\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.686454 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snvgc\" (UniqueName: \"kubernetes.io/projected/c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5-kube-api-access-snvgc\") pod \"cinder-backup-0\" (UID: \"c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5\") " pod="openstack/cinder-backup-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.686488 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5-scripts\") pod \"cinder-backup-0\" (UID: \"c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5\") " pod="openstack/cinder-backup-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.686534 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/743b2d95-20b8-4677-a1fd-6a5eb808628d-dev\") pod \"cinder-volume-volume1-0\" (UID: \"743b2d95-20b8-4677-a1fd-6a5eb808628d\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.686570 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/743b2d95-20b8-4677-a1fd-6a5eb808628d-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"743b2d95-20b8-4677-a1fd-6a5eb808628d\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.787933 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5-etc-iscsi\") pod 
\"cinder-backup-0\" (UID: \"c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5\") " pod="openstack/cinder-backup-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.787978 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5-config-data-custom\") pod \"cinder-backup-0\" (UID: \"c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5\") " pod="openstack/cinder-backup-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.788008 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/743b2d95-20b8-4677-a1fd-6a5eb808628d-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"743b2d95-20b8-4677-a1fd-6a5eb808628d\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.788040 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5\") " pod="openstack/cinder-backup-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.788061 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5-sys\") pod \"cinder-backup-0\" (UID: \"c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5\") " pod="openstack/cinder-backup-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.788058 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5\") " pod="openstack/cinder-backup-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.788089 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5\") " pod="openstack/cinder-backup-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.788114 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5\") " pod="openstack/cinder-backup-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.788122 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/743b2d95-20b8-4677-a1fd-6a5eb808628d-run\") pod \"cinder-volume-volume1-0\" (UID: \"743b2d95-20b8-4677-a1fd-6a5eb808628d\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.788164 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/743b2d95-20b8-4677-a1fd-6a5eb808628d-run\") pod \"cinder-volume-volume1-0\" (UID: \"743b2d95-20b8-4677-a1fd-6a5eb808628d\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.788164 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vp2bc\" (UniqueName: 
\"kubernetes.io/projected/743b2d95-20b8-4677-a1fd-6a5eb808628d-kube-api-access-vp2bc\") pod \"cinder-volume-volume1-0\" (UID: \"743b2d95-20b8-4677-a1fd-6a5eb808628d\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.788195 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5-sys\") pod \"cinder-backup-0\" (UID: \"c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5\") " pod="openstack/cinder-backup-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.788212 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/743b2d95-20b8-4677-a1fd-6a5eb808628d-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"743b2d95-20b8-4677-a1fd-6a5eb808628d\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.788255 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5-etc-nvme\") pod \"cinder-backup-0\" (UID: \"c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5\") " pod="openstack/cinder-backup-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.788276 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/743b2d95-20b8-4677-a1fd-6a5eb808628d-sys\") pod \"cinder-volume-volume1-0\" (UID: \"743b2d95-20b8-4677-a1fd-6a5eb808628d\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.788284 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/743b2d95-20b8-4677-a1fd-6a5eb808628d-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"743b2d95-20b8-4677-a1fd-6a5eb808628d\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.788352 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/743b2d95-20b8-4677-a1fd-6a5eb808628d-sys\") pod \"cinder-volume-volume1-0\" (UID: \"743b2d95-20b8-4677-a1fd-6a5eb808628d\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.788343 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5\") " pod="openstack/cinder-backup-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.788300 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5-config-data\") pod \"cinder-backup-0\" (UID: \"c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5\") " pod="openstack/cinder-backup-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.788383 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5-etc-nvme\") pod \"cinder-backup-0\" (UID: \"c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5\") " pod="openstack/cinder-backup-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.788452 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/743b2d95-20b8-4677-a1fd-6a5eb808628d-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"743b2d95-20b8-4677-a1fd-6a5eb808628d\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.788552 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snvgc\" (UniqueName: \"kubernetes.io/projected/c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5-kube-api-access-snvgc\") pod \"cinder-backup-0\" (UID: \"c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5\") " pod="openstack/cinder-backup-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.788656 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5-scripts\") pod \"cinder-backup-0\" (UID: \"c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5\") " pod="openstack/cinder-backup-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.788562 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/743b2d95-20b8-4677-a1fd-6a5eb808628d-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"743b2d95-20b8-4677-a1fd-6a5eb808628d\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.788762 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/743b2d95-20b8-4677-a1fd-6a5eb808628d-dev\") pod \"cinder-volume-volume1-0\" (UID: \"743b2d95-20b8-4677-a1fd-6a5eb808628d\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.788809 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/743b2d95-20b8-4677-a1fd-6a5eb808628d-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"743b2d95-20b8-4677-a1fd-6a5eb808628d\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.788863 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/743b2d95-20b8-4677-a1fd-6a5eb808628d-dev\") pod \"cinder-volume-volume1-0\" (UID: \"743b2d95-20b8-4677-a1fd-6a5eb808628d\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.788902 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/743b2d95-20b8-4677-a1fd-6a5eb808628d-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"743b2d95-20b8-4677-a1fd-6a5eb808628d\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.788953 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/743b2d95-20b8-4677-a1fd-6a5eb808628d-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"743b2d95-20b8-4677-a1fd-6a5eb808628d\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.789004 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/743b2d95-20b8-4677-a1fd-6a5eb808628d-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"743b2d95-20b8-4677-a1fd-6a5eb808628d\") " pod="openstack/cinder-volume-volume1-0" Nov 22 
03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.789046 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5\") " pod="openstack/cinder-backup-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.789086 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5\") " pod="openstack/cinder-backup-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.789154 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5-dev\") pod \"cinder-backup-0\" (UID: \"c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5\") " pod="openstack/cinder-backup-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.789201 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5-ceph\") pod \"cinder-backup-0\" (UID: \"c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5\") " pod="openstack/cinder-backup-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.789263 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5-lib-modules\") pod \"cinder-backup-0\" (UID: \"c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5\") " pod="openstack/cinder-backup-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.789299 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/743b2d95-20b8-4677-a1fd-6a5eb808628d-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"743b2d95-20b8-4677-a1fd-6a5eb808628d\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.789350 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/743b2d95-20b8-4677-a1fd-6a5eb808628d-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"743b2d95-20b8-4677-a1fd-6a5eb808628d\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.789389 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/743b2d95-20b8-4677-a1fd-6a5eb808628d-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"743b2d95-20b8-4677-a1fd-6a5eb808628d\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.789443 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/743b2d95-20b8-4677-a1fd-6a5eb808628d-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"743b2d95-20b8-4677-a1fd-6a5eb808628d\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.789478 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5-run\") pod \"cinder-backup-0\" (UID: \"c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5\") " 
pod="openstack/cinder-backup-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.789551 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/743b2d95-20b8-4677-a1fd-6a5eb808628d-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"743b2d95-20b8-4677-a1fd-6a5eb808628d\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.789609 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/743b2d95-20b8-4677-a1fd-6a5eb808628d-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"743b2d95-20b8-4677-a1fd-6a5eb808628d\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.789675 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5\") " pod="openstack/cinder-backup-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.789900 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5\") " pod="openstack/cinder-backup-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.789970 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5-dev\") pod \"cinder-backup-0\" (UID: \"c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5\") " pod="openstack/cinder-backup-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.790964 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/743b2d95-20b8-4677-a1fd-6a5eb808628d-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"743b2d95-20b8-4677-a1fd-6a5eb808628d\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.791016 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5-lib-modules\") pod \"cinder-backup-0\" (UID: \"c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5\") " pod="openstack/cinder-backup-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.792323 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/743b2d95-20b8-4677-a1fd-6a5eb808628d-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"743b2d95-20b8-4677-a1fd-6a5eb808628d\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.793971 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/743b2d95-20b8-4677-a1fd-6a5eb808628d-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"743b2d95-20b8-4677-a1fd-6a5eb808628d\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.794025 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/743b2d95-20b8-4677-a1fd-6a5eb808628d-etc-machine-id\") pod 
\"cinder-volume-volume1-0\" (UID: \"743b2d95-20b8-4677-a1fd-6a5eb808628d\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.794013 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5-run\") pod \"cinder-backup-0\" (UID: \"c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5\") " pod="openstack/cinder-backup-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.794162 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5\") " pod="openstack/cinder-backup-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.794414 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/743b2d95-20b8-4677-a1fd-6a5eb808628d-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"743b2d95-20b8-4677-a1fd-6a5eb808628d\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.795253 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5-scripts\") pod \"cinder-backup-0\" (UID: \"c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5\") " pod="openstack/cinder-backup-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.796423 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5\") " pod="openstack/cinder-backup-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.796588 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5-ceph\") pod \"cinder-backup-0\" (UID: \"c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5\") " pod="openstack/cinder-backup-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.797244 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/743b2d95-20b8-4677-a1fd-6a5eb808628d-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"743b2d95-20b8-4677-a1fd-6a5eb808628d\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.797359 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/743b2d95-20b8-4677-a1fd-6a5eb808628d-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"743b2d95-20b8-4677-a1fd-6a5eb808628d\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.799643 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5-config-data\") pod \"cinder-backup-0\" (UID: \"c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5\") " pod="openstack/cinder-backup-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.810203 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/743b2d95-20b8-4677-a1fd-6a5eb808628d-config-data-custom\") pod 
\"cinder-volume-volume1-0\" (UID: \"743b2d95-20b8-4677-a1fd-6a5eb808628d\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.814592 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5-config-data-custom\") pod \"cinder-backup-0\" (UID: \"c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5\") " pod="openstack/cinder-backup-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.821297 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snvgc\" (UniqueName: \"kubernetes.io/projected/c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5-kube-api-access-snvgc\") pod \"cinder-backup-0\" (UID: \"c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5\") " pod="openstack/cinder-backup-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.827678 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vp2bc\" (UniqueName: \"kubernetes.io/projected/743b2d95-20b8-4677-a1fd-6a5eb808628d-kube-api-access-vp2bc\") pod \"cinder-volume-volume1-0\" (UID: \"743b2d95-20b8-4677-a1fd-6a5eb808628d\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.830006 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/743b2d95-20b8-4677-a1fd-6a5eb808628d-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"743b2d95-20b8-4677-a1fd-6a5eb808628d\") " pod="openstack/cinder-volume-volume1-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.917897 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Nov 22 03:42:49 crc kubenswrapper[4922]: I1122 03:42:49.926405 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.098162 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-gtphg"] Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.099653 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-gtphg" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.120829 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-gtphg"] Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.198691 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6t4nj\" (UniqueName: \"kubernetes.io/projected/c570feea-6723-4626-a849-fc67db11ee3e-kube-api-access-6t4nj\") pod \"manila-db-create-gtphg\" (UID: \"c570feea-6723-4626-a849-fc67db11ee3e\") " pod="openstack/manila-db-create-gtphg" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.235085 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-575b764d7f-w8r6x"] Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.236428 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-575b764d7f-w8r6x" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.248263 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-crsx8" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.248478 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.249036 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.249271 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.266354 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-575b764d7f-w8r6x"] Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.301265 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d8cf55ca-4cba-4931-a9e6-021fa6e53669-horizon-secret-key\") pod \"horizon-575b764d7f-w8r6x\" (UID: \"d8cf55ca-4cba-4931-a9e6-021fa6e53669\") " pod="openstack/horizon-575b764d7f-w8r6x" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.301317 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d8cf55ca-4cba-4931-a9e6-021fa6e53669-config-data\") pod \"horizon-575b764d7f-w8r6x\" (UID: \"d8cf55ca-4cba-4931-a9e6-021fa6e53669\") " pod="openstack/horizon-575b764d7f-w8r6x" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.301346 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5x9fj\" (UniqueName: \"kubernetes.io/projected/d8cf55ca-4cba-4931-a9e6-021fa6e53669-kube-api-access-5x9fj\") pod \"horizon-575b764d7f-w8r6x\" (UID: \"d8cf55ca-4cba-4931-a9e6-021fa6e53669\") " pod="openstack/horizon-575b764d7f-w8r6x" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.301378 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8cf55ca-4cba-4931-a9e6-021fa6e53669-logs\") pod \"horizon-575b764d7f-w8r6x\" (UID: \"d8cf55ca-4cba-4931-a9e6-021fa6e53669\") " pod="openstack/horizon-575b764d7f-w8r6x" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.301412 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d8cf55ca-4cba-4931-a9e6-021fa6e53669-scripts\") pod \"horizon-575b764d7f-w8r6x\" (UID: \"d8cf55ca-4cba-4931-a9e6-021fa6e53669\") " pod="openstack/horizon-575b764d7f-w8r6x" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.301461 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6t4nj\" (UniqueName: \"kubernetes.io/projected/c570feea-6723-4626-a849-fc67db11ee3e-kube-api-access-6t4nj\") pod \"manila-db-create-gtphg\" (UID: \"c570feea-6723-4626-a849-fc67db11ee3e\") " pod="openstack/manila-db-create-gtphg" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.323588 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.327899 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.331488 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.331678 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-mw8pk" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.332237 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.332442 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.332561 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6t4nj\" (UniqueName: \"kubernetes.io/projected/c570feea-6723-4626-a849-fc67db11ee3e-kube-api-access-6t4nj\") pod \"manila-db-create-gtphg\" (UID: \"c570feea-6723-4626-a849-fc67db11ee3e\") " pod="openstack/manila-db-create-gtphg" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.350706 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.406543 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d8cf55ca-4cba-4931-a9e6-021fa6e53669-horizon-secret-key\") pod \"horizon-575b764d7f-w8r6x\" (UID: \"d8cf55ca-4cba-4931-a9e6-021fa6e53669\") " pod="openstack/horizon-575b764d7f-w8r6x" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.406656 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d8cf55ca-4cba-4931-a9e6-021fa6e53669-config-data\") pod \"horizon-575b764d7f-w8r6x\" (UID: \"d8cf55ca-4cba-4931-a9e6-021fa6e53669\") " pod="openstack/horizon-575b764d7f-w8r6x" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.406752 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/580b343c-5374-4eec-b414-cbe371f500a3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"580b343c-5374-4eec-b414-cbe371f500a3\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.406812 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5x9fj\" (UniqueName: \"kubernetes.io/projected/d8cf55ca-4cba-4931-a9e6-021fa6e53669-kube-api-access-5x9fj\") pod \"horizon-575b764d7f-w8r6x\" (UID: \"d8cf55ca-4cba-4931-a9e6-021fa6e53669\") " pod="openstack/horizon-575b764d7f-w8r6x" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.407010 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8cf55ca-4cba-4931-a9e6-021fa6e53669-logs\") pod \"horizon-575b764d7f-w8r6x\" (UID: \"d8cf55ca-4cba-4931-a9e6-021fa6e53669\") " pod="openstack/horizon-575b764d7f-w8r6x" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.407146 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/580b343c-5374-4eec-b414-cbe371f500a3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"580b343c-5374-4eec-b414-cbe371f500a3\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.407239 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/580b343c-5374-4eec-b414-cbe371f500a3-ceph\") pod \"glance-default-internal-api-0\" (UID: \"580b343c-5374-4eec-b414-cbe371f500a3\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.407301 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbdcf\" (UniqueName: \"kubernetes.io/projected/580b343c-5374-4eec-b414-cbe371f500a3-kube-api-access-jbdcf\") pod \"glance-default-internal-api-0\" (UID: \"580b343c-5374-4eec-b414-cbe371f500a3\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.407355 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d8cf55ca-4cba-4931-a9e6-021fa6e53669-scripts\") pod \"horizon-575b764d7f-w8r6x\" (UID: \"d8cf55ca-4cba-4931-a9e6-021fa6e53669\") " pod="openstack/horizon-575b764d7f-w8r6x" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.407423 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/580b343c-5374-4eec-b414-cbe371f500a3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"580b343c-5374-4eec-b414-cbe371f500a3\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.407520 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"580b343c-5374-4eec-b414-cbe371f500a3\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.407577 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/580b343c-5374-4eec-b414-cbe371f500a3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"580b343c-5374-4eec-b414-cbe371f500a3\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.407644 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/580b343c-5374-4eec-b414-cbe371f500a3-logs\") pod \"glance-default-internal-api-0\" (UID: \"580b343c-5374-4eec-b414-cbe371f500a3\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.407776 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/580b343c-5374-4eec-b414-cbe371f500a3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"580b343c-5374-4eec-b414-cbe371f500a3\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.424699 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d8cf55ca-4cba-4931-a9e6-021fa6e53669-horizon-secret-key\") pod \"horizon-575b764d7f-w8r6x\" (UID: 
\"d8cf55ca-4cba-4931-a9e6-021fa6e53669\") " pod="openstack/horizon-575b764d7f-w8r6x" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.425310 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d8cf55ca-4cba-4931-a9e6-021fa6e53669-scripts\") pod \"horizon-575b764d7f-w8r6x\" (UID: \"d8cf55ca-4cba-4931-a9e6-021fa6e53669\") " pod="openstack/horizon-575b764d7f-w8r6x" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.425501 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d8cf55ca-4cba-4931-a9e6-021fa6e53669-config-data\") pod \"horizon-575b764d7f-w8r6x\" (UID: \"d8cf55ca-4cba-4931-a9e6-021fa6e53669\") " pod="openstack/horizon-575b764d7f-w8r6x" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.425758 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8cf55ca-4cba-4931-a9e6-021fa6e53669-logs\") pod \"horizon-575b764d7f-w8r6x\" (UID: \"d8cf55ca-4cba-4931-a9e6-021fa6e53669\") " pod="openstack/horizon-575b764d7f-w8r6x" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.434639 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-gtphg" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.454332 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5x9fj\" (UniqueName: \"kubernetes.io/projected/d8cf55ca-4cba-4931-a9e6-021fa6e53669-kube-api-access-5x9fj\") pod \"horizon-575b764d7f-w8r6x\" (UID: \"d8cf55ca-4cba-4931-a9e6-021fa6e53669\") " pod="openstack/horizon-575b764d7f-w8r6x" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.467186 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7b987c6977-c7m5n"] Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.494448 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7b987c6977-c7m5n"] Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.494777 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7b987c6977-c7m5n" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.508312 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.510144 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.517816 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.518000 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.519582 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/580b343c-5374-4eec-b414-cbe371f500a3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"580b343c-5374-4eec-b414-cbe371f500a3\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.519623 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/580b343c-5374-4eec-b414-cbe371f500a3-logs\") pod \"glance-default-internal-api-0\" (UID: \"580b343c-5374-4eec-b414-cbe371f500a3\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.519667 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/580b343c-5374-4eec-b414-cbe371f500a3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"580b343c-5374-4eec-b414-cbe371f500a3\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.519721 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/580b343c-5374-4eec-b414-cbe371f500a3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"580b343c-5374-4eec-b414-cbe371f500a3\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.519761 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/580b343c-5374-4eec-b414-cbe371f500a3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"580b343c-5374-4eec-b414-cbe371f500a3\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.519780 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/580b343c-5374-4eec-b414-cbe371f500a3-ceph\") pod \"glance-default-internal-api-0\" (UID: \"580b343c-5374-4eec-b414-cbe371f500a3\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.519795 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbdcf\" (UniqueName: \"kubernetes.io/projected/580b343c-5374-4eec-b414-cbe371f500a3-kube-api-access-jbdcf\") pod \"glance-default-internal-api-0\" (UID: \"580b343c-5374-4eec-b414-cbe371f500a3\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.519820 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/580b343c-5374-4eec-b414-cbe371f500a3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"580b343c-5374-4eec-b414-cbe371f500a3\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 
03:42:50.519863 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"580b343c-5374-4eec-b414-cbe371f500a3\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.520187 4922 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"580b343c-5374-4eec-b414-cbe371f500a3\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.521591 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/580b343c-5374-4eec-b414-cbe371f500a3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"580b343c-5374-4eec-b414-cbe371f500a3\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.522229 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/580b343c-5374-4eec-b414-cbe371f500a3-logs\") pod \"glance-default-internal-api-0\" (UID: \"580b343c-5374-4eec-b414-cbe371f500a3\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.525325 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/580b343c-5374-4eec-b414-cbe371f500a3-ceph\") pod \"glance-default-internal-api-0\" (UID: \"580b343c-5374-4eec-b414-cbe371f500a3\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.528783 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/580b343c-5374-4eec-b414-cbe371f500a3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"580b343c-5374-4eec-b414-cbe371f500a3\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.529418 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/580b343c-5374-4eec-b414-cbe371f500a3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"580b343c-5374-4eec-b414-cbe371f500a3\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.530539 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/580b343c-5374-4eec-b414-cbe371f500a3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"580b343c-5374-4eec-b414-cbe371f500a3\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.532296 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.534032 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/580b343c-5374-4eec-b414-cbe371f500a3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"580b343c-5374-4eec-b414-cbe371f500a3\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 
03:42:50.564325 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-575b764d7f-w8r6x" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.564893 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbdcf\" (UniqueName: \"kubernetes.io/projected/580b343c-5374-4eec-b414-cbe371f500a3-kube-api-access-jbdcf\") pod \"glance-default-internal-api-0\" (UID: \"580b343c-5374-4eec-b414-cbe371f500a3\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.574312 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"580b343c-5374-4eec-b414-cbe371f500a3\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.622212 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4c819b2-8950-4a27-9cd7-6bd6265477de-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b4c819b2-8950-4a27-9cd7-6bd6265477de\") " pod="openstack/glance-default-external-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.622323 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4c819b2-8950-4a27-9cd7-6bd6265477de-logs\") pod \"glance-default-external-api-0\" (UID: \"b4c819b2-8950-4a27-9cd7-6bd6265477de\") " pod="openstack/glance-default-external-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.622422 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b4287b67-a4f2-4574-b73f-2995595f2199-config-data\") pod \"horizon-7b987c6977-c7m5n\" (UID: \"b4287b67-a4f2-4574-b73f-2995595f2199\") " pod="openstack/horizon-7b987c6977-c7m5n" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.622449 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b4c819b2-8950-4a27-9cd7-6bd6265477de-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b4c819b2-8950-4a27-9cd7-6bd6265477de\") " pod="openstack/glance-default-external-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.622473 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4c819b2-8950-4a27-9cd7-6bd6265477de-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b4c819b2-8950-4a27-9cd7-6bd6265477de\") " pod="openstack/glance-default-external-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.622507 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4287b67-a4f2-4574-b73f-2995595f2199-logs\") pod \"horizon-7b987c6977-c7m5n\" (UID: \"b4287b67-a4f2-4574-b73f-2995595f2199\") " pod="openstack/horizon-7b987c6977-c7m5n" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.622532 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4c819b2-8950-4a27-9cd7-6bd6265477de-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"b4c819b2-8950-4a27-9cd7-6bd6265477de\") " pod="openstack/glance-default-external-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.622727 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9zsn\" (UniqueName: \"kubernetes.io/projected/b4c819b2-8950-4a27-9cd7-6bd6265477de-kube-api-access-j9zsn\") pod \"glance-default-external-api-0\" (UID: \"b4c819b2-8950-4a27-9cd7-6bd6265477de\") " pod="openstack/glance-default-external-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.622799 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/b4c819b2-8950-4a27-9cd7-6bd6265477de-ceph\") pod \"glance-default-external-api-0\" (UID: \"b4c819b2-8950-4a27-9cd7-6bd6265477de\") " pod="openstack/glance-default-external-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.622894 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"b4c819b2-8950-4a27-9cd7-6bd6265477de\") " pod="openstack/glance-default-external-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.623258 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx67b\" (UniqueName: \"kubernetes.io/projected/b4287b67-a4f2-4574-b73f-2995595f2199-kube-api-access-tx67b\") pod \"horizon-7b987c6977-c7m5n\" (UID: \"b4287b67-a4f2-4574-b73f-2995595f2199\") " pod="openstack/horizon-7b987c6977-c7m5n" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.623324 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4c819b2-8950-4a27-9cd7-6bd6265477de-scripts\") pod \"glance-default-external-api-0\" (UID: \"b4c819b2-8950-4a27-9cd7-6bd6265477de\") " pod="openstack/glance-default-external-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.623367 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4287b67-a4f2-4574-b73f-2995595f2199-scripts\") pod \"horizon-7b987c6977-c7m5n\" (UID: \"b4287b67-a4f2-4574-b73f-2995595f2199\") " pod="openstack/horizon-7b987c6977-c7m5n" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.623453 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b4287b67-a4f2-4574-b73f-2995595f2199-horizon-secret-key\") pod \"horizon-7b987c6977-c7m5n\" (UID: \"b4287b67-a4f2-4574-b73f-2995595f2199\") " pod="openstack/horizon-7b987c6977-c7m5n" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.634981 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.648782 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: W1122 03:42:50.724571 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod743b2d95_20b8_4677_a1fd_6a5eb808628d.slice/crio-3309b080fdc6d9763dc18ce9155698e31e8736101bc2f8d44716ea28264080a5 WatchSource:0}: Error finding container 3309b080fdc6d9763dc18ce9155698e31e8736101bc2f8d44716ea28264080a5: Status 404 returned error can't find the container with id 3309b080fdc6d9763dc18ce9155698e31e8736101bc2f8d44716ea28264080a5 Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.725450 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4c819b2-8950-4a27-9cd7-6bd6265477de-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b4c819b2-8950-4a27-9cd7-6bd6265477de\") " pod="openstack/glance-default-external-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.725508 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4c819b2-8950-4a27-9cd7-6bd6265477de-logs\") pod \"glance-default-external-api-0\" (UID: \"b4c819b2-8950-4a27-9cd7-6bd6265477de\") " pod="openstack/glance-default-external-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.725562 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b4287b67-a4f2-4574-b73f-2995595f2199-config-data\") pod \"horizon-7b987c6977-c7m5n\" (UID: \"b4287b67-a4f2-4574-b73f-2995595f2199\") " pod="openstack/horizon-7b987c6977-c7m5n" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.725583 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b4c819b2-8950-4a27-9cd7-6bd6265477de-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b4c819b2-8950-4a27-9cd7-6bd6265477de\") " pod="openstack/glance-default-external-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.725598 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4c819b2-8950-4a27-9cd7-6bd6265477de-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b4c819b2-8950-4a27-9cd7-6bd6265477de\") " pod="openstack/glance-default-external-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.725624 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4287b67-a4f2-4574-b73f-2995595f2199-logs\") pod \"horizon-7b987c6977-c7m5n\" (UID: \"b4287b67-a4f2-4574-b73f-2995595f2199\") " pod="openstack/horizon-7b987c6977-c7m5n" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.725639 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4c819b2-8950-4a27-9cd7-6bd6265477de-config-data\") pod \"glance-default-external-api-0\" (UID: \"b4c819b2-8950-4a27-9cd7-6bd6265477de\") " pod="openstack/glance-default-external-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.725672 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9zsn\" (UniqueName: \"kubernetes.io/projected/b4c819b2-8950-4a27-9cd7-6bd6265477de-kube-api-access-j9zsn\") pod 
\"glance-default-external-api-0\" (UID: \"b4c819b2-8950-4a27-9cd7-6bd6265477de\") " pod="openstack/glance-default-external-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.725695 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/b4c819b2-8950-4a27-9cd7-6bd6265477de-ceph\") pod \"glance-default-external-api-0\" (UID: \"b4c819b2-8950-4a27-9cd7-6bd6265477de\") " pod="openstack/glance-default-external-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.725711 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"b4c819b2-8950-4a27-9cd7-6bd6265477de\") " pod="openstack/glance-default-external-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.725742 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tx67b\" (UniqueName: \"kubernetes.io/projected/b4287b67-a4f2-4574-b73f-2995595f2199-kube-api-access-tx67b\") pod \"horizon-7b987c6977-c7m5n\" (UID: \"b4287b67-a4f2-4574-b73f-2995595f2199\") " pod="openstack/horizon-7b987c6977-c7m5n" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.725758 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4c819b2-8950-4a27-9cd7-6bd6265477de-scripts\") pod \"glance-default-external-api-0\" (UID: \"b4c819b2-8950-4a27-9cd7-6bd6265477de\") " pod="openstack/glance-default-external-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.725780 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4287b67-a4f2-4574-b73f-2995595f2199-scripts\") pod \"horizon-7b987c6977-c7m5n\" (UID: \"b4287b67-a4f2-4574-b73f-2995595f2199\") " pod="openstack/horizon-7b987c6977-c7m5n" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.725800 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b4287b67-a4f2-4574-b73f-2995595f2199-horizon-secret-key\") pod \"horizon-7b987c6977-c7m5n\" (UID: \"b4287b67-a4f2-4574-b73f-2995595f2199\") " pod="openstack/horizon-7b987c6977-c7m5n" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.727989 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b4287b67-a4f2-4574-b73f-2995595f2199-config-data\") pod \"horizon-7b987c6977-c7m5n\" (UID: \"b4287b67-a4f2-4574-b73f-2995595f2199\") " pod="openstack/horizon-7b987c6977-c7m5n" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.728379 4922 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"b4c819b2-8950-4a27-9cd7-6bd6265477de\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.729259 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4c819b2-8950-4a27-9cd7-6bd6265477de-logs\") pod \"glance-default-external-api-0\" (UID: \"b4c819b2-8950-4a27-9cd7-6bd6265477de\") " pod="openstack/glance-default-external-api-0" Nov 22 03:42:50 crc 
kubenswrapper[4922]: I1122 03:42:50.729519 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4287b67-a4f2-4574-b73f-2995595f2199-logs\") pod \"horizon-7b987c6977-c7m5n\" (UID: \"b4287b67-a4f2-4574-b73f-2995595f2199\") " pod="openstack/horizon-7b987c6977-c7m5n" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.730007 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4c819b2-8950-4a27-9cd7-6bd6265477de-config-data\") pod \"glance-default-external-api-0\" (UID: \"b4c819b2-8950-4a27-9cd7-6bd6265477de\") " pod="openstack/glance-default-external-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.730121 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4287b67-a4f2-4574-b73f-2995595f2199-scripts\") pod \"horizon-7b987c6977-c7m5n\" (UID: \"b4287b67-a4f2-4574-b73f-2995595f2199\") " pod="openstack/horizon-7b987c6977-c7m5n" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.730371 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b4287b67-a4f2-4574-b73f-2995595f2199-horizon-secret-key\") pod \"horizon-7b987c6977-c7m5n\" (UID: \"b4287b67-a4f2-4574-b73f-2995595f2199\") " pod="openstack/horizon-7b987c6977-c7m5n" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.730518 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b4c819b2-8950-4a27-9cd7-6bd6265477de-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b4c819b2-8950-4a27-9cd7-6bd6265477de\") " pod="openstack/glance-default-external-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.731404 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/b4c819b2-8950-4a27-9cd7-6bd6265477de-ceph\") pod \"glance-default-external-api-0\" (UID: \"b4c819b2-8950-4a27-9cd7-6bd6265477de\") " pod="openstack/glance-default-external-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.734678 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4c819b2-8950-4a27-9cd7-6bd6265477de-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b4c819b2-8950-4a27-9cd7-6bd6265477de\") " pod="openstack/glance-default-external-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.736194 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4c819b2-8950-4a27-9cd7-6bd6265477de-scripts\") pod \"glance-default-external-api-0\" (UID: \"b4c819b2-8950-4a27-9cd7-6bd6265477de\") " pod="openstack/glance-default-external-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.745788 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.759530 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx67b\" (UniqueName: \"kubernetes.io/projected/b4287b67-a4f2-4574-b73f-2995595f2199-kube-api-access-tx67b\") pod \"horizon-7b987c6977-c7m5n\" (UID: \"b4287b67-a4f2-4574-b73f-2995595f2199\") " pod="openstack/horizon-7b987c6977-c7m5n" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.759831 4922 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4c819b2-8950-4a27-9cd7-6bd6265477de-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b4c819b2-8950-4a27-9cd7-6bd6265477de\") " pod="openstack/glance-default-external-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.759998 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9zsn\" (UniqueName: \"kubernetes.io/projected/b4c819b2-8950-4a27-9cd7-6bd6265477de-kube-api-access-j9zsn\") pod \"glance-default-external-api-0\" (UID: \"b4c819b2-8950-4a27-9cd7-6bd6265477de\") " pod="openstack/glance-default-external-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.772542 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"b4c819b2-8950-4a27-9cd7-6bd6265477de\") " pod="openstack/glance-default-external-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.823386 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7b987c6977-c7m5n" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.857426 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 22 03:42:50 crc kubenswrapper[4922]: I1122 03:42:50.954868 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-gtphg"] Nov 22 03:42:50 crc kubenswrapper[4922]: W1122 03:42:50.975581 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc570feea_6723_4626_a849_fc67db11ee3e.slice/crio-b9bf29218ecfb237aef1231e4e7903234dff430ff52f38fa85d0ba510df039dc WatchSource:0}: Error finding container b9bf29218ecfb237aef1231e4e7903234dff430ff52f38fa85d0ba510df039dc: Status 404 returned error can't find the container with id b9bf29218ecfb237aef1231e4e7903234dff430ff52f38fa85d0ba510df039dc Nov 22 03:42:51 crc kubenswrapper[4922]: I1122 03:42:51.032990 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-575b764d7f-w8r6x"] Nov 22 03:42:51 crc kubenswrapper[4922]: I1122 03:42:51.324928 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 03:42:51 crc kubenswrapper[4922]: I1122 03:42:51.338860 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7b987c6977-c7m5n"] Nov 22 03:42:51 crc kubenswrapper[4922]: I1122 03:42:51.542112 4922 generic.go:334] "Generic (PLEG): container finished" podID="c570feea-6723-4626-a849-fc67db11ee3e" containerID="55bb4c3b28ff64699506229b173e8dd86d6b8f70b8799297ce860d83a5152a1f" exitCode=0 Nov 22 03:42:51 crc kubenswrapper[4922]: I1122 03:42:51.542412 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-gtphg" event={"ID":"c570feea-6723-4626-a849-fc67db11ee3e","Type":"ContainerDied","Data":"55bb4c3b28ff64699506229b173e8dd86d6b8f70b8799297ce860d83a5152a1f"} Nov 22 03:42:51 crc kubenswrapper[4922]: I1122 03:42:51.542440 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-gtphg" event={"ID":"c570feea-6723-4626-a849-fc67db11ee3e","Type":"ContainerStarted","Data":"b9bf29218ecfb237aef1231e4e7903234dff430ff52f38fa85d0ba510df039dc"} Nov 22 03:42:51 crc kubenswrapper[4922]: W1122 
03:42:51.549525 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod580b343c_5374_4eec_b414_cbe371f500a3.slice/crio-e40550c9c68d231cbd4c9c2e6dcbb7e1e1ad993751df4499f86dd9c91b4dff07 WatchSource:0}: Error finding container e40550c9c68d231cbd4c9c2e6dcbb7e1e1ad993751df4499f86dd9c91b4dff07: Status 404 returned error can't find the container with id e40550c9c68d231cbd4c9c2e6dcbb7e1e1ad993751df4499f86dd9c91b4dff07 Nov 22 03:42:51 crc kubenswrapper[4922]: I1122 03:42:51.550610 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"743b2d95-20b8-4677-a1fd-6a5eb808628d","Type":"ContainerStarted","Data":"3309b080fdc6d9763dc18ce9155698e31e8736101bc2f8d44716ea28264080a5"} Nov 22 03:42:51 crc kubenswrapper[4922]: I1122 03:42:51.552211 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5","Type":"ContainerStarted","Data":"8bbd2f0a5997d4cb93925ea95eb0477487e3af8980f6456cca8b8d96465ab5bf"} Nov 22 03:42:51 crc kubenswrapper[4922]: I1122 03:42:51.552941 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-575b764d7f-w8r6x" event={"ID":"d8cf55ca-4cba-4931-a9e6-021fa6e53669","Type":"ContainerStarted","Data":"0e52e9f43cc0533b58e6cb6bea646b4112e09796bece1fedf618429dc0aeaecf"} Nov 22 03:42:51 crc kubenswrapper[4922]: I1122 03:42:51.554721 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 03:42:52 crc kubenswrapper[4922]: I1122 03:42:52.571153 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b987c6977-c7m5n" event={"ID":"b4287b67-a4f2-4574-b73f-2995595f2199","Type":"ContainerStarted","Data":"d248a6376a0dfd87676f0631402d9f0674674bd3f616c74ce20dbfd89e6839d6"} Nov 22 03:42:52 crc kubenswrapper[4922]: I1122 03:42:52.573428 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"580b343c-5374-4eec-b414-cbe371f500a3","Type":"ContainerStarted","Data":"8377ec3dbae3bc0997ff1eed07a70e296529a64f17b961ac67221760652b6159"} Nov 22 03:42:52 crc kubenswrapper[4922]: I1122 03:42:52.573463 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"580b343c-5374-4eec-b414-cbe371f500a3","Type":"ContainerStarted","Data":"e40550c9c68d231cbd4c9c2e6dcbb7e1e1ad993751df4499f86dd9c91b4dff07"} Nov 22 03:42:52 crc kubenswrapper[4922]: I1122 03:42:52.577284 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"743b2d95-20b8-4677-a1fd-6a5eb808628d","Type":"ContainerStarted","Data":"c51b334c181b30faa42a0fa49eb2675fb6c020d7c51fb617ebb5370121d914c2"} Nov 22 03:42:52 crc kubenswrapper[4922]: I1122 03:42:52.577308 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"743b2d95-20b8-4677-a1fd-6a5eb808628d","Type":"ContainerStarted","Data":"a01ade48016729d16236ac0968885dec1e05d0dd6eb046c5149ddd140429dec3"} Nov 22 03:42:52 crc kubenswrapper[4922]: I1122 03:42:52.584352 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5","Type":"ContainerStarted","Data":"0e39a7a8219afaac8886140ddbd9c12dab9829775310931a12b4d8dda27d0543"} Nov 22 03:42:52 crc kubenswrapper[4922]: I1122 03:42:52.584385 4922 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/cinder-backup-0" event={"ID":"c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5","Type":"ContainerStarted","Data":"4f064fe74e0b94946f450e96abc952f18c908b4b545e74c95e4f9871ce721024"} Nov 22 03:42:52 crc kubenswrapper[4922]: I1122 03:42:52.628471 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=2.512726014 podStartE2EDuration="3.628447828s" podCreationTimestamp="2025-11-22 03:42:49 +0000 UTC" firstStartedPulling="2025-11-22 03:42:50.752812655 +0000 UTC m=+3006.791334547" lastFinishedPulling="2025-11-22 03:42:51.868534459 +0000 UTC m=+3007.907056361" observedRunningTime="2025-11-22 03:42:52.600550248 +0000 UTC m=+3008.639072150" watchObservedRunningTime="2025-11-22 03:42:52.628447828 +0000 UTC m=+3008.666969720" Nov 22 03:42:52 crc kubenswrapper[4922]: I1122 03:42:52.648294 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b4c819b2-8950-4a27-9cd7-6bd6265477de","Type":"ContainerStarted","Data":"7c07a2c68c6f2f9a24bb932241cb2ef7e85e1f3f0e6ef79decb0d75ec9ea38e9"} Nov 22 03:42:52 crc kubenswrapper[4922]: I1122 03:42:52.648352 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b4c819b2-8950-4a27-9cd7-6bd6265477de","Type":"ContainerStarted","Data":"09ee47e209e3b7936a116f7204ec83898c0d2873302975a8d3e32ce7d629de6e"} Nov 22 03:42:52 crc kubenswrapper[4922]: I1122 03:42:52.653026 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=2.667572253 podStartE2EDuration="3.653007578s" podCreationTimestamp="2025-11-22 03:42:49 +0000 UTC" firstStartedPulling="2025-11-22 03:42:50.638694915 +0000 UTC m=+3006.677216807" lastFinishedPulling="2025-11-22 03:42:51.62413024 +0000 UTC m=+3007.662652132" observedRunningTime="2025-11-22 03:42:52.635264822 +0000 UTC m=+3008.673786714" watchObservedRunningTime="2025-11-22 03:42:52.653007578 +0000 UTC m=+3008.691529470" Nov 22 03:42:52 crc kubenswrapper[4922]: I1122 03:42:52.761149 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-575b764d7f-w8r6x"] Nov 22 03:42:52 crc kubenswrapper[4922]: I1122 03:42:52.780188 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7d4dbc5d5b-4ppc9"] Nov 22 03:42:52 crc kubenswrapper[4922]: I1122 03:42:52.785542 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7d4dbc5d5b-4ppc9" Nov 22 03:42:52 crc kubenswrapper[4922]: I1122 03:42:52.790460 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Nov 22 03:42:52 crc kubenswrapper[4922]: I1122 03:42:52.799744 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 03:42:52 crc kubenswrapper[4922]: I1122 03:42:52.808374 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7d4dbc5d5b-4ppc9"] Nov 22 03:42:52 crc kubenswrapper[4922]: I1122 03:42:52.821466 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 03:42:52 crc kubenswrapper[4922]: I1122 03:42:52.840117 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7b987c6977-c7m5n"] Nov 22 03:42:52 crc kubenswrapper[4922]: I1122 03:42:52.865550 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-89cb6b448-l5wz8"] Nov 22 03:42:52 crc kubenswrapper[4922]: I1122 03:42:52.867057 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-89cb6b448-l5wz8" Nov 22 03:42:52 crc kubenswrapper[4922]: I1122 03:42:52.888060 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-89cb6b448-l5wz8"] Nov 22 03:42:52 crc kubenswrapper[4922]: I1122 03:42:52.900539 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0424cb6-8fea-4f3e-a293-27d3d2477c2f-combined-ca-bundle\") pod \"horizon-89cb6b448-l5wz8\" (UID: \"c0424cb6-8fea-4f3e-a293-27d3d2477c2f\") " pod="openstack/horizon-89cb6b448-l5wz8" Nov 22 03:42:52 crc kubenswrapper[4922]: I1122 03:42:52.900618 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/858fddfd-d272-4323-8c51-887b9e429b56-config-data\") pod \"horizon-7d4dbc5d5b-4ppc9\" (UID: \"858fddfd-d272-4323-8c51-887b9e429b56\") " pod="openstack/horizon-7d4dbc5d5b-4ppc9" Nov 22 03:42:52 crc kubenswrapper[4922]: I1122 03:42:52.900639 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/858fddfd-d272-4323-8c51-887b9e429b56-scripts\") pod \"horizon-7d4dbc5d5b-4ppc9\" (UID: \"858fddfd-d272-4323-8c51-887b9e429b56\") " pod="openstack/horizon-7d4dbc5d5b-4ppc9" Nov 22 03:42:52 crc kubenswrapper[4922]: I1122 03:42:52.900723 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c0424cb6-8fea-4f3e-a293-27d3d2477c2f-horizon-secret-key\") pod \"horizon-89cb6b448-l5wz8\" (UID: \"c0424cb6-8fea-4f3e-a293-27d3d2477c2f\") " pod="openstack/horizon-89cb6b448-l5wz8" Nov 22 03:42:52 crc kubenswrapper[4922]: I1122 03:42:52.900761 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/858fddfd-d272-4323-8c51-887b9e429b56-combined-ca-bundle\") pod \"horizon-7d4dbc5d5b-4ppc9\" (UID: \"858fddfd-d272-4323-8c51-887b9e429b56\") " pod="openstack/horizon-7d4dbc5d5b-4ppc9" Nov 22 03:42:52 crc kubenswrapper[4922]: I1122 03:42:52.900831 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/858fddfd-d272-4323-8c51-887b9e429b56-logs\") pod \"horizon-7d4dbc5d5b-4ppc9\" (UID: \"858fddfd-d272-4323-8c51-887b9e429b56\") " pod="openstack/horizon-7d4dbc5d5b-4ppc9" Nov 22 03:42:52 crc kubenswrapper[4922]: I1122 03:42:52.900865 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0424cb6-8fea-4f3e-a293-27d3d2477c2f-horizon-tls-certs\") pod \"horizon-89cb6b448-l5wz8\" (UID: \"c0424cb6-8fea-4f3e-a293-27d3d2477c2f\") " pod="openstack/horizon-89cb6b448-l5wz8" Nov 22 03:42:52 crc kubenswrapper[4922]: I1122 03:42:52.900891 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/858fddfd-d272-4323-8c51-887b9e429b56-horizon-secret-key\") pod \"horizon-7d4dbc5d5b-4ppc9\" (UID: \"858fddfd-d272-4323-8c51-887b9e429b56\") " pod="openstack/horizon-7d4dbc5d5b-4ppc9" Nov 22 03:42:52 crc kubenswrapper[4922]: I1122 03:42:52.900940 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chqmx\" (UniqueName: \"kubernetes.io/projected/858fddfd-d272-4323-8c51-887b9e429b56-kube-api-access-chqmx\") pod \"horizon-7d4dbc5d5b-4ppc9\" (UID: \"858fddfd-d272-4323-8c51-887b9e429b56\") " pod="openstack/horizon-7d4dbc5d5b-4ppc9" Nov 22 03:42:52 crc kubenswrapper[4922]: I1122 03:42:52.900963 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c0424cb6-8fea-4f3e-a293-27d3d2477c2f-config-data\") pod \"horizon-89cb6b448-l5wz8\" (UID: \"c0424cb6-8fea-4f3e-a293-27d3d2477c2f\") " pod="openstack/horizon-89cb6b448-l5wz8" Nov 22 03:42:52 crc kubenswrapper[4922]: I1122 03:42:52.901093 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdd7j\" (UniqueName: \"kubernetes.io/projected/c0424cb6-8fea-4f3e-a293-27d3d2477c2f-kube-api-access-bdd7j\") pod \"horizon-89cb6b448-l5wz8\" (UID: \"c0424cb6-8fea-4f3e-a293-27d3d2477c2f\") " pod="openstack/horizon-89cb6b448-l5wz8" Nov 22 03:42:52 crc kubenswrapper[4922]: I1122 03:42:52.901171 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0424cb6-8fea-4f3e-a293-27d3d2477c2f-logs\") pod \"horizon-89cb6b448-l5wz8\" (UID: \"c0424cb6-8fea-4f3e-a293-27d3d2477c2f\") " pod="openstack/horizon-89cb6b448-l5wz8" Nov 22 03:42:52 crc kubenswrapper[4922]: I1122 03:42:52.901204 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c0424cb6-8fea-4f3e-a293-27d3d2477c2f-scripts\") pod \"horizon-89cb6b448-l5wz8\" (UID: \"c0424cb6-8fea-4f3e-a293-27d3d2477c2f\") " pod="openstack/horizon-89cb6b448-l5wz8" Nov 22 03:42:52 crc kubenswrapper[4922]: I1122 03:42:52.901321 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/858fddfd-d272-4323-8c51-887b9e429b56-horizon-tls-certs\") pod \"horizon-7d4dbc5d5b-4ppc9\" (UID: \"858fddfd-d272-4323-8c51-887b9e429b56\") " pod="openstack/horizon-7d4dbc5d5b-4ppc9" Nov 22 03:42:53 crc kubenswrapper[4922]: I1122 03:42:53.003022 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/858fddfd-d272-4323-8c51-887b9e429b56-combined-ca-bundle\") pod \"horizon-7d4dbc5d5b-4ppc9\" (UID: \"858fddfd-d272-4323-8c51-887b9e429b56\") " pod="openstack/horizon-7d4dbc5d5b-4ppc9" Nov 22 03:42:53 crc kubenswrapper[4922]: I1122 03:42:53.003106 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/858fddfd-d272-4323-8c51-887b9e429b56-logs\") pod \"horizon-7d4dbc5d5b-4ppc9\" (UID: \"858fddfd-d272-4323-8c51-887b9e429b56\") " pod="openstack/horizon-7d4dbc5d5b-4ppc9" Nov 22 03:42:53 crc kubenswrapper[4922]: I1122 03:42:53.003128 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0424cb6-8fea-4f3e-a293-27d3d2477c2f-horizon-tls-certs\") pod \"horizon-89cb6b448-l5wz8\" (UID: \"c0424cb6-8fea-4f3e-a293-27d3d2477c2f\") " pod="openstack/horizon-89cb6b448-l5wz8" Nov 22 03:42:53 crc kubenswrapper[4922]: I1122 03:42:53.003146 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/858fddfd-d272-4323-8c51-887b9e429b56-horizon-secret-key\") pod \"horizon-7d4dbc5d5b-4ppc9\" (UID: \"858fddfd-d272-4323-8c51-887b9e429b56\") " pod="openstack/horizon-7d4dbc5d5b-4ppc9" Nov 22 03:42:53 crc kubenswrapper[4922]: I1122 03:42:53.003178 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chqmx\" (UniqueName: \"kubernetes.io/projected/858fddfd-d272-4323-8c51-887b9e429b56-kube-api-access-chqmx\") pod \"horizon-7d4dbc5d5b-4ppc9\" (UID: \"858fddfd-d272-4323-8c51-887b9e429b56\") " pod="openstack/horizon-7d4dbc5d5b-4ppc9" Nov 22 03:42:53 crc kubenswrapper[4922]: I1122 03:42:53.003280 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c0424cb6-8fea-4f3e-a293-27d3d2477c2f-config-data\") pod \"horizon-89cb6b448-l5wz8\" (UID: \"c0424cb6-8fea-4f3e-a293-27d3d2477c2f\") " pod="openstack/horizon-89cb6b448-l5wz8" Nov 22 03:42:53 crc kubenswrapper[4922]: I1122 03:42:53.003304 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdd7j\" (UniqueName: \"kubernetes.io/projected/c0424cb6-8fea-4f3e-a293-27d3d2477c2f-kube-api-access-bdd7j\") pod \"horizon-89cb6b448-l5wz8\" (UID: \"c0424cb6-8fea-4f3e-a293-27d3d2477c2f\") " pod="openstack/horizon-89cb6b448-l5wz8" Nov 22 03:42:53 crc kubenswrapper[4922]: I1122 03:42:53.003322 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0424cb6-8fea-4f3e-a293-27d3d2477c2f-logs\") pod \"horizon-89cb6b448-l5wz8\" (UID: \"c0424cb6-8fea-4f3e-a293-27d3d2477c2f\") " pod="openstack/horizon-89cb6b448-l5wz8" Nov 22 03:42:53 crc kubenswrapper[4922]: I1122 03:42:53.003343 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c0424cb6-8fea-4f3e-a293-27d3d2477c2f-scripts\") pod \"horizon-89cb6b448-l5wz8\" (UID: \"c0424cb6-8fea-4f3e-a293-27d3d2477c2f\") " pod="openstack/horizon-89cb6b448-l5wz8" Nov 22 03:42:53 crc kubenswrapper[4922]: I1122 03:42:53.003368 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/858fddfd-d272-4323-8c51-887b9e429b56-horizon-tls-certs\") pod \"horizon-7d4dbc5d5b-4ppc9\" (UID: 
\"858fddfd-d272-4323-8c51-887b9e429b56\") " pod="openstack/horizon-7d4dbc5d5b-4ppc9" Nov 22 03:42:53 crc kubenswrapper[4922]: I1122 03:42:53.003412 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0424cb6-8fea-4f3e-a293-27d3d2477c2f-combined-ca-bundle\") pod \"horizon-89cb6b448-l5wz8\" (UID: \"c0424cb6-8fea-4f3e-a293-27d3d2477c2f\") " pod="openstack/horizon-89cb6b448-l5wz8" Nov 22 03:42:53 crc kubenswrapper[4922]: I1122 03:42:53.003435 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/858fddfd-d272-4323-8c51-887b9e429b56-config-data\") pod \"horizon-7d4dbc5d5b-4ppc9\" (UID: \"858fddfd-d272-4323-8c51-887b9e429b56\") " pod="openstack/horizon-7d4dbc5d5b-4ppc9" Nov 22 03:42:53 crc kubenswrapper[4922]: I1122 03:42:53.003452 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/858fddfd-d272-4323-8c51-887b9e429b56-scripts\") pod \"horizon-7d4dbc5d5b-4ppc9\" (UID: \"858fddfd-d272-4323-8c51-887b9e429b56\") " pod="openstack/horizon-7d4dbc5d5b-4ppc9" Nov 22 03:42:53 crc kubenswrapper[4922]: I1122 03:42:53.003495 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c0424cb6-8fea-4f3e-a293-27d3d2477c2f-horizon-secret-key\") pod \"horizon-89cb6b448-l5wz8\" (UID: \"c0424cb6-8fea-4f3e-a293-27d3d2477c2f\") " pod="openstack/horizon-89cb6b448-l5wz8" Nov 22 03:42:53 crc kubenswrapper[4922]: I1122 03:42:53.005052 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/858fddfd-d272-4323-8c51-887b9e429b56-logs\") pod \"horizon-7d4dbc5d5b-4ppc9\" (UID: \"858fddfd-d272-4323-8c51-887b9e429b56\") " pod="openstack/horizon-7d4dbc5d5b-4ppc9" Nov 22 03:42:53 crc kubenswrapper[4922]: I1122 03:42:53.006326 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c0424cb6-8fea-4f3e-a293-27d3d2477c2f-config-data\") pod \"horizon-89cb6b448-l5wz8\" (UID: \"c0424cb6-8fea-4f3e-a293-27d3d2477c2f\") " pod="openstack/horizon-89cb6b448-l5wz8" Nov 22 03:42:53 crc kubenswrapper[4922]: I1122 03:42:53.011376 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0424cb6-8fea-4f3e-a293-27d3d2477c2f-horizon-tls-certs\") pod \"horizon-89cb6b448-l5wz8\" (UID: \"c0424cb6-8fea-4f3e-a293-27d3d2477c2f\") " pod="openstack/horizon-89cb6b448-l5wz8" Nov 22 03:42:53 crc kubenswrapper[4922]: I1122 03:42:53.012441 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/858fddfd-d272-4323-8c51-887b9e429b56-config-data\") pod \"horizon-7d4dbc5d5b-4ppc9\" (UID: \"858fddfd-d272-4323-8c51-887b9e429b56\") " pod="openstack/horizon-7d4dbc5d5b-4ppc9" Nov 22 03:42:53 crc kubenswrapper[4922]: I1122 03:42:53.021424 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/858fddfd-d272-4323-8c51-887b9e429b56-scripts\") pod \"horizon-7d4dbc5d5b-4ppc9\" (UID: \"858fddfd-d272-4323-8c51-887b9e429b56\") " pod="openstack/horizon-7d4dbc5d5b-4ppc9" Nov 22 03:42:53 crc kubenswrapper[4922]: I1122 03:42:53.021759 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0424cb6-8fea-4f3e-a293-27d3d2477c2f-logs\") pod \"horizon-89cb6b448-l5wz8\" (UID: \"c0424cb6-8fea-4f3e-a293-27d3d2477c2f\") " pod="openstack/horizon-89cb6b448-l5wz8" Nov 22 03:42:53 crc kubenswrapper[4922]: I1122 03:42:53.022150 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c0424cb6-8fea-4f3e-a293-27d3d2477c2f-scripts\") pod \"horizon-89cb6b448-l5wz8\" (UID: \"c0424cb6-8fea-4f3e-a293-27d3d2477c2f\") " pod="openstack/horizon-89cb6b448-l5wz8" Nov 22 03:42:53 crc kubenswrapper[4922]: I1122 03:42:53.022702 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/858fddfd-d272-4323-8c51-887b9e429b56-combined-ca-bundle\") pod \"horizon-7d4dbc5d5b-4ppc9\" (UID: \"858fddfd-d272-4323-8c51-887b9e429b56\") " pod="openstack/horizon-7d4dbc5d5b-4ppc9" Nov 22 03:42:53 crc kubenswrapper[4922]: I1122 03:42:53.023231 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/858fddfd-d272-4323-8c51-887b9e429b56-horizon-tls-certs\") pod \"horizon-7d4dbc5d5b-4ppc9\" (UID: \"858fddfd-d272-4323-8c51-887b9e429b56\") " pod="openstack/horizon-7d4dbc5d5b-4ppc9" Nov 22 03:42:53 crc kubenswrapper[4922]: I1122 03:42:53.031245 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0424cb6-8fea-4f3e-a293-27d3d2477c2f-combined-ca-bundle\") pod \"horizon-89cb6b448-l5wz8\" (UID: \"c0424cb6-8fea-4f3e-a293-27d3d2477c2f\") " pod="openstack/horizon-89cb6b448-l5wz8" Nov 22 03:42:53 crc kubenswrapper[4922]: I1122 03:42:53.032923 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/858fddfd-d272-4323-8c51-887b9e429b56-horizon-secret-key\") pod \"horizon-7d4dbc5d5b-4ppc9\" (UID: \"858fddfd-d272-4323-8c51-887b9e429b56\") " pod="openstack/horizon-7d4dbc5d5b-4ppc9" Nov 22 03:42:53 crc kubenswrapper[4922]: I1122 03:42:53.033830 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c0424cb6-8fea-4f3e-a293-27d3d2477c2f-horizon-secret-key\") pod \"horizon-89cb6b448-l5wz8\" (UID: \"c0424cb6-8fea-4f3e-a293-27d3d2477c2f\") " pod="openstack/horizon-89cb6b448-l5wz8" Nov 22 03:42:53 crc kubenswrapper[4922]: I1122 03:42:53.036328 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdd7j\" (UniqueName: \"kubernetes.io/projected/c0424cb6-8fea-4f3e-a293-27d3d2477c2f-kube-api-access-bdd7j\") pod \"horizon-89cb6b448-l5wz8\" (UID: \"c0424cb6-8fea-4f3e-a293-27d3d2477c2f\") " pod="openstack/horizon-89cb6b448-l5wz8" Nov 22 03:42:53 crc kubenswrapper[4922]: I1122 03:42:53.051986 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chqmx\" (UniqueName: \"kubernetes.io/projected/858fddfd-d272-4323-8c51-887b9e429b56-kube-api-access-chqmx\") pod \"horizon-7d4dbc5d5b-4ppc9\" (UID: \"858fddfd-d272-4323-8c51-887b9e429b56\") " pod="openstack/horizon-7d4dbc5d5b-4ppc9" Nov 22 03:42:53 crc kubenswrapper[4922]: I1122 03:42:53.196165 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7d4dbc5d5b-4ppc9" Nov 22 03:42:53 crc kubenswrapper[4922]: I1122 03:42:53.219799 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-89cb6b448-l5wz8" Nov 22 03:42:53 crc kubenswrapper[4922]: I1122 03:42:53.277878 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-gtphg" Nov 22 03:42:53 crc kubenswrapper[4922]: I1122 03:42:53.313165 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6t4nj\" (UniqueName: \"kubernetes.io/projected/c570feea-6723-4626-a849-fc67db11ee3e-kube-api-access-6t4nj\") pod \"c570feea-6723-4626-a849-fc67db11ee3e\" (UID: \"c570feea-6723-4626-a849-fc67db11ee3e\") " Nov 22 03:42:53 crc kubenswrapper[4922]: I1122 03:42:53.335541 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c570feea-6723-4626-a849-fc67db11ee3e-kube-api-access-6t4nj" (OuterVolumeSpecName: "kube-api-access-6t4nj") pod "c570feea-6723-4626-a849-fc67db11ee3e" (UID: "c570feea-6723-4626-a849-fc67db11ee3e"). InnerVolumeSpecName "kube-api-access-6t4nj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:42:53 crc kubenswrapper[4922]: I1122 03:42:53.418071 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6t4nj\" (UniqueName: \"kubernetes.io/projected/c570feea-6723-4626-a849-fc67db11ee3e-kube-api-access-6t4nj\") on node \"crc\" DevicePath \"\"" Nov 22 03:42:53 crc kubenswrapper[4922]: I1122 03:42:53.695650 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-gtphg" event={"ID":"c570feea-6723-4626-a849-fc67db11ee3e","Type":"ContainerDied","Data":"b9bf29218ecfb237aef1231e4e7903234dff430ff52f38fa85d0ba510df039dc"} Nov 22 03:42:53 crc kubenswrapper[4922]: I1122 03:42:53.695687 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9bf29218ecfb237aef1231e4e7903234dff430ff52f38fa85d0ba510df039dc" Nov 22 03:42:53 crc kubenswrapper[4922]: I1122 03:42:53.695743 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-gtphg" Nov 22 03:42:53 crc kubenswrapper[4922]: I1122 03:42:53.706434 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b4c819b2-8950-4a27-9cd7-6bd6265477de","Type":"ContainerStarted","Data":"3f746f047da75415e2850dfc91cd17ce924bc0e67918256198316b2fc84d6ea2"} Nov 22 03:42:53 crc kubenswrapper[4922]: I1122 03:42:53.706613 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="b4c819b2-8950-4a27-9cd7-6bd6265477de" containerName="glance-log" containerID="cri-o://7c07a2c68c6f2f9a24bb932241cb2ef7e85e1f3f0e6ef79decb0d75ec9ea38e9" gracePeriod=30 Nov 22 03:42:53 crc kubenswrapper[4922]: I1122 03:42:53.707078 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="b4c819b2-8950-4a27-9cd7-6bd6265477de" containerName="glance-httpd" containerID="cri-o://3f746f047da75415e2850dfc91cd17ce924bc0e67918256198316b2fc84d6ea2" gracePeriod=30 Nov 22 03:42:53 crc kubenswrapper[4922]: I1122 03:42:53.716207 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="580b343c-5374-4eec-b414-cbe371f500a3" containerName="glance-log" containerID="cri-o://8377ec3dbae3bc0997ff1eed07a70e296529a64f17b961ac67221760652b6159" gracePeriod=30 Nov 22 03:42:53 crc kubenswrapper[4922]: I1122 03:42:53.716299 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"580b343c-5374-4eec-b414-cbe371f500a3","Type":"ContainerStarted","Data":"585d0b68d70ac953124cf6432e92ce34b56bc5cb5c6cb37e8db928792d604cdf"} Nov 22 03:42:53 crc kubenswrapper[4922]: I1122 03:42:53.716790 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="580b343c-5374-4eec-b414-cbe371f500a3" containerName="glance-httpd" containerID="cri-o://585d0b68d70ac953124cf6432e92ce34b56bc5cb5c6cb37e8db928792d604cdf" gracePeriod=30 Nov 22 03:42:53 crc kubenswrapper[4922]: I1122 03:42:53.731224 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.731210131 podStartE2EDuration="3.731210131s" podCreationTimestamp="2025-11-22 03:42:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:42:53.727740987 +0000 UTC m=+3009.766262879" watchObservedRunningTime="2025-11-22 03:42:53.731210131 +0000 UTC m=+3009.769732023" Nov 22 03:42:53 crc kubenswrapper[4922]: I1122 03:42:53.773745 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.773724692 podStartE2EDuration="3.773724692s" podCreationTimestamp="2025-11-22 03:42:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:42:53.753712122 +0000 UTC m=+3009.792234014" watchObservedRunningTime="2025-11-22 03:42:53.773724692 +0000 UTC m=+3009.812246584" Nov 22 03:42:53 crc kubenswrapper[4922]: I1122 03:42:53.802257 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7d4dbc5d5b-4ppc9"] Nov 22 03:42:53 crc kubenswrapper[4922]: I1122 03:42:53.810245 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/horizon-89cb6b448-l5wz8"] Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.474882 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.565262 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/580b343c-5374-4eec-b414-cbe371f500a3-combined-ca-bundle\") pod \"580b343c-5374-4eec-b414-cbe371f500a3\" (UID: \"580b343c-5374-4eec-b414-cbe371f500a3\") " Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.565689 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/580b343c-5374-4eec-b414-cbe371f500a3-ceph\") pod \"580b343c-5374-4eec-b414-cbe371f500a3\" (UID: \"580b343c-5374-4eec-b414-cbe371f500a3\") " Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.567376 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/580b343c-5374-4eec-b414-cbe371f500a3-internal-tls-certs\") pod \"580b343c-5374-4eec-b414-cbe371f500a3\" (UID: \"580b343c-5374-4eec-b414-cbe371f500a3\") " Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.572454 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"580b343c-5374-4eec-b414-cbe371f500a3\" (UID: \"580b343c-5374-4eec-b414-cbe371f500a3\") " Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.575219 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/580b343c-5374-4eec-b414-cbe371f500a3-config-data\") pod \"580b343c-5374-4eec-b414-cbe371f500a3\" (UID: \"580b343c-5374-4eec-b414-cbe371f500a3\") " Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.575498 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/580b343c-5374-4eec-b414-cbe371f500a3-scripts\") pod \"580b343c-5374-4eec-b414-cbe371f500a3\" (UID: \"580b343c-5374-4eec-b414-cbe371f500a3\") " Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.575395 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "580b343c-5374-4eec-b414-cbe371f500a3" (UID: "580b343c-5374-4eec-b414-cbe371f500a3"). InnerVolumeSpecName "local-storage08-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.575804 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbdcf\" (UniqueName: \"kubernetes.io/projected/580b343c-5374-4eec-b414-cbe371f500a3-kube-api-access-jbdcf\") pod \"580b343c-5374-4eec-b414-cbe371f500a3\" (UID: \"580b343c-5374-4eec-b414-cbe371f500a3\") " Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.576061 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/580b343c-5374-4eec-b414-cbe371f500a3-logs\") pod \"580b343c-5374-4eec-b414-cbe371f500a3\" (UID: \"580b343c-5374-4eec-b414-cbe371f500a3\") " Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.576382 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/580b343c-5374-4eec-b414-cbe371f500a3-httpd-run\") pod \"580b343c-5374-4eec-b414-cbe371f500a3\" (UID: \"580b343c-5374-4eec-b414-cbe371f500a3\") " Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.577665 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/580b343c-5374-4eec-b414-cbe371f500a3-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "580b343c-5374-4eec-b414-cbe371f500a3" (UID: "580b343c-5374-4eec-b414-cbe371f500a3"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.577694 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/580b343c-5374-4eec-b414-cbe371f500a3-logs" (OuterVolumeSpecName: "logs") pod "580b343c-5374-4eec-b414-cbe371f500a3" (UID: "580b343c-5374-4eec-b414-cbe371f500a3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.579611 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/580b343c-5374-4eec-b414-cbe371f500a3-ceph" (OuterVolumeSpecName: "ceph") pod "580b343c-5374-4eec-b414-cbe371f500a3" (UID: "580b343c-5374-4eec-b414-cbe371f500a3"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.579963 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/580b343c-5374-4eec-b414-cbe371f500a3-kube-api-access-jbdcf" (OuterVolumeSpecName: "kube-api-access-jbdcf") pod "580b343c-5374-4eec-b414-cbe371f500a3" (UID: "580b343c-5374-4eec-b414-cbe371f500a3"). InnerVolumeSpecName "kube-api-access-jbdcf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.582609 4922 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/580b343c-5374-4eec-b414-cbe371f500a3-ceph\") on node \"crc\" DevicePath \"\"" Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.582728 4922 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.582808 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbdcf\" (UniqueName: \"kubernetes.io/projected/580b343c-5374-4eec-b414-cbe371f500a3-kube-api-access-jbdcf\") on node \"crc\" DevicePath \"\"" Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.582904 4922 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/580b343c-5374-4eec-b414-cbe371f500a3-logs\") on node \"crc\" DevicePath \"\"" Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.582977 4922 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/580b343c-5374-4eec-b414-cbe371f500a3-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.586037 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/580b343c-5374-4eec-b414-cbe371f500a3-scripts" (OuterVolumeSpecName: "scripts") pod "580b343c-5374-4eec-b414-cbe371f500a3" (UID: "580b343c-5374-4eec-b414-cbe371f500a3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.598057 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/580b343c-5374-4eec-b414-cbe371f500a3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "580b343c-5374-4eec-b414-cbe371f500a3" (UID: "580b343c-5374-4eec-b414-cbe371f500a3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.645932 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/580b343c-5374-4eec-b414-cbe371f500a3-config-data" (OuterVolumeSpecName: "config-data") pod "580b343c-5374-4eec-b414-cbe371f500a3" (UID: "580b343c-5374-4eec-b414-cbe371f500a3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.648126 4922 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.674104 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/580b343c-5374-4eec-b414-cbe371f500a3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "580b343c-5374-4eec-b414-cbe371f500a3" (UID: "580b343c-5374-4eec-b414-cbe371f500a3"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.684585 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/580b343c-5374-4eec-b414-cbe371f500a3-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.684611 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/580b343c-5374-4eec-b414-cbe371f500a3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.684623 4922 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/580b343c-5374-4eec-b414-cbe371f500a3-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.684632 4922 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.684642 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/580b343c-5374-4eec-b414-cbe371f500a3-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.741534 4922 generic.go:334] "Generic (PLEG): container finished" podID="580b343c-5374-4eec-b414-cbe371f500a3" containerID="585d0b68d70ac953124cf6432e92ce34b56bc5cb5c6cb37e8db928792d604cdf" exitCode=0 Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.741917 4922 generic.go:334] "Generic (PLEG): container finished" podID="580b343c-5374-4eec-b414-cbe371f500a3" containerID="8377ec3dbae3bc0997ff1eed07a70e296529a64f17b961ac67221760652b6159" exitCode=143 Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.741630 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.741599 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"580b343c-5374-4eec-b414-cbe371f500a3","Type":"ContainerDied","Data":"585d0b68d70ac953124cf6432e92ce34b56bc5cb5c6cb37e8db928792d604cdf"} Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.741997 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"580b343c-5374-4eec-b414-cbe371f500a3","Type":"ContainerDied","Data":"8377ec3dbae3bc0997ff1eed07a70e296529a64f17b961ac67221760652b6159"} Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.742040 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"580b343c-5374-4eec-b414-cbe371f500a3","Type":"ContainerDied","Data":"e40550c9c68d231cbd4c9c2e6dcbb7e1e1ad993751df4499f86dd9c91b4dff07"} Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.742061 4922 scope.go:117] "RemoveContainer" containerID="585d0b68d70ac953124cf6432e92ce34b56bc5cb5c6cb37e8db928792d604cdf" Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.744399 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-89cb6b448-l5wz8" event={"ID":"c0424cb6-8fea-4f3e-a293-27d3d2477c2f","Type":"ContainerStarted","Data":"b355171d32e6fd492a39c5744fa86e9dd6b23e7642b4cfd55d185eafb1e0e00c"} Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.758409 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d4dbc5d5b-4ppc9" event={"ID":"858fddfd-d272-4323-8c51-887b9e429b56","Type":"ContainerStarted","Data":"b352ffe10e5e8f5f403fc935e016af5db55ec547fafe1ac0fad8bcda8a52d60a"} Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.762532 4922 generic.go:334] "Generic (PLEG): container finished" podID="b4c819b2-8950-4a27-9cd7-6bd6265477de" containerID="3f746f047da75415e2850dfc91cd17ce924bc0e67918256198316b2fc84d6ea2" exitCode=0 Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.762568 4922 generic.go:334] "Generic (PLEG): container finished" podID="b4c819b2-8950-4a27-9cd7-6bd6265477de" containerID="7c07a2c68c6f2f9a24bb932241cb2ef7e85e1f3f0e6ef79decb0d75ec9ea38e9" exitCode=143 Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.762587 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b4c819b2-8950-4a27-9cd7-6bd6265477de","Type":"ContainerDied","Data":"3f746f047da75415e2850dfc91cd17ce924bc0e67918256198316b2fc84d6ea2"} Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.762609 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b4c819b2-8950-4a27-9cd7-6bd6265477de","Type":"ContainerDied","Data":"7c07a2c68c6f2f9a24bb932241cb2ef7e85e1f3f0e6ef79decb0d75ec9ea38e9"} Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.777820 4922 scope.go:117] "RemoveContainer" containerID="8377ec3dbae3bc0997ff1eed07a70e296529a64f17b961ac67221760652b6159" Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.804833 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.838665 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.859634 4922 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 03:42:54 crc kubenswrapper[4922]: E1122 03:42:54.860083 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="580b343c-5374-4eec-b414-cbe371f500a3" containerName="glance-httpd" Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.860094 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="580b343c-5374-4eec-b414-cbe371f500a3" containerName="glance-httpd" Nov 22 03:42:54 crc kubenswrapper[4922]: E1122 03:42:54.860121 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="580b343c-5374-4eec-b414-cbe371f500a3" containerName="glance-log" Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.860127 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="580b343c-5374-4eec-b414-cbe371f500a3" containerName="glance-log" Nov 22 03:42:54 crc kubenswrapper[4922]: E1122 03:42:54.860153 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c570feea-6723-4626-a849-fc67db11ee3e" containerName="mariadb-database-create" Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.860159 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="c570feea-6723-4626-a849-fc67db11ee3e" containerName="mariadb-database-create" Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.860331 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="c570feea-6723-4626-a849-fc67db11ee3e" containerName="mariadb-database-create" Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.860350 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="580b343c-5374-4eec-b414-cbe371f500a3" containerName="glance-log" Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.860360 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="580b343c-5374-4eec-b414-cbe371f500a3" containerName="glance-httpd" Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.861413 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.865501 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.866966 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.875360 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.902058 4922 scope.go:117] "RemoveContainer" containerID="585d0b68d70ac953124cf6432e92ce34b56bc5cb5c6cb37e8db928792d604cdf" Nov 22 03:42:54 crc kubenswrapper[4922]: E1122 03:42:54.905436 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"585d0b68d70ac953124cf6432e92ce34b56bc5cb5c6cb37e8db928792d604cdf\": container with ID starting with 585d0b68d70ac953124cf6432e92ce34b56bc5cb5c6cb37e8db928792d604cdf not found: ID does not exist" containerID="585d0b68d70ac953124cf6432e92ce34b56bc5cb5c6cb37e8db928792d604cdf" Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.905474 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"585d0b68d70ac953124cf6432e92ce34b56bc5cb5c6cb37e8db928792d604cdf"} err="failed to get container status \"585d0b68d70ac953124cf6432e92ce34b56bc5cb5c6cb37e8db928792d604cdf\": rpc error: code = NotFound desc = could not find container \"585d0b68d70ac953124cf6432e92ce34b56bc5cb5c6cb37e8db928792d604cdf\": container with ID starting with 585d0b68d70ac953124cf6432e92ce34b56bc5cb5c6cb37e8db928792d604cdf not found: ID does not exist" Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.905496 4922 scope.go:117] "RemoveContainer" containerID="8377ec3dbae3bc0997ff1eed07a70e296529a64f17b961ac67221760652b6159" Nov 22 03:42:54 crc kubenswrapper[4922]: E1122 03:42:54.905741 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8377ec3dbae3bc0997ff1eed07a70e296529a64f17b961ac67221760652b6159\": container with ID starting with 8377ec3dbae3bc0997ff1eed07a70e296529a64f17b961ac67221760652b6159 not found: ID does not exist" containerID="8377ec3dbae3bc0997ff1eed07a70e296529a64f17b961ac67221760652b6159" Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.905762 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8377ec3dbae3bc0997ff1eed07a70e296529a64f17b961ac67221760652b6159"} err="failed to get container status \"8377ec3dbae3bc0997ff1eed07a70e296529a64f17b961ac67221760652b6159\": rpc error: code = NotFound desc = could not find container \"8377ec3dbae3bc0997ff1eed07a70e296529a64f17b961ac67221760652b6159\": container with ID starting with 8377ec3dbae3bc0997ff1eed07a70e296529a64f17b961ac67221760652b6159 not found: ID does not exist" Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.905776 4922 scope.go:117] "RemoveContainer" containerID="585d0b68d70ac953124cf6432e92ce34b56bc5cb5c6cb37e8db928792d604cdf" Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.905942 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"585d0b68d70ac953124cf6432e92ce34b56bc5cb5c6cb37e8db928792d604cdf"} err="failed to get container status 
\"585d0b68d70ac953124cf6432e92ce34b56bc5cb5c6cb37e8db928792d604cdf\": rpc error: code = NotFound desc = could not find container \"585d0b68d70ac953124cf6432e92ce34b56bc5cb5c6cb37e8db928792d604cdf\": container with ID starting with 585d0b68d70ac953124cf6432e92ce34b56bc5cb5c6cb37e8db928792d604cdf not found: ID does not exist" Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.905960 4922 scope.go:117] "RemoveContainer" containerID="8377ec3dbae3bc0997ff1eed07a70e296529a64f17b961ac67221760652b6159" Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.906105 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8377ec3dbae3bc0997ff1eed07a70e296529a64f17b961ac67221760652b6159"} err="failed to get container status \"8377ec3dbae3bc0997ff1eed07a70e296529a64f17b961ac67221760652b6159\": rpc error: code = NotFound desc = could not find container \"8377ec3dbae3bc0997ff1eed07a70e296529a64f17b961ac67221760652b6159\": container with ID starting with 8377ec3dbae3bc0997ff1eed07a70e296529a64f17b961ac67221760652b6159 not found: ID does not exist" Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.918571 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Nov 22 03:42:54 crc kubenswrapper[4922]: I1122 03:42:54.926954 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.000346 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/161577ba-f585-44ce-9a0e-cf06d8e134f4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"161577ba-f585-44ce-9a0e-cf06d8e134f4\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.000405 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/161577ba-f585-44ce-9a0e-cf06d8e134f4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"161577ba-f585-44ce-9a0e-cf06d8e134f4\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.000425 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/161577ba-f585-44ce-9a0e-cf06d8e134f4-logs\") pod \"glance-default-internal-api-0\" (UID: \"161577ba-f585-44ce-9a0e-cf06d8e134f4\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.000484 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/161577ba-f585-44ce-9a0e-cf06d8e134f4-ceph\") pod \"glance-default-internal-api-0\" (UID: \"161577ba-f585-44ce-9a0e-cf06d8e134f4\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.000520 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/161577ba-f585-44ce-9a0e-cf06d8e134f4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"161577ba-f585-44ce-9a0e-cf06d8e134f4\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.000546 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkbpv\" (UniqueName: \"kubernetes.io/projected/161577ba-f585-44ce-9a0e-cf06d8e134f4-kube-api-access-dkbpv\") pod \"glance-default-internal-api-0\" (UID: \"161577ba-f585-44ce-9a0e-cf06d8e134f4\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.000625 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/161577ba-f585-44ce-9a0e-cf06d8e134f4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"161577ba-f585-44ce-9a0e-cf06d8e134f4\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.000657 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/161577ba-f585-44ce-9a0e-cf06d8e134f4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"161577ba-f585-44ce-9a0e-cf06d8e134f4\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.000708 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"161577ba-f585-44ce-9a0e-cf06d8e134f4\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.034087 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.102327 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4c819b2-8950-4a27-9cd7-6bd6265477de-logs\") pod \"b4c819b2-8950-4a27-9cd7-6bd6265477de\" (UID: \"b4c819b2-8950-4a27-9cd7-6bd6265477de\") " Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.102407 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4c819b2-8950-4a27-9cd7-6bd6265477de-combined-ca-bundle\") pod \"b4c819b2-8950-4a27-9cd7-6bd6265477de\" (UID: \"b4c819b2-8950-4a27-9cd7-6bd6265477de\") " Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.102561 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4c819b2-8950-4a27-9cd7-6bd6265477de-public-tls-certs\") pod \"b4c819b2-8950-4a27-9cd7-6bd6265477de\" (UID: \"b4c819b2-8950-4a27-9cd7-6bd6265477de\") " Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.102612 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9zsn\" (UniqueName: \"kubernetes.io/projected/b4c819b2-8950-4a27-9cd7-6bd6265477de-kube-api-access-j9zsn\") pod \"b4c819b2-8950-4a27-9cd7-6bd6265477de\" (UID: \"b4c819b2-8950-4a27-9cd7-6bd6265477de\") " Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.102634 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4c819b2-8950-4a27-9cd7-6bd6265477de-scripts\") pod \"b4c819b2-8950-4a27-9cd7-6bd6265477de\" (UID: \"b4c819b2-8950-4a27-9cd7-6bd6265477de\") " Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.102674 4922 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b4c819b2-8950-4a27-9cd7-6bd6265477de-httpd-run\") pod \"b4c819b2-8950-4a27-9cd7-6bd6265477de\" (UID: \"b4c819b2-8950-4a27-9cd7-6bd6265477de\") " Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.102706 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/b4c819b2-8950-4a27-9cd7-6bd6265477de-ceph\") pod \"b4c819b2-8950-4a27-9cd7-6bd6265477de\" (UID: \"b4c819b2-8950-4a27-9cd7-6bd6265477de\") " Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.102747 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4c819b2-8950-4a27-9cd7-6bd6265477de-config-data\") pod \"b4c819b2-8950-4a27-9cd7-6bd6265477de\" (UID: \"b4c819b2-8950-4a27-9cd7-6bd6265477de\") " Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.102764 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"b4c819b2-8950-4a27-9cd7-6bd6265477de\" (UID: \"b4c819b2-8950-4a27-9cd7-6bd6265477de\") " Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.103050 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4c819b2-8950-4a27-9cd7-6bd6265477de-logs" (OuterVolumeSpecName: "logs") pod "b4c819b2-8950-4a27-9cd7-6bd6265477de" (UID: "b4c819b2-8950-4a27-9cd7-6bd6265477de"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.103179 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkbpv\" (UniqueName: \"kubernetes.io/projected/161577ba-f585-44ce-9a0e-cf06d8e134f4-kube-api-access-dkbpv\") pod \"glance-default-internal-api-0\" (UID: \"161577ba-f585-44ce-9a0e-cf06d8e134f4\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.103240 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/161577ba-f585-44ce-9a0e-cf06d8e134f4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"161577ba-f585-44ce-9a0e-cf06d8e134f4\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.103295 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/161577ba-f585-44ce-9a0e-cf06d8e134f4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"161577ba-f585-44ce-9a0e-cf06d8e134f4\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.103378 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"161577ba-f585-44ce-9a0e-cf06d8e134f4\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.103442 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/161577ba-f585-44ce-9a0e-cf06d8e134f4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"161577ba-f585-44ce-9a0e-cf06d8e134f4\") " 
pod="openstack/glance-default-internal-api-0" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.103469 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/161577ba-f585-44ce-9a0e-cf06d8e134f4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"161577ba-f585-44ce-9a0e-cf06d8e134f4\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.103485 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/161577ba-f585-44ce-9a0e-cf06d8e134f4-logs\") pod \"glance-default-internal-api-0\" (UID: \"161577ba-f585-44ce-9a0e-cf06d8e134f4\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.103537 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/161577ba-f585-44ce-9a0e-cf06d8e134f4-ceph\") pod \"glance-default-internal-api-0\" (UID: \"161577ba-f585-44ce-9a0e-cf06d8e134f4\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.103588 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/161577ba-f585-44ce-9a0e-cf06d8e134f4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"161577ba-f585-44ce-9a0e-cf06d8e134f4\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.103640 4922 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4c819b2-8950-4a27-9cd7-6bd6265477de-logs\") on node \"crc\" DevicePath \"\"" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.104140 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/161577ba-f585-44ce-9a0e-cf06d8e134f4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"161577ba-f585-44ce-9a0e-cf06d8e134f4\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.104390 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/161577ba-f585-44ce-9a0e-cf06d8e134f4-logs\") pod \"glance-default-internal-api-0\" (UID: \"161577ba-f585-44ce-9a0e-cf06d8e134f4\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.107242 4922 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"161577ba-f585-44ce-9a0e-cf06d8e134f4\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.103255 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4c819b2-8950-4a27-9cd7-6bd6265477de-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b4c819b2-8950-4a27-9cd7-6bd6265477de" (UID: "b4c819b2-8950-4a27-9cd7-6bd6265477de"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.107605 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4c819b2-8950-4a27-9cd7-6bd6265477de-ceph" (OuterVolumeSpecName: "ceph") pod "b4c819b2-8950-4a27-9cd7-6bd6265477de" (UID: "b4c819b2-8950-4a27-9cd7-6bd6265477de"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.108199 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4c819b2-8950-4a27-9cd7-6bd6265477de-scripts" (OuterVolumeSpecName: "scripts") pod "b4c819b2-8950-4a27-9cd7-6bd6265477de" (UID: "b4c819b2-8950-4a27-9cd7-6bd6265477de"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.110175 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "b4c819b2-8950-4a27-9cd7-6bd6265477de" (UID: "b4c819b2-8950-4a27-9cd7-6bd6265477de"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.110188 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4c819b2-8950-4a27-9cd7-6bd6265477de-kube-api-access-j9zsn" (OuterVolumeSpecName: "kube-api-access-j9zsn") pod "b4c819b2-8950-4a27-9cd7-6bd6265477de" (UID: "b4c819b2-8950-4a27-9cd7-6bd6265477de"). InnerVolumeSpecName "kube-api-access-j9zsn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.111171 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/161577ba-f585-44ce-9a0e-cf06d8e134f4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"161577ba-f585-44ce-9a0e-cf06d8e134f4\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.111622 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/161577ba-f585-44ce-9a0e-cf06d8e134f4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"161577ba-f585-44ce-9a0e-cf06d8e134f4\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.112075 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/161577ba-f585-44ce-9a0e-cf06d8e134f4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"161577ba-f585-44ce-9a0e-cf06d8e134f4\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.112134 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/161577ba-f585-44ce-9a0e-cf06d8e134f4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"161577ba-f585-44ce-9a0e-cf06d8e134f4\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.118380 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/161577ba-f585-44ce-9a0e-cf06d8e134f4-ceph\") pod \"glance-default-internal-api-0\" (UID: 
\"161577ba-f585-44ce-9a0e-cf06d8e134f4\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.121829 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkbpv\" (UniqueName: \"kubernetes.io/projected/161577ba-f585-44ce-9a0e-cf06d8e134f4-kube-api-access-dkbpv\") pod \"glance-default-internal-api-0\" (UID: \"161577ba-f585-44ce-9a0e-cf06d8e134f4\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.152653 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4c819b2-8950-4a27-9cd7-6bd6265477de-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b4c819b2-8950-4a27-9cd7-6bd6265477de" (UID: "b4c819b2-8950-4a27-9cd7-6bd6265477de"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.165257 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4c819b2-8950-4a27-9cd7-6bd6265477de-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b4c819b2-8950-4a27-9cd7-6bd6265477de" (UID: "b4c819b2-8950-4a27-9cd7-6bd6265477de"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.167190 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"161577ba-f585-44ce-9a0e-cf06d8e134f4\") " pod="openstack/glance-default-internal-api-0" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.205116 4922 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4c819b2-8950-4a27-9cd7-6bd6265477de-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.205143 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9zsn\" (UniqueName: \"kubernetes.io/projected/b4c819b2-8950-4a27-9cd7-6bd6265477de-kube-api-access-j9zsn\") on node \"crc\" DevicePath \"\"" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.205155 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4c819b2-8950-4a27-9cd7-6bd6265477de-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.205164 4922 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b4c819b2-8950-4a27-9cd7-6bd6265477de-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.205172 4922 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/b4c819b2-8950-4a27-9cd7-6bd6265477de-ceph\") on node \"crc\" DevicePath \"\"" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.205197 4922 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.205208 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4c819b2-8950-4a27-9cd7-6bd6265477de-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.206755 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.210097 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4c819b2-8950-4a27-9cd7-6bd6265477de-config-data" (OuterVolumeSpecName: "config-data") pod "b4c819b2-8950-4a27-9cd7-6bd6265477de" (UID: "b4c819b2-8950-4a27-9cd7-6bd6265477de"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.223414 4922 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.306377 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4c819b2-8950-4a27-9cd7-6bd6265477de-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.306400 4922 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.325726 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="580b343c-5374-4eec-b414-cbe371f500a3" path="/var/lib/kubelet/pods/580b343c-5374-4eec-b414-cbe371f500a3/volumes" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.781353 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b4c819b2-8950-4a27-9cd7-6bd6265477de","Type":"ContainerDied","Data":"09ee47e209e3b7936a116f7204ec83898c0d2873302975a8d3e32ce7d629de6e"} Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.782901 4922 scope.go:117] "RemoveContainer" containerID="3f746f047da75415e2850dfc91cd17ce924bc0e67918256198316b2fc84d6ea2" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.783035 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.810517 4922 scope.go:117] "RemoveContainer" containerID="7c07a2c68c6f2f9a24bb932241cb2ef7e85e1f3f0e6ef79decb0d75ec9ea38e9" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.824983 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.831484 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.850008 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 03:42:55 crc kubenswrapper[4922]: E1122 03:42:55.850477 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4c819b2-8950-4a27-9cd7-6bd6265477de" containerName="glance-log" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.850495 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4c819b2-8950-4a27-9cd7-6bd6265477de" containerName="glance-log" Nov 22 03:42:55 crc kubenswrapper[4922]: E1122 03:42:55.850517 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4c819b2-8950-4a27-9cd7-6bd6265477de" containerName="glance-httpd" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.850524 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4c819b2-8950-4a27-9cd7-6bd6265477de" containerName="glance-httpd" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.850720 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4c819b2-8950-4a27-9cd7-6bd6265477de" containerName="glance-log" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.850748 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4c819b2-8950-4a27-9cd7-6bd6265477de" containerName="glance-httpd" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.851786 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.863588 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.863896 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.874983 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.891106 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.938133 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f334bb7-e931-428b-a3b6-c576b9106f7d-scripts\") pod \"glance-default-external-api-0\" (UID: \"4f334bb7-e931-428b-a3b6-c576b9106f7d\") " pod="openstack/glance-default-external-api-0" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.938266 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f334bb7-e931-428b-a3b6-c576b9106f7d-logs\") pod \"glance-default-external-api-0\" (UID: \"4f334bb7-e931-428b-a3b6-c576b9106f7d\") " pod="openstack/glance-default-external-api-0" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.938479 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k27gm\" (UniqueName: \"kubernetes.io/projected/4f334bb7-e931-428b-a3b6-c576b9106f7d-kube-api-access-k27gm\") pod \"glance-default-external-api-0\" (UID: \"4f334bb7-e931-428b-a3b6-c576b9106f7d\") " pod="openstack/glance-default-external-api-0" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.938550 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"4f334bb7-e931-428b-a3b6-c576b9106f7d\") " pod="openstack/glance-default-external-api-0" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.938864 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4f334bb7-e931-428b-a3b6-c576b9106f7d-ceph\") pod \"glance-default-external-api-0\" (UID: \"4f334bb7-e931-428b-a3b6-c576b9106f7d\") " pod="openstack/glance-default-external-api-0" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.938908 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4f334bb7-e931-428b-a3b6-c576b9106f7d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4f334bb7-e931-428b-a3b6-c576b9106f7d\") " pod="openstack/glance-default-external-api-0" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.938989 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f334bb7-e931-428b-a3b6-c576b9106f7d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4f334bb7-e931-428b-a3b6-c576b9106f7d\") " pod="openstack/glance-default-external-api-0" 
Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.939131 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f334bb7-e931-428b-a3b6-c576b9106f7d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4f334bb7-e931-428b-a3b6-c576b9106f7d\") " pod="openstack/glance-default-external-api-0" Nov 22 03:42:55 crc kubenswrapper[4922]: I1122 03:42:55.939199 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f334bb7-e931-428b-a3b6-c576b9106f7d-config-data\") pod \"glance-default-external-api-0\" (UID: \"4f334bb7-e931-428b-a3b6-c576b9106f7d\") " pod="openstack/glance-default-external-api-0" Nov 22 03:42:56 crc kubenswrapper[4922]: I1122 03:42:56.044973 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f334bb7-e931-428b-a3b6-c576b9106f7d-logs\") pod \"glance-default-external-api-0\" (UID: \"4f334bb7-e931-428b-a3b6-c576b9106f7d\") " pod="openstack/glance-default-external-api-0" Nov 22 03:42:56 crc kubenswrapper[4922]: I1122 03:42:56.045058 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k27gm\" (UniqueName: \"kubernetes.io/projected/4f334bb7-e931-428b-a3b6-c576b9106f7d-kube-api-access-k27gm\") pod \"glance-default-external-api-0\" (UID: \"4f334bb7-e931-428b-a3b6-c576b9106f7d\") " pod="openstack/glance-default-external-api-0" Nov 22 03:42:56 crc kubenswrapper[4922]: I1122 03:42:56.045090 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"4f334bb7-e931-428b-a3b6-c576b9106f7d\") " pod="openstack/glance-default-external-api-0" Nov 22 03:42:56 crc kubenswrapper[4922]: I1122 03:42:56.045120 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4f334bb7-e931-428b-a3b6-c576b9106f7d-ceph\") pod \"glance-default-external-api-0\" (UID: \"4f334bb7-e931-428b-a3b6-c576b9106f7d\") " pod="openstack/glance-default-external-api-0" Nov 22 03:42:56 crc kubenswrapper[4922]: I1122 03:42:56.045149 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4f334bb7-e931-428b-a3b6-c576b9106f7d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4f334bb7-e931-428b-a3b6-c576b9106f7d\") " pod="openstack/glance-default-external-api-0" Nov 22 03:42:56 crc kubenswrapper[4922]: I1122 03:42:56.045185 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f334bb7-e931-428b-a3b6-c576b9106f7d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4f334bb7-e931-428b-a3b6-c576b9106f7d\") " pod="openstack/glance-default-external-api-0" Nov 22 03:42:56 crc kubenswrapper[4922]: I1122 03:42:56.045241 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f334bb7-e931-428b-a3b6-c576b9106f7d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4f334bb7-e931-428b-a3b6-c576b9106f7d\") " pod="openstack/glance-default-external-api-0" Nov 22 03:42:56 crc kubenswrapper[4922]: I1122 
03:42:56.045274 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f334bb7-e931-428b-a3b6-c576b9106f7d-config-data\") pod \"glance-default-external-api-0\" (UID: \"4f334bb7-e931-428b-a3b6-c576b9106f7d\") " pod="openstack/glance-default-external-api-0" Nov 22 03:42:56 crc kubenswrapper[4922]: I1122 03:42:56.045363 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f334bb7-e931-428b-a3b6-c576b9106f7d-scripts\") pod \"glance-default-external-api-0\" (UID: \"4f334bb7-e931-428b-a3b6-c576b9106f7d\") " pod="openstack/glance-default-external-api-0" Nov 22 03:42:56 crc kubenswrapper[4922]: I1122 03:42:56.045499 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f334bb7-e931-428b-a3b6-c576b9106f7d-logs\") pod \"glance-default-external-api-0\" (UID: \"4f334bb7-e931-428b-a3b6-c576b9106f7d\") " pod="openstack/glance-default-external-api-0" Nov 22 03:42:56 crc kubenswrapper[4922]: I1122 03:42:56.045806 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4f334bb7-e931-428b-a3b6-c576b9106f7d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4f334bb7-e931-428b-a3b6-c576b9106f7d\") " pod="openstack/glance-default-external-api-0" Nov 22 03:42:56 crc kubenswrapper[4922]: I1122 03:42:56.046515 4922 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"4f334bb7-e931-428b-a3b6-c576b9106f7d\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Nov 22 03:42:56 crc kubenswrapper[4922]: I1122 03:42:56.065033 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4f334bb7-e931-428b-a3b6-c576b9106f7d-ceph\") pod \"glance-default-external-api-0\" (UID: \"4f334bb7-e931-428b-a3b6-c576b9106f7d\") " pod="openstack/glance-default-external-api-0" Nov 22 03:42:56 crc kubenswrapper[4922]: I1122 03:42:56.065434 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f334bb7-e931-428b-a3b6-c576b9106f7d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4f334bb7-e931-428b-a3b6-c576b9106f7d\") " pod="openstack/glance-default-external-api-0" Nov 22 03:42:56 crc kubenswrapper[4922]: I1122 03:42:56.067781 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f334bb7-e931-428b-a3b6-c576b9106f7d-config-data\") pod \"glance-default-external-api-0\" (UID: \"4f334bb7-e931-428b-a3b6-c576b9106f7d\") " pod="openstack/glance-default-external-api-0" Nov 22 03:42:56 crc kubenswrapper[4922]: I1122 03:42:56.077126 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f334bb7-e931-428b-a3b6-c576b9106f7d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4f334bb7-e931-428b-a3b6-c576b9106f7d\") " pod="openstack/glance-default-external-api-0" Nov 22 03:42:56 crc kubenswrapper[4922]: I1122 03:42:56.085470 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/4f334bb7-e931-428b-a3b6-c576b9106f7d-scripts\") pod \"glance-default-external-api-0\" (UID: \"4f334bb7-e931-428b-a3b6-c576b9106f7d\") " pod="openstack/glance-default-external-api-0" Nov 22 03:42:56 crc kubenswrapper[4922]: I1122 03:42:56.091771 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k27gm\" (UniqueName: \"kubernetes.io/projected/4f334bb7-e931-428b-a3b6-c576b9106f7d-kube-api-access-k27gm\") pod \"glance-default-external-api-0\" (UID: \"4f334bb7-e931-428b-a3b6-c576b9106f7d\") " pod="openstack/glance-default-external-api-0" Nov 22 03:42:56 crc kubenswrapper[4922]: I1122 03:42:56.102414 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"4f334bb7-e931-428b-a3b6-c576b9106f7d\") " pod="openstack/glance-default-external-api-0" Nov 22 03:42:56 crc kubenswrapper[4922]: I1122 03:42:56.231618 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 22 03:42:56 crc kubenswrapper[4922]: I1122 03:42:56.768043 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 03:42:56 crc kubenswrapper[4922]: W1122 03:42:56.780984 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f334bb7_e931_428b_a3b6_c576b9106f7d.slice/crio-7cb92d6d3b10ff4fb6a9e5372bba70f2847a65db78be4de176d6d9f376d540f8 WatchSource:0}: Error finding container 7cb92d6d3b10ff4fb6a9e5372bba70f2847a65db78be4de176d6d9f376d540f8: Status 404 returned error can't find the container with id 7cb92d6d3b10ff4fb6a9e5372bba70f2847a65db78be4de176d6d9f376d540f8 Nov 22 03:42:56 crc kubenswrapper[4922]: I1122 03:42:56.802282 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"161577ba-f585-44ce-9a0e-cf06d8e134f4","Type":"ContainerStarted","Data":"8a9cb5406f8b9fd3db1f0d4f71a1e2cbda403a6b05148a36d0406f10bc32c8f1"} Nov 22 03:42:56 crc kubenswrapper[4922]: I1122 03:42:56.802541 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"161577ba-f585-44ce-9a0e-cf06d8e134f4","Type":"ContainerStarted","Data":"b14147ae4f2fa258c65fce755d857c2b192b3c0186023328499292bed01c4b57"} Nov 22 03:42:56 crc kubenswrapper[4922]: I1122 03:42:56.809119 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4f334bb7-e931-428b-a3b6-c576b9106f7d","Type":"ContainerStarted","Data":"7cb92d6d3b10ff4fb6a9e5372bba70f2847a65db78be4de176d6d9f376d540f8"} Nov 22 03:42:57 crc kubenswrapper[4922]: I1122 03:42:57.323364 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4c819b2-8950-4a27-9cd7-6bd6265477de" path="/var/lib/kubelet/pods/b4c819b2-8950-4a27-9cd7-6bd6265477de/volumes" Nov 22 03:42:57 crc kubenswrapper[4922]: I1122 03:42:57.820434 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4f334bb7-e931-428b-a3b6-c576b9106f7d","Type":"ContainerStarted","Data":"91a44bb81d2426ec668fc92df4b461767597719b6f5c533d319822cdcc57aabc"} Nov 22 03:42:57 crc kubenswrapper[4922]: I1122 03:42:57.824666 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"161577ba-f585-44ce-9a0e-cf06d8e134f4","Type":"ContainerStarted","Data":"e4d13d00854fd19a636d45279500bec1103b047bb05f703a49f1790499a5f81d"} Nov 22 03:42:57 crc kubenswrapper[4922]: I1122 03:42:57.851798 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.851780894 podStartE2EDuration="3.851780894s" podCreationTimestamp="2025-11-22 03:42:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:42:57.850071273 +0000 UTC m=+3013.888593155" watchObservedRunningTime="2025-11-22 03:42:57.851780894 +0000 UTC m=+3013.890302786" Nov 22 03:42:58 crc kubenswrapper[4922]: I1122 03:42:58.301171 4922 scope.go:117] "RemoveContainer" containerID="3902a53e42446835819551f58e9b8e408324dfa52a1818359fa0616ad95080dc" Nov 22 03:42:58 crc kubenswrapper[4922]: E1122 03:42:58.301542 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:43:00 crc kubenswrapper[4922]: I1122 03:43:00.114475 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Nov 22 03:43:00 crc kubenswrapper[4922]: I1122 03:43:00.134574 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Nov 22 03:43:00 crc kubenswrapper[4922]: I1122 03:43:00.188598 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-13c6-account-create-2dlsc"] Nov 22 03:43:00 crc kubenswrapper[4922]: I1122 03:43:00.190319 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-13c6-account-create-2dlsc" Nov 22 03:43:00 crc kubenswrapper[4922]: I1122 03:43:00.195050 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Nov 22 03:43:00 crc kubenswrapper[4922]: I1122 03:43:00.203679 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-13c6-account-create-2dlsc"] Nov 22 03:43:00 crc kubenswrapper[4922]: I1122 03:43:00.351065 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kvsd\" (UniqueName: \"kubernetes.io/projected/da26cef4-8297-4b18-a465-3d69e4ae01f8-kube-api-access-2kvsd\") pod \"manila-13c6-account-create-2dlsc\" (UID: \"da26cef4-8297-4b18-a465-3d69e4ae01f8\") " pod="openstack/manila-13c6-account-create-2dlsc" Nov 22 03:43:00 crc kubenswrapper[4922]: I1122 03:43:00.453330 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kvsd\" (UniqueName: \"kubernetes.io/projected/da26cef4-8297-4b18-a465-3d69e4ae01f8-kube-api-access-2kvsd\") pod \"manila-13c6-account-create-2dlsc\" (UID: \"da26cef4-8297-4b18-a465-3d69e4ae01f8\") " pod="openstack/manila-13c6-account-create-2dlsc" Nov 22 03:43:00 crc kubenswrapper[4922]: I1122 03:43:00.481499 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kvsd\" (UniqueName: \"kubernetes.io/projected/da26cef4-8297-4b18-a465-3d69e4ae01f8-kube-api-access-2kvsd\") pod \"manila-13c6-account-create-2dlsc\" (UID: \"da26cef4-8297-4b18-a465-3d69e4ae01f8\") " pod="openstack/manila-13c6-account-create-2dlsc" Nov 22 03:43:00 crc kubenswrapper[4922]: I1122 03:43:00.527982 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-13c6-account-create-2dlsc" Nov 22 03:43:03 crc kubenswrapper[4922]: I1122 03:43:03.232050 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-13c6-account-create-2dlsc"] Nov 22 03:43:03 crc kubenswrapper[4922]: W1122 03:43:03.290285 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda26cef4_8297_4b18_a465_3d69e4ae01f8.slice/crio-40fdb2027cae50bf7d6419617943798f1e9221d4d43a885ea2cd093ac7fe82f0 WatchSource:0}: Error finding container 40fdb2027cae50bf7d6419617943798f1e9221d4d43a885ea2cd093ac7fe82f0: Status 404 returned error can't find the container with id 40fdb2027cae50bf7d6419617943798f1e9221d4d43a885ea2cd093ac7fe82f0 Nov 22 03:43:03 crc kubenswrapper[4922]: I1122 03:43:03.905932 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4f334bb7-e931-428b-a3b6-c576b9106f7d","Type":"ContainerStarted","Data":"9917ca345b1bec832add17fdba45c211680caea3af6a7c72edb0f618856d93c4"} Nov 22 03:43:03 crc kubenswrapper[4922]: I1122 03:43:03.907751 4922 generic.go:334] "Generic (PLEG): container finished" podID="da26cef4-8297-4b18-a465-3d69e4ae01f8" containerID="a575d9415469463145ffa9f96922a683786ebd802710278e2df33bcf76d590e6" exitCode=0 Nov 22 03:43:03 crc kubenswrapper[4922]: I1122 03:43:03.907817 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-13c6-account-create-2dlsc" event={"ID":"da26cef4-8297-4b18-a465-3d69e4ae01f8","Type":"ContainerDied","Data":"a575d9415469463145ffa9f96922a683786ebd802710278e2df33bcf76d590e6"} Nov 22 03:43:03 crc kubenswrapper[4922]: I1122 03:43:03.907881 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/manila-13c6-account-create-2dlsc" event={"ID":"da26cef4-8297-4b18-a465-3d69e4ae01f8","Type":"ContainerStarted","Data":"40fdb2027cae50bf7d6419617943798f1e9221d4d43a885ea2cd093ac7fe82f0"} Nov 22 03:43:03 crc kubenswrapper[4922]: I1122 03:43:03.910112 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-89cb6b448-l5wz8" event={"ID":"c0424cb6-8fea-4f3e-a293-27d3d2477c2f","Type":"ContainerStarted","Data":"c3803f819b12c27e1ec11628c3f49f51bca418ce03f640c261d13d12b745d8e9"} Nov 22 03:43:03 crc kubenswrapper[4922]: I1122 03:43:03.912293 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d4dbc5d5b-4ppc9" event={"ID":"858fddfd-d272-4323-8c51-887b9e429b56","Type":"ContainerStarted","Data":"3de2ae7d25f3a188f43d026c82c62adb0e8b1e72d6c7f472ec2a56b7a417f58a"} Nov 22 03:43:03 crc kubenswrapper[4922]: I1122 03:43:03.914901 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b987c6977-c7m5n" event={"ID":"b4287b67-a4f2-4574-b73f-2995595f2199","Type":"ContainerStarted","Data":"9fc2eafa870260ef1631cb9affb119d557ea834610e4618b01378e0c2ebe2b69"} Nov 22 03:43:03 crc kubenswrapper[4922]: I1122 03:43:03.914930 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b987c6977-c7m5n" event={"ID":"b4287b67-a4f2-4574-b73f-2995595f2199","Type":"ContainerStarted","Data":"c0f5bf890ed771313bcf8b10a93419efecc908f9011cd5316768cd8f7d6849a7"} Nov 22 03:43:03 crc kubenswrapper[4922]: I1122 03:43:03.915069 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7b987c6977-c7m5n" podUID="b4287b67-a4f2-4574-b73f-2995595f2199" containerName="horizon-log" containerID="cri-o://c0f5bf890ed771313bcf8b10a93419efecc908f9011cd5316768cd8f7d6849a7" gracePeriod=30 Nov 22 03:43:03 crc kubenswrapper[4922]: I1122 03:43:03.915082 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7b987c6977-c7m5n" podUID="b4287b67-a4f2-4574-b73f-2995595f2199" containerName="horizon" containerID="cri-o://9fc2eafa870260ef1631cb9affb119d557ea834610e4618b01378e0c2ebe2b69" gracePeriod=30 Nov 22 03:43:03 crc kubenswrapper[4922]: I1122 03:43:03.917835 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-575b764d7f-w8r6x" event={"ID":"d8cf55ca-4cba-4931-a9e6-021fa6e53669","Type":"ContainerStarted","Data":"430825fd8cd6cdfd8bd96757bdd143b02cd1d4b6f3b4e09afb6e2931ff2cdded"} Nov 22 03:43:03 crc kubenswrapper[4922]: I1122 03:43:03.917875 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-575b764d7f-w8r6x" event={"ID":"d8cf55ca-4cba-4931-a9e6-021fa6e53669","Type":"ContainerStarted","Data":"fec82b2e00a08287fbc59f4fa83381598e1e396874b36f29c306c946c551a650"} Nov 22 03:43:03 crc kubenswrapper[4922]: I1122 03:43:03.918012 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-575b764d7f-w8r6x" podUID="d8cf55ca-4cba-4931-a9e6-021fa6e53669" containerName="horizon-log" containerID="cri-o://fec82b2e00a08287fbc59f4fa83381598e1e396874b36f29c306c946c551a650" gracePeriod=30 Nov 22 03:43:03 crc kubenswrapper[4922]: I1122 03:43:03.918051 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-575b764d7f-w8r6x" podUID="d8cf55ca-4cba-4931-a9e6-021fa6e53669" containerName="horizon" containerID="cri-o://430825fd8cd6cdfd8bd96757bdd143b02cd1d4b6f3b4e09afb6e2931ff2cdded" gracePeriod=30 Nov 22 03:43:03 crc kubenswrapper[4922]: I1122 03:43:03.931539 4922 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=8.931522257 podStartE2EDuration="8.931522257s" podCreationTimestamp="2025-11-22 03:42:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:43:03.930606085 +0000 UTC m=+3019.969127977" watchObservedRunningTime="2025-11-22 03:43:03.931522257 +0000 UTC m=+3019.970044149" Nov 22 03:43:03 crc kubenswrapper[4922]: I1122 03:43:03.963962 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7b987c6977-c7m5n" podStartSLOduration=2.840941911 podStartE2EDuration="13.963926035s" podCreationTimestamp="2025-11-22 03:42:50 +0000 UTC" firstStartedPulling="2025-11-22 03:42:51.565665276 +0000 UTC m=+3007.604187168" lastFinishedPulling="2025-11-22 03:43:02.6886494 +0000 UTC m=+3018.727171292" observedRunningTime="2025-11-22 03:43:03.957375897 +0000 UTC m=+3019.995897789" watchObservedRunningTime="2025-11-22 03:43:03.963926035 +0000 UTC m=+3020.002447927" Nov 22 03:43:03 crc kubenswrapper[4922]: I1122 03:43:03.979110 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-575b764d7f-w8r6x" podStartSLOduration=2.313076014 podStartE2EDuration="13.979087749s" podCreationTimestamp="2025-11-22 03:42:50 +0000 UTC" firstStartedPulling="2025-11-22 03:42:51.05242561 +0000 UTC m=+3007.090947502" lastFinishedPulling="2025-11-22 03:43:02.718437345 +0000 UTC m=+3018.756959237" observedRunningTime="2025-11-22 03:43:03.978906295 +0000 UTC m=+3020.017428187" watchObservedRunningTime="2025-11-22 03:43:03.979087749 +0000 UTC m=+3020.017609641" Nov 22 03:43:04 crc kubenswrapper[4922]: I1122 03:43:04.950406 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-89cb6b448-l5wz8" event={"ID":"c0424cb6-8fea-4f3e-a293-27d3d2477c2f","Type":"ContainerStarted","Data":"87e4ee4280a22360d1c6666b062710643d489e71d900e6a548294a996ff36404"} Nov 22 03:43:04 crc kubenswrapper[4922]: I1122 03:43:04.957021 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d4dbc5d5b-4ppc9" event={"ID":"858fddfd-d272-4323-8c51-887b9e429b56","Type":"ContainerStarted","Data":"4c971541c2188d2f1e9d94e917b91931e33e5da1345a7f71a81e45b051b7161d"} Nov 22 03:43:04 crc kubenswrapper[4922]: I1122 03:43:04.990614 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-89cb6b448-l5wz8" podStartSLOduration=3.9886493720000002 podStartE2EDuration="12.99059717s" podCreationTimestamp="2025-11-22 03:42:52 +0000 UTC" firstStartedPulling="2025-11-22 03:42:53.812881672 +0000 UTC m=+3009.851403564" lastFinishedPulling="2025-11-22 03:43:02.81482947 +0000 UTC m=+3018.853351362" observedRunningTime="2025-11-22 03:43:04.989588795 +0000 UTC m=+3021.028110717" watchObservedRunningTime="2025-11-22 03:43:04.99059717 +0000 UTC m=+3021.029119062" Nov 22 03:43:05 crc kubenswrapper[4922]: I1122 03:43:05.024063 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7d4dbc5d5b-4ppc9" podStartSLOduration=4.142412184 podStartE2EDuration="13.024046953s" podCreationTimestamp="2025-11-22 03:42:52 +0000 UTC" firstStartedPulling="2025-11-22 03:42:53.820451904 +0000 UTC m=+3009.858973796" lastFinishedPulling="2025-11-22 03:43:02.702086673 +0000 UTC m=+3018.740608565" observedRunningTime="2025-11-22 03:43:05.016289857 +0000 UTC m=+3021.054811789" 
watchObservedRunningTime="2025-11-22 03:43:05.024046953 +0000 UTC m=+3021.062568845" Nov 22 03:43:05 crc kubenswrapper[4922]: I1122 03:43:05.207136 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 22 03:43:05 crc kubenswrapper[4922]: I1122 03:43:05.207652 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 22 03:43:05 crc kubenswrapper[4922]: I1122 03:43:05.239109 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 22 03:43:05 crc kubenswrapper[4922]: I1122 03:43:05.254957 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 22 03:43:05 crc kubenswrapper[4922]: I1122 03:43:05.377418 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-13c6-account-create-2dlsc" Nov 22 03:43:05 crc kubenswrapper[4922]: I1122 03:43:05.469932 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kvsd\" (UniqueName: \"kubernetes.io/projected/da26cef4-8297-4b18-a465-3d69e4ae01f8-kube-api-access-2kvsd\") pod \"da26cef4-8297-4b18-a465-3d69e4ae01f8\" (UID: \"da26cef4-8297-4b18-a465-3d69e4ae01f8\") " Nov 22 03:43:05 crc kubenswrapper[4922]: I1122 03:43:05.481315 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da26cef4-8297-4b18-a465-3d69e4ae01f8-kube-api-access-2kvsd" (OuterVolumeSpecName: "kube-api-access-2kvsd") pod "da26cef4-8297-4b18-a465-3d69e4ae01f8" (UID: "da26cef4-8297-4b18-a465-3d69e4ae01f8"). InnerVolumeSpecName "kube-api-access-2kvsd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:43:05 crc kubenswrapper[4922]: I1122 03:43:05.572096 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kvsd\" (UniqueName: \"kubernetes.io/projected/da26cef4-8297-4b18-a465-3d69e4ae01f8-kube-api-access-2kvsd\") on node \"crc\" DevicePath \"\"" Nov 22 03:43:05 crc kubenswrapper[4922]: I1122 03:43:05.971697 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-13c6-account-create-2dlsc" Nov 22 03:43:05 crc kubenswrapper[4922]: I1122 03:43:05.971703 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-13c6-account-create-2dlsc" event={"ID":"da26cef4-8297-4b18-a465-3d69e4ae01f8","Type":"ContainerDied","Data":"40fdb2027cae50bf7d6419617943798f1e9221d4d43a885ea2cd093ac7fe82f0"} Nov 22 03:43:05 crc kubenswrapper[4922]: I1122 03:43:05.973885 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40fdb2027cae50bf7d6419617943798f1e9221d4d43a885ea2cd093ac7fe82f0" Nov 22 03:43:05 crc kubenswrapper[4922]: I1122 03:43:05.975510 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 22 03:43:05 crc kubenswrapper[4922]: I1122 03:43:05.975814 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 22 03:43:06 crc kubenswrapper[4922]: I1122 03:43:06.233058 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 22 03:43:06 crc kubenswrapper[4922]: I1122 03:43:06.234148 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 22 03:43:06 crc kubenswrapper[4922]: I1122 03:43:06.270552 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 22 03:43:06 crc kubenswrapper[4922]: I1122 03:43:06.285718 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 22 03:43:06 crc kubenswrapper[4922]: I1122 03:43:06.983210 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 22 03:43:06 crc kubenswrapper[4922]: I1122 03:43:06.983364 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 22 03:43:07 crc kubenswrapper[4922]: I1122 03:43:07.993431 4922 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 22 03:43:07 crc kubenswrapper[4922]: I1122 03:43:07.994896 4922 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 22 03:43:09 crc kubenswrapper[4922]: I1122 03:43:09.001469 4922 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 22 03:43:10 crc kubenswrapper[4922]: I1122 03:43:10.564703 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-575b764d7f-w8r6x" Nov 22 03:43:10 crc kubenswrapper[4922]: I1122 03:43:10.736993 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-wfl57"] Nov 22 03:43:10 crc kubenswrapper[4922]: E1122 03:43:10.738001 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da26cef4-8297-4b18-a465-3d69e4ae01f8" containerName="mariadb-account-create" Nov 22 03:43:10 crc kubenswrapper[4922]: I1122 03:43:10.738106 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="da26cef4-8297-4b18-a465-3d69e4ae01f8" containerName="mariadb-account-create" Nov 22 03:43:10 crc kubenswrapper[4922]: I1122 03:43:10.738415 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="da26cef4-8297-4b18-a465-3d69e4ae01f8" containerName="mariadb-account-create" Nov 22 03:43:10 crc kubenswrapper[4922]: I1122 03:43:10.739312 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-wfl57" Nov 22 03:43:10 crc kubenswrapper[4922]: I1122 03:43:10.743153 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-9ftks" Nov 22 03:43:10 crc kubenswrapper[4922]: I1122 03:43:10.747065 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Nov 22 03:43:10 crc kubenswrapper[4922]: I1122 03:43:10.752009 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-wfl57"] Nov 22 03:43:10 crc kubenswrapper[4922]: I1122 03:43:10.811621 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/f7904f33-eb73-4d09-b13f-4faf6ec90328-job-config-data\") pod \"manila-db-sync-wfl57\" (UID: \"f7904f33-eb73-4d09-b13f-4faf6ec90328\") " pod="openstack/manila-db-sync-wfl57" Nov 22 03:43:10 crc kubenswrapper[4922]: I1122 03:43:10.811815 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mvhv\" (UniqueName: \"kubernetes.io/projected/f7904f33-eb73-4d09-b13f-4faf6ec90328-kube-api-access-7mvhv\") pod \"manila-db-sync-wfl57\" (UID: \"f7904f33-eb73-4d09-b13f-4faf6ec90328\") " pod="openstack/manila-db-sync-wfl57" Nov 22 03:43:10 crc kubenswrapper[4922]: I1122 03:43:10.811912 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7904f33-eb73-4d09-b13f-4faf6ec90328-config-data\") pod \"manila-db-sync-wfl57\" (UID: \"f7904f33-eb73-4d09-b13f-4faf6ec90328\") " pod="openstack/manila-db-sync-wfl57" Nov 22 03:43:10 crc kubenswrapper[4922]: I1122 03:43:10.811959 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7904f33-eb73-4d09-b13f-4faf6ec90328-combined-ca-bundle\") pod \"manila-db-sync-wfl57\" (UID: \"f7904f33-eb73-4d09-b13f-4faf6ec90328\") " pod="openstack/manila-db-sync-wfl57" Nov 22 03:43:10 crc kubenswrapper[4922]: I1122 03:43:10.823892 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7b987c6977-c7m5n" Nov 22 03:43:10 crc kubenswrapper[4922]: I1122 03:43:10.914944 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mvhv\" (UniqueName: \"kubernetes.io/projected/f7904f33-eb73-4d09-b13f-4faf6ec90328-kube-api-access-7mvhv\") pod \"manila-db-sync-wfl57\" (UID: \"f7904f33-eb73-4d09-b13f-4faf6ec90328\") " pod="openstack/manila-db-sync-wfl57" Nov 22 03:43:10 crc kubenswrapper[4922]: I1122 03:43:10.915014 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7904f33-eb73-4d09-b13f-4faf6ec90328-config-data\") pod \"manila-db-sync-wfl57\" (UID: \"f7904f33-eb73-4d09-b13f-4faf6ec90328\") " pod="openstack/manila-db-sync-wfl57" Nov 22 03:43:10 crc kubenswrapper[4922]: I1122 03:43:10.915061 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7904f33-eb73-4d09-b13f-4faf6ec90328-combined-ca-bundle\") pod \"manila-db-sync-wfl57\" (UID: \"f7904f33-eb73-4d09-b13f-4faf6ec90328\") " pod="openstack/manila-db-sync-wfl57" Nov 22 03:43:10 crc kubenswrapper[4922]: I1122 03:43:10.916237 4922 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/f7904f33-eb73-4d09-b13f-4faf6ec90328-job-config-data\") pod \"manila-db-sync-wfl57\" (UID: \"f7904f33-eb73-4d09-b13f-4faf6ec90328\") " pod="openstack/manila-db-sync-wfl57" Nov 22 03:43:10 crc kubenswrapper[4922]: I1122 03:43:10.922788 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7904f33-eb73-4d09-b13f-4faf6ec90328-config-data\") pod \"manila-db-sync-wfl57\" (UID: \"f7904f33-eb73-4d09-b13f-4faf6ec90328\") " pod="openstack/manila-db-sync-wfl57" Nov 22 03:43:10 crc kubenswrapper[4922]: I1122 03:43:10.923440 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7904f33-eb73-4d09-b13f-4faf6ec90328-combined-ca-bundle\") pod \"manila-db-sync-wfl57\" (UID: \"f7904f33-eb73-4d09-b13f-4faf6ec90328\") " pod="openstack/manila-db-sync-wfl57" Nov 22 03:43:10 crc kubenswrapper[4922]: I1122 03:43:10.939905 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/f7904f33-eb73-4d09-b13f-4faf6ec90328-job-config-data\") pod \"manila-db-sync-wfl57\" (UID: \"f7904f33-eb73-4d09-b13f-4faf6ec90328\") " pod="openstack/manila-db-sync-wfl57" Nov 22 03:43:10 crc kubenswrapper[4922]: I1122 03:43:10.942051 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mvhv\" (UniqueName: \"kubernetes.io/projected/f7904f33-eb73-4d09-b13f-4faf6ec90328-kube-api-access-7mvhv\") pod \"manila-db-sync-wfl57\" (UID: \"f7904f33-eb73-4d09-b13f-4faf6ec90328\") " pod="openstack/manila-db-sync-wfl57" Nov 22 03:43:11 crc kubenswrapper[4922]: I1122 03:43:11.072022 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-wfl57" Nov 22 03:43:11 crc kubenswrapper[4922]: I1122 03:43:11.753349 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-wfl57"] Nov 22 03:43:12 crc kubenswrapper[4922]: I1122 03:43:12.029990 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-wfl57" event={"ID":"f7904f33-eb73-4d09-b13f-4faf6ec90328","Type":"ContainerStarted","Data":"5cd282f87c9091ae001d1ab0978d3a4df6e299ec09851e402930280718ce9fe4"} Nov 22 03:43:12 crc kubenswrapper[4922]: I1122 03:43:12.852759 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 22 03:43:12 crc kubenswrapper[4922]: I1122 03:43:12.857203 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 22 03:43:12 crc kubenswrapper[4922]: I1122 03:43:12.897594 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 22 03:43:12 crc kubenswrapper[4922]: I1122 03:43:12.897935 4922 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 22 03:43:12 crc kubenswrapper[4922]: I1122 03:43:12.913612 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 22 03:43:13 crc kubenswrapper[4922]: I1122 03:43:13.200616 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7d4dbc5d5b-4ppc9" Nov 22 03:43:13 crc kubenswrapper[4922]: I1122 03:43:13.200654 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7d4dbc5d5b-4ppc9" Nov 22 03:43:13 crc kubenswrapper[4922]: I1122 03:43:13.202587 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7d4dbc5d5b-4ppc9" podUID="858fddfd-d272-4323-8c51-887b9e429b56" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.238:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.238:8443: connect: connection refused" Nov 22 03:43:13 crc kubenswrapper[4922]: I1122 03:43:13.222922 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-89cb6b448-l5wz8" Nov 22 03:43:13 crc kubenswrapper[4922]: I1122 03:43:13.222980 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-89cb6b448-l5wz8" Nov 22 03:43:13 crc kubenswrapper[4922]: I1122 03:43:13.225176 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-89cb6b448-l5wz8" podUID="c0424cb6-8fea-4f3e-a293-27d3d2477c2f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.239:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.239:8443: connect: connection refused" Nov 22 03:43:13 crc kubenswrapper[4922]: I1122 03:43:13.302492 4922 scope.go:117] "RemoveContainer" containerID="3902a53e42446835819551f58e9b8e408324dfa52a1818359fa0616ad95080dc" Nov 22 03:43:13 crc kubenswrapper[4922]: E1122 03:43:13.302753 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" 
Nov 22 03:43:18 crc kubenswrapper[4922]: I1122 03:43:18.102956 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-wfl57" event={"ID":"f7904f33-eb73-4d09-b13f-4faf6ec90328","Type":"ContainerStarted","Data":"dfc1b5b594bee7f44fe372cacbf5ee614e5400f7ae9f7868f07414417010e878"} Nov 22 03:43:18 crc kubenswrapper[4922]: I1122 03:43:18.125021 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-wfl57" podStartSLOduration=2.273456835 podStartE2EDuration="8.124994468s" podCreationTimestamp="2025-11-22 03:43:10 +0000 UTC" firstStartedPulling="2025-11-22 03:43:11.727916644 +0000 UTC m=+3027.766438536" lastFinishedPulling="2025-11-22 03:43:17.579454277 +0000 UTC m=+3033.617976169" observedRunningTime="2025-11-22 03:43:18.116884833 +0000 UTC m=+3034.155406785" watchObservedRunningTime="2025-11-22 03:43:18.124994468 +0000 UTC m=+3034.163516400" Nov 22 03:43:23 crc kubenswrapper[4922]: I1122 03:43:23.197672 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7d4dbc5d5b-4ppc9" podUID="858fddfd-d272-4323-8c51-887b9e429b56" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.238:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.238:8443: connect: connection refused" Nov 22 03:43:23 crc kubenswrapper[4922]: I1122 03:43:23.221440 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-89cb6b448-l5wz8" podUID="c0424cb6-8fea-4f3e-a293-27d3d2477c2f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.239:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.239:8443: connect: connection refused" Nov 22 03:43:24 crc kubenswrapper[4922]: I1122 03:43:24.301507 4922 scope.go:117] "RemoveContainer" containerID="3902a53e42446835819551f58e9b8e408324dfa52a1818359fa0616ad95080dc" Nov 22 03:43:24 crc kubenswrapper[4922]: E1122 03:43:24.302332 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:43:31 crc kubenswrapper[4922]: I1122 03:43:31.271792 4922 generic.go:334] "Generic (PLEG): container finished" podID="f7904f33-eb73-4d09-b13f-4faf6ec90328" containerID="dfc1b5b594bee7f44fe372cacbf5ee614e5400f7ae9f7868f07414417010e878" exitCode=0 Nov 22 03:43:31 crc kubenswrapper[4922]: I1122 03:43:31.271934 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-wfl57" event={"ID":"f7904f33-eb73-4d09-b13f-4faf6ec90328","Type":"ContainerDied","Data":"dfc1b5b594bee7f44fe372cacbf5ee614e5400f7ae9f7868f07414417010e878"} Nov 22 03:43:32 crc kubenswrapper[4922]: I1122 03:43:32.700075 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-wfl57" Nov 22 03:43:32 crc kubenswrapper[4922]: I1122 03:43:32.902655 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mvhv\" (UniqueName: \"kubernetes.io/projected/f7904f33-eb73-4d09-b13f-4faf6ec90328-kube-api-access-7mvhv\") pod \"f7904f33-eb73-4d09-b13f-4faf6ec90328\" (UID: \"f7904f33-eb73-4d09-b13f-4faf6ec90328\") " Nov 22 03:43:32 crc kubenswrapper[4922]: I1122 03:43:32.902811 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/f7904f33-eb73-4d09-b13f-4faf6ec90328-job-config-data\") pod \"f7904f33-eb73-4d09-b13f-4faf6ec90328\" (UID: \"f7904f33-eb73-4d09-b13f-4faf6ec90328\") " Nov 22 03:43:32 crc kubenswrapper[4922]: I1122 03:43:32.902909 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7904f33-eb73-4d09-b13f-4faf6ec90328-config-data\") pod \"f7904f33-eb73-4d09-b13f-4faf6ec90328\" (UID: \"f7904f33-eb73-4d09-b13f-4faf6ec90328\") " Nov 22 03:43:32 crc kubenswrapper[4922]: I1122 03:43:32.903054 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7904f33-eb73-4d09-b13f-4faf6ec90328-combined-ca-bundle\") pod \"f7904f33-eb73-4d09-b13f-4faf6ec90328\" (UID: \"f7904f33-eb73-4d09-b13f-4faf6ec90328\") " Nov 22 03:43:32 crc kubenswrapper[4922]: I1122 03:43:32.908958 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7904f33-eb73-4d09-b13f-4faf6ec90328-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "f7904f33-eb73-4d09-b13f-4faf6ec90328" (UID: "f7904f33-eb73-4d09-b13f-4faf6ec90328"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:43:32 crc kubenswrapper[4922]: I1122 03:43:32.909357 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7904f33-eb73-4d09-b13f-4faf6ec90328-kube-api-access-7mvhv" (OuterVolumeSpecName: "kube-api-access-7mvhv") pod "f7904f33-eb73-4d09-b13f-4faf6ec90328" (UID: "f7904f33-eb73-4d09-b13f-4faf6ec90328"). InnerVolumeSpecName "kube-api-access-7mvhv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:43:32 crc kubenswrapper[4922]: I1122 03:43:32.911987 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7904f33-eb73-4d09-b13f-4faf6ec90328-config-data" (OuterVolumeSpecName: "config-data") pod "f7904f33-eb73-4d09-b13f-4faf6ec90328" (UID: "f7904f33-eb73-4d09-b13f-4faf6ec90328"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:43:32 crc kubenswrapper[4922]: I1122 03:43:32.940228 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7904f33-eb73-4d09-b13f-4faf6ec90328-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f7904f33-eb73-4d09-b13f-4faf6ec90328" (UID: "f7904f33-eb73-4d09-b13f-4faf6ec90328"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.004825 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7904f33-eb73-4d09-b13f-4faf6ec90328-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.004885 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7904f33-eb73-4d09-b13f-4faf6ec90328-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.004899 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mvhv\" (UniqueName: \"kubernetes.io/projected/f7904f33-eb73-4d09-b13f-4faf6ec90328-kube-api-access-7mvhv\") on node \"crc\" DevicePath \"\"" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.004915 4922 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/f7904f33-eb73-4d09-b13f-4faf6ec90328-job-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.289981 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-wfl57" event={"ID":"f7904f33-eb73-4d09-b13f-4faf6ec90328","Type":"ContainerDied","Data":"5cd282f87c9091ae001d1ab0978d3a4df6e299ec09851e402930280718ce9fe4"} Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.290023 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5cd282f87c9091ae001d1ab0978d3a4df6e299ec09851e402930280718ce9fe4" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.290075 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-wfl57" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.504549 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Nov 22 03:43:33 crc kubenswrapper[4922]: E1122 03:43:33.505216 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7904f33-eb73-4d09-b13f-4faf6ec90328" containerName="manila-db-sync" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.505261 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7904f33-eb73-4d09-b13f-4faf6ec90328" containerName="manila-db-sync" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.505642 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7904f33-eb73-4d09-b13f-4faf6ec90328" containerName="manila-db-sync" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.507068 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.510877 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.511905 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.512793 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.512951 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-9ftks" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.520897 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.615298 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.616667 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fc47a6e0-943d-4503-b268-0ad96699e17c-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"fc47a6e0-943d-4503-b268-0ad96699e17c\") " pod="openstack/manila-scheduler-0" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.616724 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8cgv\" (UniqueName: \"kubernetes.io/projected/fc47a6e0-943d-4503-b268-0ad96699e17c-kube-api-access-m8cgv\") pod \"manila-scheduler-0\" (UID: \"fc47a6e0-943d-4503-b268-0ad96699e17c\") " pod="openstack/manila-scheduler-0" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.617067 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc47a6e0-943d-4503-b268-0ad96699e17c-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"fc47a6e0-943d-4503-b268-0ad96699e17c\") " pod="openstack/manila-scheduler-0" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.617177 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc47a6e0-943d-4503-b268-0ad96699e17c-scripts\") pod \"manila-scheduler-0\" (UID: \"fc47a6e0-943d-4503-b268-0ad96699e17c\") " pod="openstack/manila-scheduler-0" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.617198 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc47a6e0-943d-4503-b268-0ad96699e17c-config-data\") pod \"manila-scheduler-0\" (UID: \"fc47a6e0-943d-4503-b268-0ad96699e17c\") " pod="openstack/manila-scheduler-0" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.617214 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fc47a6e0-943d-4503-b268-0ad96699e17c-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"fc47a6e0-943d-4503-b268-0ad96699e17c\") " pod="openstack/manila-scheduler-0" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.619160 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.621465 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.626745 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.719954 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc47a6e0-943d-4503-b268-0ad96699e17c-scripts\") pod \"manila-scheduler-0\" (UID: \"fc47a6e0-943d-4503-b268-0ad96699e17c\") " pod="openstack/manila-scheduler-0" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.719991 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc47a6e0-943d-4503-b268-0ad96699e17c-config-data\") pod \"manila-scheduler-0\" (UID: \"fc47a6e0-943d-4503-b268-0ad96699e17c\") " pod="openstack/manila-scheduler-0" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.720012 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fc47a6e0-943d-4503-b268-0ad96699e17c-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"fc47a6e0-943d-4503-b268-0ad96699e17c\") " pod="openstack/manila-scheduler-0" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.720093 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fc47a6e0-943d-4503-b268-0ad96699e17c-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"fc47a6e0-943d-4503-b268-0ad96699e17c\") " pod="openstack/manila-scheduler-0" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.720118 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8cgv\" (UniqueName: \"kubernetes.io/projected/fc47a6e0-943d-4503-b268-0ad96699e17c-kube-api-access-m8cgv\") pod \"manila-scheduler-0\" (UID: \"fc47a6e0-943d-4503-b268-0ad96699e17c\") " pod="openstack/manila-scheduler-0" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.720189 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc47a6e0-943d-4503-b268-0ad96699e17c-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"fc47a6e0-943d-4503-b268-0ad96699e17c\") " pod="openstack/manila-scheduler-0" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.722104 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fc47a6e0-943d-4503-b268-0ad96699e17c-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"fc47a6e0-943d-4503-b268-0ad96699e17c\") " pod="openstack/manila-scheduler-0" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.726801 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc47a6e0-943d-4503-b268-0ad96699e17c-scripts\") pod \"manila-scheduler-0\" (UID: \"fc47a6e0-943d-4503-b268-0ad96699e17c\") " pod="openstack/manila-scheduler-0" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.728408 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fc47a6e0-943d-4503-b268-0ad96699e17c-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"fc47a6e0-943d-4503-b268-0ad96699e17c\") " pod="openstack/manila-scheduler-0" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.729613 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc47a6e0-943d-4503-b268-0ad96699e17c-config-data\") pod \"manila-scheduler-0\" (UID: \"fc47a6e0-943d-4503-b268-0ad96699e17c\") " pod="openstack/manila-scheduler-0" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.732360 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fc47a6e0-943d-4503-b268-0ad96699e17c-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"fc47a6e0-943d-4503-b268-0ad96699e17c\") " pod="openstack/manila-scheduler-0" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.746550 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8cgv\" (UniqueName: \"kubernetes.io/projected/fc47a6e0-943d-4503-b268-0ad96699e17c-kube-api-access-m8cgv\") pod \"manila-scheduler-0\" (UID: \"fc47a6e0-943d-4503-b268-0ad96699e17c\") " pod="openstack/manila-scheduler-0" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.782533 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76b5fdb995-9hspz"] Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.790192 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76b5fdb995-9hspz" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.821512 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76b5fdb995-9hspz"] Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.825632 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b009e973-c6d1-4eca-a06a-ed15c5ec10ad-dns-svc\") pod \"dnsmasq-dns-76b5fdb995-9hspz\" (UID: \"b009e973-c6d1-4eca-a06a-ed15c5ec10ad\") " pod="openstack/dnsmasq-dns-76b5fdb995-9hspz" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.825689 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chkjk\" (UniqueName: \"kubernetes.io/projected/77912292-bd24-4c7e-ac39-015abf688ebb-kube-api-access-chkjk\") pod \"manila-share-share1-0\" (UID: \"77912292-bd24-4c7e-ac39-015abf688ebb\") " pod="openstack/manila-share-share1-0" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.825723 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8f4x\" (UniqueName: \"kubernetes.io/projected/b009e973-c6d1-4eca-a06a-ed15c5ec10ad-kube-api-access-c8f4x\") pod \"dnsmasq-dns-76b5fdb995-9hspz\" (UID: \"b009e973-c6d1-4eca-a06a-ed15c5ec10ad\") " pod="openstack/dnsmasq-dns-76b5fdb995-9hspz" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.825754 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/77912292-bd24-4c7e-ac39-015abf688ebb-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"77912292-bd24-4c7e-ac39-015abf688ebb\") " pod="openstack/manila-share-share1-0" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.825776 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b009e973-c6d1-4eca-a06a-ed15c5ec10ad-ovsdbserver-nb\") pod \"dnsmasq-dns-76b5fdb995-9hspz\" (UID: \"b009e973-c6d1-4eca-a06a-ed15c5ec10ad\") " pod="openstack/dnsmasq-dns-76b5fdb995-9hspz" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.825801 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b009e973-c6d1-4eca-a06a-ed15c5ec10ad-ovsdbserver-sb\") pod \"dnsmasq-dns-76b5fdb995-9hspz\" (UID: \"b009e973-c6d1-4eca-a06a-ed15c5ec10ad\") " pod="openstack/dnsmasq-dns-76b5fdb995-9hspz" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.825835 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b009e973-c6d1-4eca-a06a-ed15c5ec10ad-openstack-edpm-ipam\") pod \"dnsmasq-dns-76b5fdb995-9hspz\" (UID: \"b009e973-c6d1-4eca-a06a-ed15c5ec10ad\") " pod="openstack/dnsmasq-dns-76b5fdb995-9hspz" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.825905 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77912292-bd24-4c7e-ac39-015abf688ebb-config-data\") pod \"manila-share-share1-0\" (UID: \"77912292-bd24-4c7e-ac39-015abf688ebb\") " pod="openstack/manila-share-share1-0" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.825926 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b009e973-c6d1-4eca-a06a-ed15c5ec10ad-config\") pod \"dnsmasq-dns-76b5fdb995-9hspz\" (UID: \"b009e973-c6d1-4eca-a06a-ed15c5ec10ad\") " pod="openstack/dnsmasq-dns-76b5fdb995-9hspz" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.825965 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77912292-bd24-4c7e-ac39-015abf688ebb-scripts\") pod \"manila-share-share1-0\" (UID: \"77912292-bd24-4c7e-ac39-015abf688ebb\") " pod="openstack/manila-share-share1-0" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.825985 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/77912292-bd24-4c7e-ac39-015abf688ebb-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"77912292-bd24-4c7e-ac39-015abf688ebb\") " pod="openstack/manila-share-share1-0" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.826004 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/77912292-bd24-4c7e-ac39-015abf688ebb-ceph\") pod \"manila-share-share1-0\" (UID: \"77912292-bd24-4c7e-ac39-015abf688ebb\") " pod="openstack/manila-share-share1-0" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.826033 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77912292-bd24-4c7e-ac39-015abf688ebb-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"77912292-bd24-4c7e-ac39-015abf688ebb\") " pod="openstack/manila-share-share1-0" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.826049 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/77912292-bd24-4c7e-ac39-015abf688ebb-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"77912292-bd24-4c7e-ac39-015abf688ebb\") " pod="openstack/manila-share-share1-0" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.877333 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.932958 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77912292-bd24-4c7e-ac39-015abf688ebb-scripts\") pod \"manila-share-share1-0\" (UID: \"77912292-bd24-4c7e-ac39-015abf688ebb\") " pod="openstack/manila-share-share1-0" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.933017 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/77912292-bd24-4c7e-ac39-015abf688ebb-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"77912292-bd24-4c7e-ac39-015abf688ebb\") " pod="openstack/manila-share-share1-0" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.933048 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/77912292-bd24-4c7e-ac39-015abf688ebb-ceph\") pod \"manila-share-share1-0\" (UID: \"77912292-bd24-4c7e-ac39-015abf688ebb\") " pod="openstack/manila-share-share1-0" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.933088 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77912292-bd24-4c7e-ac39-015abf688ebb-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"77912292-bd24-4c7e-ac39-015abf688ebb\") " pod="openstack/manila-share-share1-0" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.933104 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/77912292-bd24-4c7e-ac39-015abf688ebb-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"77912292-bd24-4c7e-ac39-015abf688ebb\") " pod="openstack/manila-share-share1-0" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.933147 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b009e973-c6d1-4eca-a06a-ed15c5ec10ad-dns-svc\") pod \"dnsmasq-dns-76b5fdb995-9hspz\" (UID: \"b009e973-c6d1-4eca-a06a-ed15c5ec10ad\") " pod="openstack/dnsmasq-dns-76b5fdb995-9hspz" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.933170 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chkjk\" (UniqueName: \"kubernetes.io/projected/77912292-bd24-4c7e-ac39-015abf688ebb-kube-api-access-chkjk\") pod \"manila-share-share1-0\" (UID: \"77912292-bd24-4c7e-ac39-015abf688ebb\") " pod="openstack/manila-share-share1-0" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.933199 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8f4x\" (UniqueName: \"kubernetes.io/projected/b009e973-c6d1-4eca-a06a-ed15c5ec10ad-kube-api-access-c8f4x\") pod \"dnsmasq-dns-76b5fdb995-9hspz\" (UID: \"b009e973-c6d1-4eca-a06a-ed15c5ec10ad\") " pod="openstack/dnsmasq-dns-76b5fdb995-9hspz" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 
03:43:33.933218 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/77912292-bd24-4c7e-ac39-015abf688ebb-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"77912292-bd24-4c7e-ac39-015abf688ebb\") " pod="openstack/manila-share-share1-0" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.933236 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b009e973-c6d1-4eca-a06a-ed15c5ec10ad-ovsdbserver-nb\") pod \"dnsmasq-dns-76b5fdb995-9hspz\" (UID: \"b009e973-c6d1-4eca-a06a-ed15c5ec10ad\") " pod="openstack/dnsmasq-dns-76b5fdb995-9hspz" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.933257 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b009e973-c6d1-4eca-a06a-ed15c5ec10ad-ovsdbserver-sb\") pod \"dnsmasq-dns-76b5fdb995-9hspz\" (UID: \"b009e973-c6d1-4eca-a06a-ed15c5ec10ad\") " pod="openstack/dnsmasq-dns-76b5fdb995-9hspz" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.933295 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b009e973-c6d1-4eca-a06a-ed15c5ec10ad-openstack-edpm-ipam\") pod \"dnsmasq-dns-76b5fdb995-9hspz\" (UID: \"b009e973-c6d1-4eca-a06a-ed15c5ec10ad\") " pod="openstack/dnsmasq-dns-76b5fdb995-9hspz" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.933342 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77912292-bd24-4c7e-ac39-015abf688ebb-config-data\") pod \"manila-share-share1-0\" (UID: \"77912292-bd24-4c7e-ac39-015abf688ebb\") " pod="openstack/manila-share-share1-0" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.933359 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b009e973-c6d1-4eca-a06a-ed15c5ec10ad-config\") pod \"dnsmasq-dns-76b5fdb995-9hspz\" (UID: \"b009e973-c6d1-4eca-a06a-ed15c5ec10ad\") " pod="openstack/dnsmasq-dns-76b5fdb995-9hspz" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.934354 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b009e973-c6d1-4eca-a06a-ed15c5ec10ad-config\") pod \"dnsmasq-dns-76b5fdb995-9hspz\" (UID: \"b009e973-c6d1-4eca-a06a-ed15c5ec10ad\") " pod="openstack/dnsmasq-dns-76b5fdb995-9hspz" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.938351 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/77912292-bd24-4c7e-ac39-015abf688ebb-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"77912292-bd24-4c7e-ac39-015abf688ebb\") " pod="openstack/manila-share-share1-0" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.941739 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b009e973-c6d1-4eca-a06a-ed15c5ec10ad-ovsdbserver-sb\") pod \"dnsmasq-dns-76b5fdb995-9hspz\" (UID: \"b009e973-c6d1-4eca-a06a-ed15c5ec10ad\") " pod="openstack/dnsmasq-dns-76b5fdb995-9hspz" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.942398 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/configmap/b009e973-c6d1-4eca-a06a-ed15c5ec10ad-openstack-edpm-ipam\") pod \"dnsmasq-dns-76b5fdb995-9hspz\" (UID: \"b009e973-c6d1-4eca-a06a-ed15c5ec10ad\") " pod="openstack/dnsmasq-dns-76b5fdb995-9hspz" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.942990 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b009e973-c6d1-4eca-a06a-ed15c5ec10ad-ovsdbserver-nb\") pod \"dnsmasq-dns-76b5fdb995-9hspz\" (UID: \"b009e973-c6d1-4eca-a06a-ed15c5ec10ad\") " pod="openstack/dnsmasq-dns-76b5fdb995-9hspz" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.943584 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/77912292-bd24-4c7e-ac39-015abf688ebb-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"77912292-bd24-4c7e-ac39-015abf688ebb\") " pod="openstack/manila-share-share1-0" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.945557 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b009e973-c6d1-4eca-a06a-ed15c5ec10ad-dns-svc\") pod \"dnsmasq-dns-76b5fdb995-9hspz\" (UID: \"b009e973-c6d1-4eca-a06a-ed15c5ec10ad\") " pod="openstack/dnsmasq-dns-76b5fdb995-9hspz" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.947298 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/77912292-bd24-4c7e-ac39-015abf688ebb-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"77912292-bd24-4c7e-ac39-015abf688ebb\") " pod="openstack/manila-share-share1-0" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.948304 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/77912292-bd24-4c7e-ac39-015abf688ebb-ceph\") pod \"manila-share-share1-0\" (UID: \"77912292-bd24-4c7e-ac39-015abf688ebb\") " pod="openstack/manila-share-share1-0" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.949141 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77912292-bd24-4c7e-ac39-015abf688ebb-scripts\") pod \"manila-share-share1-0\" (UID: \"77912292-bd24-4c7e-ac39-015abf688ebb\") " pod="openstack/manila-share-share1-0" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.951567 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77912292-bd24-4c7e-ac39-015abf688ebb-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"77912292-bd24-4c7e-ac39-015abf688ebb\") " pod="openstack/manila-share-share1-0" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.974557 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8f4x\" (UniqueName: \"kubernetes.io/projected/b009e973-c6d1-4eca-a06a-ed15c5ec10ad-kube-api-access-c8f4x\") pod \"dnsmasq-dns-76b5fdb995-9hspz\" (UID: \"b009e973-c6d1-4eca-a06a-ed15c5ec10ad\") " pod="openstack/dnsmasq-dns-76b5fdb995-9hspz" Nov 22 03:43:33 crc kubenswrapper[4922]: I1122 03:43:33.979160 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77912292-bd24-4c7e-ac39-015abf688ebb-config-data\") pod \"manila-share-share1-0\" (UID: \"77912292-bd24-4c7e-ac39-015abf688ebb\") " pod="openstack/manila-share-share1-0" Nov 22 03:43:33 crc 
kubenswrapper[4922]: I1122 03:43:33.979421 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chkjk\" (UniqueName: \"kubernetes.io/projected/77912292-bd24-4c7e-ac39-015abf688ebb-kube-api-access-chkjk\") pod \"manila-share-share1-0\" (UID: \"77912292-bd24-4c7e-ac39-015abf688ebb\") " pod="openstack/manila-share-share1-0" Nov 22 03:43:34 crc kubenswrapper[4922]: I1122 03:43:34.105186 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Nov 22 03:43:34 crc kubenswrapper[4922]: I1122 03:43:34.134169 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Nov 22 03:43:34 crc kubenswrapper[4922]: I1122 03:43:34.142888 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Nov 22 03:43:34 crc kubenswrapper[4922]: I1122 03:43:34.163420 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76b5fdb995-9hspz" Nov 22 03:43:34 crc kubenswrapper[4922]: I1122 03:43:34.202303 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Nov 22 03:43:34 crc kubenswrapper[4922]: I1122 03:43:34.249562 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Nov 22 03:43:34 crc kubenswrapper[4922]: I1122 03:43:34.253407 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a2a56a7e-73d3-4545-b600-abc78cc08d9b-config-data-custom\") pod \"manila-api-0\" (UID: \"a2a56a7e-73d3-4545-b600-abc78cc08d9b\") " pod="openstack/manila-api-0" Nov 22 03:43:34 crc kubenswrapper[4922]: I1122 03:43:34.253447 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a2a56a7e-73d3-4545-b600-abc78cc08d9b-etc-machine-id\") pod \"manila-api-0\" (UID: \"a2a56a7e-73d3-4545-b600-abc78cc08d9b\") " pod="openstack/manila-api-0" Nov 22 03:43:34 crc kubenswrapper[4922]: I1122 03:43:34.253658 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2a56a7e-73d3-4545-b600-abc78cc08d9b-config-data\") pod \"manila-api-0\" (UID: \"a2a56a7e-73d3-4545-b600-abc78cc08d9b\") " pod="openstack/manila-api-0" Nov 22 03:43:34 crc kubenswrapper[4922]: I1122 03:43:34.254109 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2a56a7e-73d3-4545-b600-abc78cc08d9b-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"a2a56a7e-73d3-4545-b600-abc78cc08d9b\") " pod="openstack/manila-api-0" Nov 22 03:43:34 crc kubenswrapper[4922]: I1122 03:43:34.254136 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2a56a7e-73d3-4545-b600-abc78cc08d9b-scripts\") pod \"manila-api-0\" (UID: \"a2a56a7e-73d3-4545-b600-abc78cc08d9b\") " pod="openstack/manila-api-0" Nov 22 03:43:34 crc kubenswrapper[4922]: I1122 03:43:34.254426 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbr8v\" (UniqueName: \"kubernetes.io/projected/a2a56a7e-73d3-4545-b600-abc78cc08d9b-kube-api-access-sbr8v\") pod \"manila-api-0\" (UID: 
\"a2a56a7e-73d3-4545-b600-abc78cc08d9b\") " pod="openstack/manila-api-0" Nov 22 03:43:34 crc kubenswrapper[4922]: I1122 03:43:34.254451 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2a56a7e-73d3-4545-b600-abc78cc08d9b-logs\") pod \"manila-api-0\" (UID: \"a2a56a7e-73d3-4545-b600-abc78cc08d9b\") " pod="openstack/manila-api-0" Nov 22 03:43:35 crc kubenswrapper[4922]: I1122 03:43:34.346361 4922 generic.go:334] "Generic (PLEG): container finished" podID="b4287b67-a4f2-4574-b73f-2995595f2199" containerID="c0f5bf890ed771313bcf8b10a93419efecc908f9011cd5316768cd8f7d6849a7" exitCode=137 Nov 22 03:43:35 crc kubenswrapper[4922]: I1122 03:43:34.346426 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b987c6977-c7m5n" event={"ID":"b4287b67-a4f2-4574-b73f-2995595f2199","Type":"ContainerDied","Data":"c0f5bf890ed771313bcf8b10a93419efecc908f9011cd5316768cd8f7d6849a7"} Nov 22 03:43:35 crc kubenswrapper[4922]: I1122 03:43:34.356264 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2a56a7e-73d3-4545-b600-abc78cc08d9b-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"a2a56a7e-73d3-4545-b600-abc78cc08d9b\") " pod="openstack/manila-api-0" Nov 22 03:43:35 crc kubenswrapper[4922]: I1122 03:43:34.356377 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2a56a7e-73d3-4545-b600-abc78cc08d9b-scripts\") pod \"manila-api-0\" (UID: \"a2a56a7e-73d3-4545-b600-abc78cc08d9b\") " pod="openstack/manila-api-0" Nov 22 03:43:35 crc kubenswrapper[4922]: I1122 03:43:34.357146 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbr8v\" (UniqueName: \"kubernetes.io/projected/a2a56a7e-73d3-4545-b600-abc78cc08d9b-kube-api-access-sbr8v\") pod \"manila-api-0\" (UID: \"a2a56a7e-73d3-4545-b600-abc78cc08d9b\") " pod="openstack/manila-api-0" Nov 22 03:43:35 crc kubenswrapper[4922]: I1122 03:43:34.357175 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2a56a7e-73d3-4545-b600-abc78cc08d9b-logs\") pod \"manila-api-0\" (UID: \"a2a56a7e-73d3-4545-b600-abc78cc08d9b\") " pod="openstack/manila-api-0" Nov 22 03:43:35 crc kubenswrapper[4922]: I1122 03:43:34.357255 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a2a56a7e-73d3-4545-b600-abc78cc08d9b-config-data-custom\") pod \"manila-api-0\" (UID: \"a2a56a7e-73d3-4545-b600-abc78cc08d9b\") " pod="openstack/manila-api-0" Nov 22 03:43:35 crc kubenswrapper[4922]: I1122 03:43:34.357275 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a2a56a7e-73d3-4545-b600-abc78cc08d9b-etc-machine-id\") pod \"manila-api-0\" (UID: \"a2a56a7e-73d3-4545-b600-abc78cc08d9b\") " pod="openstack/manila-api-0" Nov 22 03:43:35 crc kubenswrapper[4922]: I1122 03:43:34.357317 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2a56a7e-73d3-4545-b600-abc78cc08d9b-config-data\") pod \"manila-api-0\" (UID: \"a2a56a7e-73d3-4545-b600-abc78cc08d9b\") " pod="openstack/manila-api-0" Nov 22 03:43:35 crc kubenswrapper[4922]: I1122 03:43:34.359130 4922 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2a56a7e-73d3-4545-b600-abc78cc08d9b-logs\") pod \"manila-api-0\" (UID: \"a2a56a7e-73d3-4545-b600-abc78cc08d9b\") " pod="openstack/manila-api-0" Nov 22 03:43:35 crc kubenswrapper[4922]: I1122 03:43:34.364533 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2a56a7e-73d3-4545-b600-abc78cc08d9b-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"a2a56a7e-73d3-4545-b600-abc78cc08d9b\") " pod="openstack/manila-api-0" Nov 22 03:43:35 crc kubenswrapper[4922]: I1122 03:43:34.364731 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a2a56a7e-73d3-4545-b600-abc78cc08d9b-etc-machine-id\") pod \"manila-api-0\" (UID: \"a2a56a7e-73d3-4545-b600-abc78cc08d9b\") " pod="openstack/manila-api-0" Nov 22 03:43:35 crc kubenswrapper[4922]: I1122 03:43:34.366015 4922 generic.go:334] "Generic (PLEG): container finished" podID="d8cf55ca-4cba-4931-a9e6-021fa6e53669" containerID="fec82b2e00a08287fbc59f4fa83381598e1e396874b36f29c306c946c551a650" exitCode=137 Nov 22 03:43:35 crc kubenswrapper[4922]: I1122 03:43:34.366060 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-575b764d7f-w8r6x" event={"ID":"d8cf55ca-4cba-4931-a9e6-021fa6e53669","Type":"ContainerDied","Data":"fec82b2e00a08287fbc59f4fa83381598e1e396874b36f29c306c946c551a650"} Nov 22 03:43:35 crc kubenswrapper[4922]: I1122 03:43:34.367430 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a2a56a7e-73d3-4545-b600-abc78cc08d9b-config-data-custom\") pod \"manila-api-0\" (UID: \"a2a56a7e-73d3-4545-b600-abc78cc08d9b\") " pod="openstack/manila-api-0" Nov 22 03:43:35 crc kubenswrapper[4922]: I1122 03:43:34.369335 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2a56a7e-73d3-4545-b600-abc78cc08d9b-scripts\") pod \"manila-api-0\" (UID: \"a2a56a7e-73d3-4545-b600-abc78cc08d9b\") " pod="openstack/manila-api-0" Nov 22 03:43:35 crc kubenswrapper[4922]: I1122 03:43:34.370130 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2a56a7e-73d3-4545-b600-abc78cc08d9b-config-data\") pod \"manila-api-0\" (UID: \"a2a56a7e-73d3-4545-b600-abc78cc08d9b\") " pod="openstack/manila-api-0" Nov 22 03:43:35 crc kubenswrapper[4922]: I1122 03:43:34.398523 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbr8v\" (UniqueName: \"kubernetes.io/projected/a2a56a7e-73d3-4545-b600-abc78cc08d9b-kube-api-access-sbr8v\") pod \"manila-api-0\" (UID: \"a2a56a7e-73d3-4545-b600-abc78cc08d9b\") " pod="openstack/manila-api-0" Nov 22 03:43:35 crc kubenswrapper[4922]: E1122 03:43:34.452135 4922 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8cf55ca_4cba_4931_a9e6_021fa6e53669.slice/crio-conmon-fec82b2e00a08287fbc59f4fa83381598e1e396874b36f29c306c946c551a650.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4287b67_a4f2_4574_b73f_2995595f2199.slice/crio-conmon-9fc2eafa870260ef1631cb9affb119d557ea834610e4618b01378e0c2ebe2b69.scope\": RecentStats: 
unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4287b67_a4f2_4574_b73f_2995595f2199.slice/crio-conmon-c0f5bf890ed771313bcf8b10a93419efecc908f9011cd5316768cd8f7d6849a7.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8cf55ca_4cba_4931_a9e6_021fa6e53669.slice/crio-conmon-430825fd8cd6cdfd8bd96757bdd143b02cd1d4b6f3b4e09afb6e2931ff2cdded.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8cf55ca_4cba_4931_a9e6_021fa6e53669.slice/crio-430825fd8cd6cdfd8bd96757bdd143b02cd1d4b6f3b4e09afb6e2931ff2cdded.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4287b67_a4f2_4574_b73f_2995595f2199.slice/crio-9fc2eafa870260ef1631cb9affb119d557ea834610e4618b01378e0c2ebe2b69.scope\": RecentStats: unable to find data in memory cache]" Nov 22 03:43:35 crc kubenswrapper[4922]: I1122 03:43:34.492980 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Nov 22 03:43:35 crc kubenswrapper[4922]: I1122 03:43:35.305591 4922 scope.go:117] "RemoveContainer" containerID="3902a53e42446835819551f58e9b8e408324dfa52a1818359fa0616ad95080dc" Nov 22 03:43:35 crc kubenswrapper[4922]: E1122 03:43:35.306110 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:43:35 crc kubenswrapper[4922]: I1122 03:43:35.393471 4922 generic.go:334] "Generic (PLEG): container finished" podID="d8cf55ca-4cba-4931-a9e6-021fa6e53669" containerID="430825fd8cd6cdfd8bd96757bdd143b02cd1d4b6f3b4e09afb6e2931ff2cdded" exitCode=137 Nov 22 03:43:35 crc kubenswrapper[4922]: I1122 03:43:35.393532 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-575b764d7f-w8r6x" event={"ID":"d8cf55ca-4cba-4931-a9e6-021fa6e53669","Type":"ContainerDied","Data":"430825fd8cd6cdfd8bd96757bdd143b02cd1d4b6f3b4e09afb6e2931ff2cdded"} Nov 22 03:43:35 crc kubenswrapper[4922]: I1122 03:43:35.408280 4922 generic.go:334] "Generic (PLEG): container finished" podID="b4287b67-a4f2-4574-b73f-2995595f2199" containerID="9fc2eafa870260ef1631cb9affb119d557ea834610e4618b01378e0c2ebe2b69" exitCode=137 Nov 22 03:43:35 crc kubenswrapper[4922]: I1122 03:43:35.408335 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b987c6977-c7m5n" event={"ID":"b4287b67-a4f2-4574-b73f-2995595f2199","Type":"ContainerDied","Data":"9fc2eafa870260ef1631cb9affb119d557ea834610e4618b01378e0c2ebe2b69"} Nov 22 03:43:35 crc kubenswrapper[4922]: I1122 03:43:35.619473 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Nov 22 03:43:35 crc kubenswrapper[4922]: I1122 03:43:35.707293 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-575b764d7f-w8r6x" Nov 22 03:43:35 crc kubenswrapper[4922]: I1122 03:43:35.716454 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76b5fdb995-9hspz"] Nov 22 03:43:35 crc kubenswrapper[4922]: I1122 03:43:35.836146 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Nov 22 03:43:35 crc kubenswrapper[4922]: W1122 03:43:35.848512 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2a56a7e_73d3_4545_b600_abc78cc08d9b.slice/crio-ea4ebf3303a3362903bc4b2e385cfeefee88616012b18e580b9454dd421ee170 WatchSource:0}: Error finding container ea4ebf3303a3362903bc4b2e385cfeefee88616012b18e580b9454dd421ee170: Status 404 returned error can't find the container with id ea4ebf3303a3362903bc4b2e385cfeefee88616012b18e580b9454dd421ee170 Nov 22 03:43:35 crc kubenswrapper[4922]: I1122 03:43:35.899516 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d8cf55ca-4cba-4931-a9e6-021fa6e53669-config-data\") pod \"d8cf55ca-4cba-4931-a9e6-021fa6e53669\" (UID: \"d8cf55ca-4cba-4931-a9e6-021fa6e53669\") " Nov 22 03:43:35 crc kubenswrapper[4922]: I1122 03:43:35.899633 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5x9fj\" (UniqueName: \"kubernetes.io/projected/d8cf55ca-4cba-4931-a9e6-021fa6e53669-kube-api-access-5x9fj\") pod \"d8cf55ca-4cba-4931-a9e6-021fa6e53669\" (UID: \"d8cf55ca-4cba-4931-a9e6-021fa6e53669\") " Nov 22 03:43:35 crc kubenswrapper[4922]: I1122 03:43:35.899671 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8cf55ca-4cba-4931-a9e6-021fa6e53669-logs\") pod \"d8cf55ca-4cba-4931-a9e6-021fa6e53669\" (UID: \"d8cf55ca-4cba-4931-a9e6-021fa6e53669\") " Nov 22 03:43:35 crc kubenswrapper[4922]: I1122 03:43:35.899691 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d8cf55ca-4cba-4931-a9e6-021fa6e53669-horizon-secret-key\") pod \"d8cf55ca-4cba-4931-a9e6-021fa6e53669\" (UID: \"d8cf55ca-4cba-4931-a9e6-021fa6e53669\") " Nov 22 03:43:35 crc kubenswrapper[4922]: I1122 03:43:35.899769 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d8cf55ca-4cba-4931-a9e6-021fa6e53669-scripts\") pod \"d8cf55ca-4cba-4931-a9e6-021fa6e53669\" (UID: \"d8cf55ca-4cba-4931-a9e6-021fa6e53669\") " Nov 22 03:43:35 crc kubenswrapper[4922]: I1122 03:43:35.900703 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8cf55ca-4cba-4931-a9e6-021fa6e53669-logs" (OuterVolumeSpecName: "logs") pod "d8cf55ca-4cba-4931-a9e6-021fa6e53669" (UID: "d8cf55ca-4cba-4931-a9e6-021fa6e53669"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:43:35 crc kubenswrapper[4922]: I1122 03:43:35.907974 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8cf55ca-4cba-4931-a9e6-021fa6e53669-kube-api-access-5x9fj" (OuterVolumeSpecName: "kube-api-access-5x9fj") pod "d8cf55ca-4cba-4931-a9e6-021fa6e53669" (UID: "d8cf55ca-4cba-4931-a9e6-021fa6e53669"). InnerVolumeSpecName "kube-api-access-5x9fj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:43:35 crc kubenswrapper[4922]: I1122 03:43:35.914035 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8cf55ca-4cba-4931-a9e6-021fa6e53669-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "d8cf55ca-4cba-4931-a9e6-021fa6e53669" (UID: "d8cf55ca-4cba-4931-a9e6-021fa6e53669"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:43:35 crc kubenswrapper[4922]: I1122 03:43:35.956253 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8cf55ca-4cba-4931-a9e6-021fa6e53669-scripts" (OuterVolumeSpecName: "scripts") pod "d8cf55ca-4cba-4931-a9e6-021fa6e53669" (UID: "d8cf55ca-4cba-4931-a9e6-021fa6e53669"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:43:35 crc kubenswrapper[4922]: I1122 03:43:35.987092 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7b987c6977-c7m5n" Nov 22 03:43:35 crc kubenswrapper[4922]: I1122 03:43:35.987685 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8cf55ca-4cba-4931-a9e6-021fa6e53669-config-data" (OuterVolumeSpecName: "config-data") pod "d8cf55ca-4cba-4931-a9e6-021fa6e53669" (UID: "d8cf55ca-4cba-4931-a9e6-021fa6e53669"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:43:36 crc kubenswrapper[4922]: I1122 03:43:36.002217 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d8cf55ca-4cba-4931-a9e6-021fa6e53669-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:43:36 crc kubenswrapper[4922]: I1122 03:43:36.002249 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5x9fj\" (UniqueName: \"kubernetes.io/projected/d8cf55ca-4cba-4931-a9e6-021fa6e53669-kube-api-access-5x9fj\") on node \"crc\" DevicePath \"\"" Nov 22 03:43:36 crc kubenswrapper[4922]: I1122 03:43:36.002260 4922 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8cf55ca-4cba-4931-a9e6-021fa6e53669-logs\") on node \"crc\" DevicePath \"\"" Nov 22 03:43:36 crc kubenswrapper[4922]: I1122 03:43:36.002270 4922 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d8cf55ca-4cba-4931-a9e6-021fa6e53669-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Nov 22 03:43:36 crc kubenswrapper[4922]: I1122 03:43:36.002279 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d8cf55ca-4cba-4931-a9e6-021fa6e53669-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:43:36 crc kubenswrapper[4922]: I1122 03:43:36.104031 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4287b67-a4f2-4574-b73f-2995595f2199-scripts\") pod \"b4287b67-a4f2-4574-b73f-2995595f2199\" (UID: \"b4287b67-a4f2-4574-b73f-2995595f2199\") " Nov 22 03:43:36 crc kubenswrapper[4922]: I1122 03:43:36.104254 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4287b67-a4f2-4574-b73f-2995595f2199-logs\") pod \"b4287b67-a4f2-4574-b73f-2995595f2199\" (UID: \"b4287b67-a4f2-4574-b73f-2995595f2199\") " Nov 22 03:43:36 
crc kubenswrapper[4922]: I1122 03:43:36.104355 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b4287b67-a4f2-4574-b73f-2995595f2199-horizon-secret-key\") pod \"b4287b67-a4f2-4574-b73f-2995595f2199\" (UID: \"b4287b67-a4f2-4574-b73f-2995595f2199\") " Nov 22 03:43:36 crc kubenswrapper[4922]: I1122 03:43:36.104463 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b4287b67-a4f2-4574-b73f-2995595f2199-config-data\") pod \"b4287b67-a4f2-4574-b73f-2995595f2199\" (UID: \"b4287b67-a4f2-4574-b73f-2995595f2199\") " Nov 22 03:43:36 crc kubenswrapper[4922]: I1122 03:43:36.104597 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tx67b\" (UniqueName: \"kubernetes.io/projected/b4287b67-a4f2-4574-b73f-2995595f2199-kube-api-access-tx67b\") pod \"b4287b67-a4f2-4574-b73f-2995595f2199\" (UID: \"b4287b67-a4f2-4574-b73f-2995595f2199\") " Nov 22 03:43:36 crc kubenswrapper[4922]: I1122 03:43:36.104624 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4287b67-a4f2-4574-b73f-2995595f2199-logs" (OuterVolumeSpecName: "logs") pod "b4287b67-a4f2-4574-b73f-2995595f2199" (UID: "b4287b67-a4f2-4574-b73f-2995595f2199"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:43:36 crc kubenswrapper[4922]: I1122 03:43:36.105184 4922 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4287b67-a4f2-4574-b73f-2995595f2199-logs\") on node \"crc\" DevicePath \"\"" Nov 22 03:43:36 crc kubenswrapper[4922]: I1122 03:43:36.110599 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4287b67-a4f2-4574-b73f-2995595f2199-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "b4287b67-a4f2-4574-b73f-2995595f2199" (UID: "b4287b67-a4f2-4574-b73f-2995595f2199"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:43:36 crc kubenswrapper[4922]: I1122 03:43:36.110877 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4287b67-a4f2-4574-b73f-2995595f2199-kube-api-access-tx67b" (OuterVolumeSpecName: "kube-api-access-tx67b") pod "b4287b67-a4f2-4574-b73f-2995595f2199" (UID: "b4287b67-a4f2-4574-b73f-2995595f2199"). InnerVolumeSpecName "kube-api-access-tx67b". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:43:36 crc kubenswrapper[4922]: I1122 03:43:36.141376 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4287b67-a4f2-4574-b73f-2995595f2199-config-data" (OuterVolumeSpecName: "config-data") pod "b4287b67-a4f2-4574-b73f-2995595f2199" (UID: "b4287b67-a4f2-4574-b73f-2995595f2199"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:43:36 crc kubenswrapper[4922]: I1122 03:43:36.150252 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4287b67-a4f2-4574-b73f-2995595f2199-scripts" (OuterVolumeSpecName: "scripts") pod "b4287b67-a4f2-4574-b73f-2995595f2199" (UID: "b4287b67-a4f2-4574-b73f-2995595f2199"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:43:36 crc kubenswrapper[4922]: I1122 03:43:36.155263 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-89cb6b448-l5wz8" Nov 22 03:43:36 crc kubenswrapper[4922]: I1122 03:43:36.207665 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tx67b\" (UniqueName: \"kubernetes.io/projected/b4287b67-a4f2-4574-b73f-2995595f2199-kube-api-access-tx67b\") on node \"crc\" DevicePath \"\"" Nov 22 03:43:36 crc kubenswrapper[4922]: I1122 03:43:36.207693 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4287b67-a4f2-4574-b73f-2995595f2199-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:43:36 crc kubenswrapper[4922]: I1122 03:43:36.207705 4922 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b4287b67-a4f2-4574-b73f-2995595f2199-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Nov 22 03:43:36 crc kubenswrapper[4922]: I1122 03:43:36.207713 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b4287b67-a4f2-4574-b73f-2995595f2199-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:43:36 crc kubenswrapper[4922]: I1122 03:43:36.293761 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7d4dbc5d5b-4ppc9" Nov 22 03:43:36 crc kubenswrapper[4922]: I1122 03:43:36.443624 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-575b764d7f-w8r6x" event={"ID":"d8cf55ca-4cba-4931-a9e6-021fa6e53669","Type":"ContainerDied","Data":"0e52e9f43cc0533b58e6cb6bea646b4112e09796bece1fedf618429dc0aeaecf"} Nov 22 03:43:36 crc kubenswrapper[4922]: I1122 03:43:36.444067 4922 scope.go:117] "RemoveContainer" containerID="430825fd8cd6cdfd8bd96757bdd143b02cd1d4b6f3b4e09afb6e2931ff2cdded" Nov 22 03:43:36 crc kubenswrapper[4922]: I1122 03:43:36.443828 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-575b764d7f-w8r6x" Nov 22 03:43:36 crc kubenswrapper[4922]: I1122 03:43:36.455974 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"a2a56a7e-73d3-4545-b600-abc78cc08d9b","Type":"ContainerStarted","Data":"eb22b618519ab198c69581a77ce635c81342c401c84a6d4f9d4cc02e013a7b5e"} Nov 22 03:43:36 crc kubenswrapper[4922]: I1122 03:43:36.456018 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"a2a56a7e-73d3-4545-b600-abc78cc08d9b","Type":"ContainerStarted","Data":"ea4ebf3303a3362903bc4b2e385cfeefee88616012b18e580b9454dd421ee170"} Nov 22 03:43:36 crc kubenswrapper[4922]: I1122 03:43:36.470303 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Nov 22 03:43:36 crc kubenswrapper[4922]: I1122 03:43:36.477207 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"fc47a6e0-943d-4503-b268-0ad96699e17c","Type":"ContainerStarted","Data":"abf6023a34668252d2d64b5564f935bc74e492effc25156853d7a6c94cf5cb9f"} Nov 22 03:43:36 crc kubenswrapper[4922]: I1122 03:43:36.487098 4922 generic.go:334] "Generic (PLEG): container finished" podID="b009e973-c6d1-4eca-a06a-ed15c5ec10ad" containerID="a82a75d99ff83b0412a0d43ae0dcc013afe24cd8e6fd2981010dea2f77a5aa5a" exitCode=0 Nov 22 03:43:36 crc kubenswrapper[4922]: I1122 03:43:36.487193 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76b5fdb995-9hspz" event={"ID":"b009e973-c6d1-4eca-a06a-ed15c5ec10ad","Type":"ContainerDied","Data":"a82a75d99ff83b0412a0d43ae0dcc013afe24cd8e6fd2981010dea2f77a5aa5a"} Nov 22 03:43:36 crc kubenswrapper[4922]: I1122 03:43:36.487219 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76b5fdb995-9hspz" event={"ID":"b009e973-c6d1-4eca-a06a-ed15c5ec10ad","Type":"ContainerStarted","Data":"1934fc6fb17670625953600e7c0a7d5d16326a8bc30ad47a4c3d9b005cc51f31"} Nov 22 03:43:36 crc kubenswrapper[4922]: I1122 03:43:36.503335 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-575b764d7f-w8r6x"] Nov 22 03:43:36 crc kubenswrapper[4922]: I1122 03:43:36.510415 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-575b764d7f-w8r6x"] Nov 22 03:43:36 crc kubenswrapper[4922]: I1122 03:43:36.533879 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b987c6977-c7m5n" event={"ID":"b4287b67-a4f2-4574-b73f-2995595f2199","Type":"ContainerDied","Data":"d248a6376a0dfd87676f0631402d9f0674674bd3f616c74ce20dbfd89e6839d6"} Nov 22 03:43:36 crc kubenswrapper[4922]: I1122 03:43:36.534008 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7b987c6977-c7m5n" Nov 22 03:43:36 crc kubenswrapper[4922]: I1122 03:43:36.605900 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7b987c6977-c7m5n"] Nov 22 03:43:36 crc kubenswrapper[4922]: I1122 03:43:36.613289 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7b987c6977-c7m5n"] Nov 22 03:43:36 crc kubenswrapper[4922]: I1122 03:43:36.701213 4922 scope.go:117] "RemoveContainer" containerID="fec82b2e00a08287fbc59f4fa83381598e1e396874b36f29c306c946c551a650" Nov 22 03:43:36 crc kubenswrapper[4922]: I1122 03:43:36.728387 4922 scope.go:117] "RemoveContainer" containerID="9fc2eafa870260ef1631cb9affb119d557ea834610e4618b01378e0c2ebe2b69" Nov 22 03:43:36 crc kubenswrapper[4922]: I1122 03:43:36.997652 4922 scope.go:117] "RemoveContainer" containerID="c0f5bf890ed771313bcf8b10a93419efecc908f9011cd5316768cd8f7d6849a7" Nov 22 03:43:37 crc kubenswrapper[4922]: I1122 03:43:37.315170 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4287b67-a4f2-4574-b73f-2995595f2199" path="/var/lib/kubelet/pods/b4287b67-a4f2-4574-b73f-2995595f2199/volumes" Nov 22 03:43:37 crc kubenswrapper[4922]: I1122 03:43:37.316343 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8cf55ca-4cba-4931-a9e6-021fa6e53669" path="/var/lib/kubelet/pods/d8cf55ca-4cba-4931-a9e6-021fa6e53669/volumes" Nov 22 03:43:37 crc kubenswrapper[4922]: I1122 03:43:37.783410 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Nov 22 03:43:38 crc kubenswrapper[4922]: I1122 03:43:38.567333 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"77912292-bd24-4c7e-ac39-015abf688ebb","Type":"ContainerStarted","Data":"a2b9244b6fe342714d52cf20842311701aa5d16a3093e99f0717bcb7f44f5d6d"} Nov 22 03:43:38 crc kubenswrapper[4922]: I1122 03:43:38.574499 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"a2a56a7e-73d3-4545-b600-abc78cc08d9b","Type":"ContainerStarted","Data":"74af13fc5e91e6c2861299b7cff17b95be50f2fdfd0240dd70dd5a066a087f28"} Nov 22 03:43:38 crc kubenswrapper[4922]: I1122 03:43:38.574613 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Nov 22 03:43:38 crc kubenswrapper[4922]: I1122 03:43:38.574608 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="a2a56a7e-73d3-4545-b600-abc78cc08d9b" containerName="manila-api-log" containerID="cri-o://eb22b618519ab198c69581a77ce635c81342c401c84a6d4f9d4cc02e013a7b5e" gracePeriod=30 Nov 22 03:43:38 crc kubenswrapper[4922]: I1122 03:43:38.574669 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="a2a56a7e-73d3-4545-b600-abc78cc08d9b" containerName="manila-api" containerID="cri-o://74af13fc5e91e6c2861299b7cff17b95be50f2fdfd0240dd70dd5a066a087f28" gracePeriod=30 Nov 22 03:43:38 crc kubenswrapper[4922]: I1122 03:43:38.581089 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"fc47a6e0-943d-4503-b268-0ad96699e17c","Type":"ContainerStarted","Data":"0ab6c0650b609bd4763dd96356159b183305c24f610037c1ac486a85f70b5147"} Nov 22 03:43:38 crc kubenswrapper[4922]: I1122 03:43:38.594299 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76b5fdb995-9hspz" 
event={"ID":"b009e973-c6d1-4eca-a06a-ed15c5ec10ad","Type":"ContainerStarted","Data":"6f1d286e570c2417ed7958ee9af7afc178aa21dfb0cd9b7c11d244337cdd477b"} Nov 22 03:43:38 crc kubenswrapper[4922]: I1122 03:43:38.595330 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76b5fdb995-9hspz" Nov 22 03:43:38 crc kubenswrapper[4922]: I1122 03:43:38.595562 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=4.59555126 podStartE2EDuration="4.59555126s" podCreationTimestamp="2025-11-22 03:43:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:43:38.592831595 +0000 UTC m=+3054.631353487" watchObservedRunningTime="2025-11-22 03:43:38.59555126 +0000 UTC m=+3054.634073152" Nov 22 03:43:38 crc kubenswrapper[4922]: I1122 03:43:38.619929 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76b5fdb995-9hspz" podStartSLOduration=5.619914406 podStartE2EDuration="5.619914406s" podCreationTimestamp="2025-11-22 03:43:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:43:38.618252725 +0000 UTC m=+3054.656774607" watchObservedRunningTime="2025-11-22 03:43:38.619914406 +0000 UTC m=+3054.658436298" Nov 22 03:43:38 crc kubenswrapper[4922]: I1122 03:43:38.772374 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-89cb6b448-l5wz8" Nov 22 03:43:38 crc kubenswrapper[4922]: I1122 03:43:38.853354 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7d4dbc5d5b-4ppc9"] Nov 22 03:43:38 crc kubenswrapper[4922]: I1122 03:43:38.856836 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7d4dbc5d5b-4ppc9" podUID="858fddfd-d272-4323-8c51-887b9e429b56" containerName="horizon-log" containerID="cri-o://3de2ae7d25f3a188f43d026c82c62adb0e8b1e72d6c7f472ec2a56b7a417f58a" gracePeriod=30 Nov 22 03:43:38 crc kubenswrapper[4922]: I1122 03:43:38.857372 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7d4dbc5d5b-4ppc9" podUID="858fddfd-d272-4323-8c51-887b9e429b56" containerName="horizon" containerID="cri-o://4c971541c2188d2f1e9d94e917b91931e33e5da1345a7f71a81e45b051b7161d" gracePeriod=30 Nov 22 03:43:38 crc kubenswrapper[4922]: I1122 03:43:38.865070 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7d4dbc5d5b-4ppc9" podUID="858fddfd-d272-4323-8c51-887b9e429b56" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.238:8443/dashboard/auth/login/?next=/dashboard/\": EOF" Nov 22 03:43:39 crc kubenswrapper[4922]: I1122 03:43:39.607248 4922 generic.go:334] "Generic (PLEG): container finished" podID="a2a56a7e-73d3-4545-b600-abc78cc08d9b" containerID="74af13fc5e91e6c2861299b7cff17b95be50f2fdfd0240dd70dd5a066a087f28" exitCode=0 Nov 22 03:43:39 crc kubenswrapper[4922]: I1122 03:43:39.607301 4922 generic.go:334] "Generic (PLEG): container finished" podID="a2a56a7e-73d3-4545-b600-abc78cc08d9b" containerID="eb22b618519ab198c69581a77ce635c81342c401c84a6d4f9d4cc02e013a7b5e" exitCode=143 Nov 22 03:43:39 crc kubenswrapper[4922]: I1122 03:43:39.607338 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" 
event={"ID":"a2a56a7e-73d3-4545-b600-abc78cc08d9b","Type":"ContainerDied","Data":"74af13fc5e91e6c2861299b7cff17b95be50f2fdfd0240dd70dd5a066a087f28"} Nov 22 03:43:39 crc kubenswrapper[4922]: I1122 03:43:39.607503 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"a2a56a7e-73d3-4545-b600-abc78cc08d9b","Type":"ContainerDied","Data":"eb22b618519ab198c69581a77ce635c81342c401c84a6d4f9d4cc02e013a7b5e"} Nov 22 03:43:39 crc kubenswrapper[4922]: I1122 03:43:39.609914 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"fc47a6e0-943d-4503-b268-0ad96699e17c","Type":"ContainerStarted","Data":"a8dbfe1ed794691243dc16c8b3dce5fe48207adac4d1fce7aafac70b6185b528"} Nov 22 03:43:39 crc kubenswrapper[4922]: I1122 03:43:39.645417 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=5.924301455 podStartE2EDuration="6.645391322s" podCreationTimestamp="2025-11-22 03:43:33 +0000 UTC" firstStartedPulling="2025-11-22 03:43:35.630537687 +0000 UTC m=+3051.669059579" lastFinishedPulling="2025-11-22 03:43:36.351627544 +0000 UTC m=+3052.390149446" observedRunningTime="2025-11-22 03:43:39.62821925 +0000 UTC m=+3055.666741142" watchObservedRunningTime="2025-11-22 03:43:39.645391322 +0000 UTC m=+3055.683913214" Nov 22 03:43:39 crc kubenswrapper[4922]: I1122 03:43:39.845164 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Nov 22 03:43:39 crc kubenswrapper[4922]: I1122 03:43:39.996155 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2a56a7e-73d3-4545-b600-abc78cc08d9b-config-data\") pod \"a2a56a7e-73d3-4545-b600-abc78cc08d9b\" (UID: \"a2a56a7e-73d3-4545-b600-abc78cc08d9b\") " Nov 22 03:43:39 crc kubenswrapper[4922]: I1122 03:43:39.996207 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2a56a7e-73d3-4545-b600-abc78cc08d9b-scripts\") pod \"a2a56a7e-73d3-4545-b600-abc78cc08d9b\" (UID: \"a2a56a7e-73d3-4545-b600-abc78cc08d9b\") " Nov 22 03:43:39 crc kubenswrapper[4922]: I1122 03:43:39.996318 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbr8v\" (UniqueName: \"kubernetes.io/projected/a2a56a7e-73d3-4545-b600-abc78cc08d9b-kube-api-access-sbr8v\") pod \"a2a56a7e-73d3-4545-b600-abc78cc08d9b\" (UID: \"a2a56a7e-73d3-4545-b600-abc78cc08d9b\") " Nov 22 03:43:39 crc kubenswrapper[4922]: I1122 03:43:39.996337 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2a56a7e-73d3-4545-b600-abc78cc08d9b-combined-ca-bundle\") pod \"a2a56a7e-73d3-4545-b600-abc78cc08d9b\" (UID: \"a2a56a7e-73d3-4545-b600-abc78cc08d9b\") " Nov 22 03:43:39 crc kubenswrapper[4922]: I1122 03:43:39.996382 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a2a56a7e-73d3-4545-b600-abc78cc08d9b-etc-machine-id\") pod \"a2a56a7e-73d3-4545-b600-abc78cc08d9b\" (UID: \"a2a56a7e-73d3-4545-b600-abc78cc08d9b\") " Nov 22 03:43:39 crc kubenswrapper[4922]: I1122 03:43:39.996423 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2a56a7e-73d3-4545-b600-abc78cc08d9b-logs\") pod 
\"a2a56a7e-73d3-4545-b600-abc78cc08d9b\" (UID: \"a2a56a7e-73d3-4545-b600-abc78cc08d9b\") " Nov 22 03:43:39 crc kubenswrapper[4922]: I1122 03:43:39.996497 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a2a56a7e-73d3-4545-b600-abc78cc08d9b-config-data-custom\") pod \"a2a56a7e-73d3-4545-b600-abc78cc08d9b\" (UID: \"a2a56a7e-73d3-4545-b600-abc78cc08d9b\") " Nov 22 03:43:39 crc kubenswrapper[4922]: I1122 03:43:39.997767 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2a56a7e-73d3-4545-b600-abc78cc08d9b-logs" (OuterVolumeSpecName: "logs") pod "a2a56a7e-73d3-4545-b600-abc78cc08d9b" (UID: "a2a56a7e-73d3-4545-b600-abc78cc08d9b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:43:39 crc kubenswrapper[4922]: I1122 03:43:39.998109 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a2a56a7e-73d3-4545-b600-abc78cc08d9b-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a2a56a7e-73d3-4545-b600-abc78cc08d9b" (UID: "a2a56a7e-73d3-4545-b600-abc78cc08d9b"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.004165 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2a56a7e-73d3-4545-b600-abc78cc08d9b-scripts" (OuterVolumeSpecName: "scripts") pod "a2a56a7e-73d3-4545-b600-abc78cc08d9b" (UID: "a2a56a7e-73d3-4545-b600-abc78cc08d9b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.004298 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2a56a7e-73d3-4545-b600-abc78cc08d9b-kube-api-access-sbr8v" (OuterVolumeSpecName: "kube-api-access-sbr8v") pod "a2a56a7e-73d3-4545-b600-abc78cc08d9b" (UID: "a2a56a7e-73d3-4545-b600-abc78cc08d9b"). InnerVolumeSpecName "kube-api-access-sbr8v". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.012663 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2a56a7e-73d3-4545-b600-abc78cc08d9b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a2a56a7e-73d3-4545-b600-abc78cc08d9b" (UID: "a2a56a7e-73d3-4545-b600-abc78cc08d9b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.042013 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2a56a7e-73d3-4545-b600-abc78cc08d9b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a2a56a7e-73d3-4545-b600-abc78cc08d9b" (UID: "a2a56a7e-73d3-4545-b600-abc78cc08d9b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.063825 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2a56a7e-73d3-4545-b600-abc78cc08d9b-config-data" (OuterVolumeSpecName: "config-data") pod "a2a56a7e-73d3-4545-b600-abc78cc08d9b" (UID: "a2a56a7e-73d3-4545-b600-abc78cc08d9b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.098540 4922 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a2a56a7e-73d3-4545-b600-abc78cc08d9b-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.098572 4922 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2a56a7e-73d3-4545-b600-abc78cc08d9b-logs\") on node \"crc\" DevicePath \"\"" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.098582 4922 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a2a56a7e-73d3-4545-b600-abc78cc08d9b-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.098591 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2a56a7e-73d3-4545-b600-abc78cc08d9b-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.098598 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2a56a7e-73d3-4545-b600-abc78cc08d9b-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.098607 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbr8v\" (UniqueName: \"kubernetes.io/projected/a2a56a7e-73d3-4545-b600-abc78cc08d9b-kube-api-access-sbr8v\") on node \"crc\" DevicePath \"\"" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.098618 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2a56a7e-73d3-4545-b600-abc78cc08d9b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.625615 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"a2a56a7e-73d3-4545-b600-abc78cc08d9b","Type":"ContainerDied","Data":"ea4ebf3303a3362903bc4b2e385cfeefee88616012b18e580b9454dd421ee170"} Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.625739 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.626013 4922 scope.go:117] "RemoveContainer" containerID="74af13fc5e91e6c2861299b7cff17b95be50f2fdfd0240dd70dd5a066a087f28" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.664331 4922 scope.go:117] "RemoveContainer" containerID="eb22b618519ab198c69581a77ce635c81342c401c84a6d4f9d4cc02e013a7b5e" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.668489 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.674779 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-api-0"] Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.706414 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Nov 22 03:43:40 crc kubenswrapper[4922]: E1122 03:43:40.707259 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8cf55ca-4cba-4931-a9e6-021fa6e53669" containerName="horizon-log" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.707306 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8cf55ca-4cba-4931-a9e6-021fa6e53669" containerName="horizon-log" Nov 22 03:43:40 crc kubenswrapper[4922]: E1122 03:43:40.707403 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4287b67-a4f2-4574-b73f-2995595f2199" containerName="horizon-log" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.707434 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4287b67-a4f2-4574-b73f-2995595f2199" containerName="horizon-log" Nov 22 03:43:40 crc kubenswrapper[4922]: E1122 03:43:40.707461 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8cf55ca-4cba-4931-a9e6-021fa6e53669" containerName="horizon" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.707478 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8cf55ca-4cba-4931-a9e6-021fa6e53669" containerName="horizon" Nov 22 03:43:40 crc kubenswrapper[4922]: E1122 03:43:40.707501 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2a56a7e-73d3-4545-b600-abc78cc08d9b" containerName="manila-api" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.707519 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2a56a7e-73d3-4545-b600-abc78cc08d9b" containerName="manila-api" Nov 22 03:43:40 crc kubenswrapper[4922]: E1122 03:43:40.707555 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4287b67-a4f2-4574-b73f-2995595f2199" containerName="horizon" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.707576 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4287b67-a4f2-4574-b73f-2995595f2199" containerName="horizon" Nov 22 03:43:40 crc kubenswrapper[4922]: E1122 03:43:40.707611 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2a56a7e-73d3-4545-b600-abc78cc08d9b" containerName="manila-api-log" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.707628 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2a56a7e-73d3-4545-b600-abc78cc08d9b" containerName="manila-api-log" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.708078 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8cf55ca-4cba-4931-a9e6-021fa6e53669" containerName="horizon-log" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.708144 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8cf55ca-4cba-4931-a9e6-021fa6e53669" 
containerName="horizon" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.708173 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4287b67-a4f2-4574-b73f-2995595f2199" containerName="horizon-log" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.708206 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4287b67-a4f2-4574-b73f-2995595f2199" containerName="horizon" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.708241 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2a56a7e-73d3-4545-b600-abc78cc08d9b" containerName="manila-api-log" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.708298 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2a56a7e-73d3-4545-b600-abc78cc08d9b" containerName="manila-api" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.710686 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.715553 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-internal-svc" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.715617 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-public-svc" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.715705 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.720723 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.819426 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbe6f97b-70c1-4581-9367-058568f425b5-internal-tls-certs\") pod \"manila-api-0\" (UID: \"dbe6f97b-70c1-4581-9367-058568f425b5\") " pod="openstack/manila-api-0" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.819489 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dbe6f97b-70c1-4581-9367-058568f425b5-etc-machine-id\") pod \"manila-api-0\" (UID: \"dbe6f97b-70c1-4581-9367-058568f425b5\") " pod="openstack/manila-api-0" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.819542 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dbe6f97b-70c1-4581-9367-058568f425b5-config-data-custom\") pod \"manila-api-0\" (UID: \"dbe6f97b-70c1-4581-9367-058568f425b5\") " pod="openstack/manila-api-0" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.819574 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbe6f97b-70c1-4581-9367-058568f425b5-scripts\") pod \"manila-api-0\" (UID: \"dbe6f97b-70c1-4581-9367-058568f425b5\") " pod="openstack/manila-api-0" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.819603 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbe6f97b-70c1-4581-9367-058568f425b5-logs\") pod \"manila-api-0\" (UID: \"dbe6f97b-70c1-4581-9367-058568f425b5\") " pod="openstack/manila-api-0" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.819633 
4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqs2m\" (UniqueName: \"kubernetes.io/projected/dbe6f97b-70c1-4581-9367-058568f425b5-kube-api-access-gqs2m\") pod \"manila-api-0\" (UID: \"dbe6f97b-70c1-4581-9367-058568f425b5\") " pod="openstack/manila-api-0" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.819662 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbe6f97b-70c1-4581-9367-058568f425b5-config-data\") pod \"manila-api-0\" (UID: \"dbe6f97b-70c1-4581-9367-058568f425b5\") " pod="openstack/manila-api-0" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.819829 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbe6f97b-70c1-4581-9367-058568f425b5-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"dbe6f97b-70c1-4581-9367-058568f425b5\") " pod="openstack/manila-api-0" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.819937 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbe6f97b-70c1-4581-9367-058568f425b5-public-tls-certs\") pod \"manila-api-0\" (UID: \"dbe6f97b-70c1-4581-9367-058568f425b5\") " pod="openstack/manila-api-0" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.922035 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbe6f97b-70c1-4581-9367-058568f425b5-internal-tls-certs\") pod \"manila-api-0\" (UID: \"dbe6f97b-70c1-4581-9367-058568f425b5\") " pod="openstack/manila-api-0" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.922101 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dbe6f97b-70c1-4581-9367-058568f425b5-etc-machine-id\") pod \"manila-api-0\" (UID: \"dbe6f97b-70c1-4581-9367-058568f425b5\") " pod="openstack/manila-api-0" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.922145 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dbe6f97b-70c1-4581-9367-058568f425b5-config-data-custom\") pod \"manila-api-0\" (UID: \"dbe6f97b-70c1-4581-9367-058568f425b5\") " pod="openstack/manila-api-0" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.922175 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbe6f97b-70c1-4581-9367-058568f425b5-scripts\") pod \"manila-api-0\" (UID: \"dbe6f97b-70c1-4581-9367-058568f425b5\") " pod="openstack/manila-api-0" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.922203 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbe6f97b-70c1-4581-9367-058568f425b5-logs\") pod \"manila-api-0\" (UID: \"dbe6f97b-70c1-4581-9367-058568f425b5\") " pod="openstack/manila-api-0" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.922208 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dbe6f97b-70c1-4581-9367-058568f425b5-etc-machine-id\") pod \"manila-api-0\" (UID: \"dbe6f97b-70c1-4581-9367-058568f425b5\") " 
pod="openstack/manila-api-0" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.922234 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqs2m\" (UniqueName: \"kubernetes.io/projected/dbe6f97b-70c1-4581-9367-058568f425b5-kube-api-access-gqs2m\") pod \"manila-api-0\" (UID: \"dbe6f97b-70c1-4581-9367-058568f425b5\") " pod="openstack/manila-api-0" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.922263 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbe6f97b-70c1-4581-9367-058568f425b5-config-data\") pod \"manila-api-0\" (UID: \"dbe6f97b-70c1-4581-9367-058568f425b5\") " pod="openstack/manila-api-0" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.922601 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbe6f97b-70c1-4581-9367-058568f425b5-logs\") pod \"manila-api-0\" (UID: \"dbe6f97b-70c1-4581-9367-058568f425b5\") " pod="openstack/manila-api-0" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.923027 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbe6f97b-70c1-4581-9367-058568f425b5-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"dbe6f97b-70c1-4581-9367-058568f425b5\") " pod="openstack/manila-api-0" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.923088 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbe6f97b-70c1-4581-9367-058568f425b5-public-tls-certs\") pod \"manila-api-0\" (UID: \"dbe6f97b-70c1-4581-9367-058568f425b5\") " pod="openstack/manila-api-0" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.927222 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dbe6f97b-70c1-4581-9367-058568f425b5-config-data-custom\") pod \"manila-api-0\" (UID: \"dbe6f97b-70c1-4581-9367-058568f425b5\") " pod="openstack/manila-api-0" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.928325 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbe6f97b-70c1-4581-9367-058568f425b5-internal-tls-certs\") pod \"manila-api-0\" (UID: \"dbe6f97b-70c1-4581-9367-058568f425b5\") " pod="openstack/manila-api-0" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.930149 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbe6f97b-70c1-4581-9367-058568f425b5-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"dbe6f97b-70c1-4581-9367-058568f425b5\") " pod="openstack/manila-api-0" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.931697 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbe6f97b-70c1-4581-9367-058568f425b5-public-tls-certs\") pod \"manila-api-0\" (UID: \"dbe6f97b-70c1-4581-9367-058568f425b5\") " pod="openstack/manila-api-0" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.931907 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbe6f97b-70c1-4581-9367-058568f425b5-scripts\") pod \"manila-api-0\" (UID: \"dbe6f97b-70c1-4581-9367-058568f425b5\") " pod="openstack/manila-api-0" Nov 22 
03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.938761 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbe6f97b-70c1-4581-9367-058568f425b5-config-data\") pod \"manila-api-0\" (UID: \"dbe6f97b-70c1-4581-9367-058568f425b5\") " pod="openstack/manila-api-0" Nov 22 03:43:40 crc kubenswrapper[4922]: I1122 03:43:40.944316 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqs2m\" (UniqueName: \"kubernetes.io/projected/dbe6f97b-70c1-4581-9367-058568f425b5-kube-api-access-gqs2m\") pod \"manila-api-0\" (UID: \"dbe6f97b-70c1-4581-9367-058568f425b5\") " pod="openstack/manila-api-0" Nov 22 03:43:41 crc kubenswrapper[4922]: I1122 03:43:41.071929 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Nov 22 03:43:41 crc kubenswrapper[4922]: I1122 03:43:41.328114 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2a56a7e-73d3-4545-b600-abc78cc08d9b" path="/var/lib/kubelet/pods/a2a56a7e-73d3-4545-b600-abc78cc08d9b/volumes" Nov 22 03:43:41 crc kubenswrapper[4922]: I1122 03:43:41.644060 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Nov 22 03:43:41 crc kubenswrapper[4922]: W1122 03:43:41.646967 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddbe6f97b_70c1_4581_9367_058568f425b5.slice/crio-b0f97d626451464b0a71c69645024db177382dddee02083875afbd82425d6d60 WatchSource:0}: Error finding container b0f97d626451464b0a71c69645024db177382dddee02083875afbd82425d6d60: Status 404 returned error can't find the container with id b0f97d626451464b0a71c69645024db177382dddee02083875afbd82425d6d60 Nov 22 03:43:42 crc kubenswrapper[4922]: I1122 03:43:42.264220 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7d4dbc5d5b-4ppc9" podUID="858fddfd-d272-4323-8c51-887b9e429b56" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.238:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:47302->10.217.0.238:8443: read: connection reset by peer" Nov 22 03:43:42 crc kubenswrapper[4922]: I1122 03:43:42.655071 4922 generic.go:334] "Generic (PLEG): container finished" podID="858fddfd-d272-4323-8c51-887b9e429b56" containerID="4c971541c2188d2f1e9d94e917b91931e33e5da1345a7f71a81e45b051b7161d" exitCode=0 Nov 22 03:43:42 crc kubenswrapper[4922]: I1122 03:43:42.655134 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d4dbc5d5b-4ppc9" event={"ID":"858fddfd-d272-4323-8c51-887b9e429b56","Type":"ContainerDied","Data":"4c971541c2188d2f1e9d94e917b91931e33e5da1345a7f71a81e45b051b7161d"} Nov 22 03:43:42 crc kubenswrapper[4922]: I1122 03:43:42.657071 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"dbe6f97b-70c1-4581-9367-058568f425b5","Type":"ContainerStarted","Data":"9cd5aae0ac54892bdfb6e171936e45a57a4394f4e80e7a1ffcbc217b0b941559"} Nov 22 03:43:42 crc kubenswrapper[4922]: I1122 03:43:42.657091 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"dbe6f97b-70c1-4581-9367-058568f425b5","Type":"ContainerStarted","Data":"b0f97d626451464b0a71c69645024db177382dddee02083875afbd82425d6d60"} Nov 22 03:43:43 crc kubenswrapper[4922]: I1122 03:43:43.197134 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7d4dbc5d5b-4ppc9" 
podUID="858fddfd-d272-4323-8c51-887b9e429b56" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.238:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.238:8443: connect: connection refused" Nov 22 03:43:43 crc kubenswrapper[4922]: I1122 03:43:43.886010 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Nov 22 03:43:44 crc kubenswrapper[4922]: I1122 03:43:44.165909 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76b5fdb995-9hspz" Nov 22 03:43:44 crc kubenswrapper[4922]: I1122 03:43:44.224362 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-hrkcn"] Nov 22 03:43:44 crc kubenswrapper[4922]: I1122 03:43:44.224588 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-864d5fc68c-hrkcn" podUID="68f450d1-62c7-4e64-b820-d4ee357f7403" containerName="dnsmasq-dns" containerID="cri-o://80f71273258e6772ff1eec68461ae3e742b9b0d6958dce4c4117699cc6b0d4d1" gracePeriod=10 Nov 22 03:43:44 crc kubenswrapper[4922]: I1122 03:43:44.677870 4922 generic.go:334] "Generic (PLEG): container finished" podID="68f450d1-62c7-4e64-b820-d4ee357f7403" containerID="80f71273258e6772ff1eec68461ae3e742b9b0d6958dce4c4117699cc6b0d4d1" exitCode=0 Nov 22 03:43:44 crc kubenswrapper[4922]: I1122 03:43:44.677922 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-hrkcn" event={"ID":"68f450d1-62c7-4e64-b820-d4ee357f7403","Type":"ContainerDied","Data":"80f71273258e6772ff1eec68461ae3e742b9b0d6958dce4c4117699cc6b0d4d1"} Nov 22 03:43:45 crc kubenswrapper[4922]: I1122 03:43:45.871372 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-864d5fc68c-hrkcn" Nov 22 03:43:46 crc kubenswrapper[4922]: I1122 03:43:46.047298 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68f450d1-62c7-4e64-b820-d4ee357f7403-config\") pod \"68f450d1-62c7-4e64-b820-d4ee357f7403\" (UID: \"68f450d1-62c7-4e64-b820-d4ee357f7403\") " Nov 22 03:43:46 crc kubenswrapper[4922]: I1122 03:43:46.047368 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68f450d1-62c7-4e64-b820-d4ee357f7403-ovsdbserver-sb\") pod \"68f450d1-62c7-4e64-b820-d4ee357f7403\" (UID: \"68f450d1-62c7-4e64-b820-d4ee357f7403\") " Nov 22 03:43:46 crc kubenswrapper[4922]: I1122 03:43:46.047461 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68f450d1-62c7-4e64-b820-d4ee357f7403-dns-svc\") pod \"68f450d1-62c7-4e64-b820-d4ee357f7403\" (UID: \"68f450d1-62c7-4e64-b820-d4ee357f7403\") " Nov 22 03:43:46 crc kubenswrapper[4922]: I1122 03:43:46.047575 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/68f450d1-62c7-4e64-b820-d4ee357f7403-openstack-edpm-ipam\") pod \"68f450d1-62c7-4e64-b820-d4ee357f7403\" (UID: \"68f450d1-62c7-4e64-b820-d4ee357f7403\") " Nov 22 03:43:46 crc kubenswrapper[4922]: I1122 03:43:46.047665 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxpfp\" (UniqueName: \"kubernetes.io/projected/68f450d1-62c7-4e64-b820-d4ee357f7403-kube-api-access-rxpfp\") pod \"68f450d1-62c7-4e64-b820-d4ee357f7403\" (UID: \"68f450d1-62c7-4e64-b820-d4ee357f7403\") " Nov 22 03:43:46 crc kubenswrapper[4922]: I1122 03:43:46.047727 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68f450d1-62c7-4e64-b820-d4ee357f7403-ovsdbserver-nb\") pod \"68f450d1-62c7-4e64-b820-d4ee357f7403\" (UID: \"68f450d1-62c7-4e64-b820-d4ee357f7403\") " Nov 22 03:43:46 crc kubenswrapper[4922]: I1122 03:43:46.061501 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68f450d1-62c7-4e64-b820-d4ee357f7403-kube-api-access-rxpfp" (OuterVolumeSpecName: "kube-api-access-rxpfp") pod "68f450d1-62c7-4e64-b820-d4ee357f7403" (UID: "68f450d1-62c7-4e64-b820-d4ee357f7403"). InnerVolumeSpecName "kube-api-access-rxpfp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:43:46 crc kubenswrapper[4922]: I1122 03:43:46.105113 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68f450d1-62c7-4e64-b820-d4ee357f7403-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "68f450d1-62c7-4e64-b820-d4ee357f7403" (UID: "68f450d1-62c7-4e64-b820-d4ee357f7403"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:43:46 crc kubenswrapper[4922]: I1122 03:43:46.107109 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68f450d1-62c7-4e64-b820-d4ee357f7403-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "68f450d1-62c7-4e64-b820-d4ee357f7403" (UID: "68f450d1-62c7-4e64-b820-d4ee357f7403"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 03:43:46 crc kubenswrapper[4922]: I1122 03:43:46.110565 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68f450d1-62c7-4e64-b820-d4ee357f7403-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "68f450d1-62c7-4e64-b820-d4ee357f7403" (UID: "68f450d1-62c7-4e64-b820-d4ee357f7403"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 03:43:46 crc kubenswrapper[4922]: I1122 03:43:46.114397 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68f450d1-62c7-4e64-b820-d4ee357f7403-config" (OuterVolumeSpecName: "config") pod "68f450d1-62c7-4e64-b820-d4ee357f7403" (UID: "68f450d1-62c7-4e64-b820-d4ee357f7403"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 03:43:46 crc kubenswrapper[4922]: I1122 03:43:46.131418 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68f450d1-62c7-4e64-b820-d4ee357f7403-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "68f450d1-62c7-4e64-b820-d4ee357f7403" (UID: "68f450d1-62c7-4e64-b820-d4ee357f7403"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 03:43:46 crc kubenswrapper[4922]: I1122 03:43:46.152670 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68f450d1-62c7-4e64-b820-d4ee357f7403-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Nov 22 03:43:46 crc kubenswrapper[4922]: I1122 03:43:46.152728 4922 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68f450d1-62c7-4e64-b820-d4ee357f7403-config\") on node \"crc\" DevicePath \"\""
Nov 22 03:43:46 crc kubenswrapper[4922]: I1122 03:43:46.152738 4922 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68f450d1-62c7-4e64-b820-d4ee357f7403-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Nov 22 03:43:46 crc kubenswrapper[4922]: I1122 03:43:46.152748 4922 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68f450d1-62c7-4e64-b820-d4ee357f7403-dns-svc\") on node \"crc\" DevicePath \"\""
Nov 22 03:43:46 crc kubenswrapper[4922]: I1122 03:43:46.152761 4922 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/68f450d1-62c7-4e64-b820-d4ee357f7403-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Nov 22 03:43:46 crc kubenswrapper[4922]: I1122 03:43:46.152772 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxpfp\" (UniqueName: \"kubernetes.io/projected/68f450d1-62c7-4e64-b820-d4ee357f7403-kube-api-access-rxpfp\") on node \"crc\" DevicePath \"\""
Nov 22 03:43:46 crc kubenswrapper[4922]: I1122 03:43:46.704495 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"dbe6f97b-70c1-4581-9367-058568f425b5","Type":"ContainerStarted","Data":"97ba6ab3a2b7436a9c49c07622fb92f585655fce3fbed7c2023447c734f37d1f"}
Nov 22 03:43:46 crc kubenswrapper[4922]: I1122 03:43:46.704980 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0"
Nov 22 03:43:46 crc kubenswrapper[4922]: I1122 03:43:46.708875 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-hrkcn" event={"ID":"68f450d1-62c7-4e64-b820-d4ee357f7403","Type":"ContainerDied","Data":"23f321a40922800280b110f5b0314b14023d5f88b5c73e390e12ce19d690dd5e"}
Nov 22 03:43:46 crc kubenswrapper[4922]: I1122 03:43:46.708918 4922 scope.go:117] "RemoveContainer" containerID="80f71273258e6772ff1eec68461ae3e742b9b0d6958dce4c4117699cc6b0d4d1"
Nov 22 03:43:46 crc kubenswrapper[4922]: I1122 03:43:46.709094 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-864d5fc68c-hrkcn"
Nov 22 03:43:46 crc kubenswrapper[4922]: I1122 03:43:46.718676 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"77912292-bd24-4c7e-ac39-015abf688ebb","Type":"ContainerStarted","Data":"d71d00492c3c5911fdb6b17fb2e154224942b97911c73815d932e81477140612"}
Nov 22 03:43:46 crc kubenswrapper[4922]: I1122 03:43:46.718746 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"77912292-bd24-4c7e-ac39-015abf688ebb","Type":"ContainerStarted","Data":"a35847a60651184edd346d7d61ecdf84924d894c470540dcea7062773f636f6d"}
Nov 22 03:43:46 crc kubenswrapper[4922]: I1122 03:43:46.730088 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=6.730069029 podStartE2EDuration="6.730069029s" podCreationTimestamp="2025-11-22 03:43:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:43:46.723388388 +0000 UTC m=+3062.761910290" watchObservedRunningTime="2025-11-22 03:43:46.730069029 +0000 UTC m=+3062.768590921"
Nov 22 03:43:46 crc kubenswrapper[4922]: I1122 03:43:46.752965 4922 scope.go:117] "RemoveContainer" containerID="ab24de2828458b3ffbbb704331a7d54ce33344abd83938dd745d1f6b175fe0dd"
Nov 22 03:43:46 crc kubenswrapper[4922]: I1122 03:43:46.763556 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=6.036146411 podStartE2EDuration="13.763531462s" podCreationTimestamp="2025-11-22 03:43:33 +0000 UTC" firstStartedPulling="2025-11-22 03:43:37.829415972 +0000 UTC m=+3053.867937864" lastFinishedPulling="2025-11-22 03:43:45.556801023 +0000 UTC m=+3061.595322915" observedRunningTime="2025-11-22 03:43:46.755768016 +0000 UTC m=+3062.794289918" watchObservedRunningTime="2025-11-22 03:43:46.763531462 +0000 UTC m=+3062.802053354"
Nov 22 03:43:46 crc kubenswrapper[4922]: I1122 03:43:46.787693 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-hrkcn"]
Nov 22 03:43:46 crc kubenswrapper[4922]: I1122 03:43:46.798080 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-hrkcn"]
Nov 22 03:43:47 crc kubenswrapper[4922]: I1122 03:43:47.085906 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Nov 22 03:43:47 crc kubenswrapper[4922]: I1122 03:43:47.086230 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6f124e2e-3c52-46dc-8bb4-e931372a83eb" containerName="ceilometer-central-agent" containerID="cri-o://870a4b67e520d473b30a9deab7fb66faf7e9d012a26d9e12b2a023d47f6744fe" gracePeriod=30
Nov 22 03:43:47 crc kubenswrapper[4922]: I1122 03:43:47.086746 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6f124e2e-3c52-46dc-8bb4-e931372a83eb" containerName="proxy-httpd" containerID="cri-o://430ff11b99526bfac61a4ae87d2e4c9bfbc3b703140f4279d0b366b457e90a7c" gracePeriod=30
Nov 22 03:43:47 crc kubenswrapper[4922]: I1122 03:43:47.086806 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6f124e2e-3c52-46dc-8bb4-e931372a83eb" containerName="sg-core" containerID="cri-o://80f3c6edbcae8cec2067b4a77ed102ed12782e11395cb883e59520eebfb0c11a" gracePeriod=30
Nov 22 03:43:47 crc kubenswrapper[4922]: I1122 03:43:47.086879 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6f124e2e-3c52-46dc-8bb4-e931372a83eb" containerName="ceilometer-notification-agent" containerID="cri-o://e87fb80552e8c9f71547693bedb02798249bc7d23a2b9626811a3f726d469327" gracePeriod=30
Nov 22 03:43:47 crc kubenswrapper[4922]: I1122 03:43:47.313438 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68f450d1-62c7-4e64-b820-d4ee357f7403" path="/var/lib/kubelet/pods/68f450d1-62c7-4e64-b820-d4ee357f7403/volumes"
Nov 22 03:43:47 crc kubenswrapper[4922]: I1122 03:43:47.731859 4922 generic.go:334] "Generic (PLEG): container finished" podID="6f124e2e-3c52-46dc-8bb4-e931372a83eb" containerID="430ff11b99526bfac61a4ae87d2e4c9bfbc3b703140f4279d0b366b457e90a7c" exitCode=0
Nov 22 03:43:47 crc kubenswrapper[4922]: I1122 03:43:47.731895 4922 generic.go:334] "Generic (PLEG): container finished" podID="6f124e2e-3c52-46dc-8bb4-e931372a83eb" containerID="80f3c6edbcae8cec2067b4a77ed102ed12782e11395cb883e59520eebfb0c11a" exitCode=2
Nov 22 03:43:47 crc kubenswrapper[4922]: I1122 03:43:47.731907 4922 generic.go:334] "Generic (PLEG): container finished" podID="6f124e2e-3c52-46dc-8bb4-e931372a83eb" containerID="870a4b67e520d473b30a9deab7fb66faf7e9d012a26d9e12b2a023d47f6744fe" exitCode=0
Nov 22 03:43:47 crc kubenswrapper[4922]: I1122 03:43:47.731976 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f124e2e-3c52-46dc-8bb4-e931372a83eb","Type":"ContainerDied","Data":"430ff11b99526bfac61a4ae87d2e4c9bfbc3b703140f4279d0b366b457e90a7c"}
Nov 22 03:43:47 crc kubenswrapper[4922]: I1122 03:43:47.732009 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f124e2e-3c52-46dc-8bb4-e931372a83eb","Type":"ContainerDied","Data":"80f3c6edbcae8cec2067b4a77ed102ed12782e11395cb883e59520eebfb0c11a"}
Nov 22 03:43:47 crc kubenswrapper[4922]: I1122 03:43:47.732024 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f124e2e-3c52-46dc-8bb4-e931372a83eb","Type":"ContainerDied","Data":"870a4b67e520d473b30a9deab7fb66faf7e9d012a26d9e12b2a023d47f6744fe"}
Nov 22 03:43:50 crc kubenswrapper[4922]: I1122 03:43:50.302246 4922 scope.go:117] "RemoveContainer" containerID="3902a53e42446835819551f58e9b8e408324dfa52a1818359fa0616ad95080dc"
Nov 22 03:43:50 crc kubenswrapper[4922]: E1122 03:43:50.303476 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d"
Nov 22 03:43:53 crc kubenswrapper[4922]: I1122 03:43:53.199056 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7d4dbc5d5b-4ppc9" podUID="858fddfd-d272-4323-8c51-887b9e429b56" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.238:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.238:8443: connect: connection refused"
Nov 22 03:43:53 crc kubenswrapper[4922]: I1122 03:43:53.805200 4922 generic.go:334] "Generic (PLEG): container finished" podID="6f124e2e-3c52-46dc-8bb4-e931372a83eb" containerID="e87fb80552e8c9f71547693bedb02798249bc7d23a2b9626811a3f726d469327" exitCode=0
Nov 22 03:43:53 crc kubenswrapper[4922]: I1122 03:43:53.805311 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f124e2e-3c52-46dc-8bb4-e931372a83eb","Type":"ContainerDied","Data":"e87fb80552e8c9f71547693bedb02798249bc7d23a2b9626811a3f726d469327"}
Nov 22 03:43:53 crc kubenswrapper[4922]: I1122 03:43:53.998622 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.131773 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f124e2e-3c52-46dc-8bb4-e931372a83eb-combined-ca-bundle\") pod \"6f124e2e-3c52-46dc-8bb4-e931372a83eb\" (UID: \"6f124e2e-3c52-46dc-8bb4-e931372a83eb\") "
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.131906 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6f124e2e-3c52-46dc-8bb4-e931372a83eb-sg-core-conf-yaml\") pod \"6f124e2e-3c52-46dc-8bb4-e931372a83eb\" (UID: \"6f124e2e-3c52-46dc-8bb4-e931372a83eb\") "
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.131959 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f124e2e-3c52-46dc-8bb4-e931372a83eb-ceilometer-tls-certs\") pod \"6f124e2e-3c52-46dc-8bb4-e931372a83eb\" (UID: \"6f124e2e-3c52-46dc-8bb4-e931372a83eb\") "
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.131990 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f124e2e-3c52-46dc-8bb4-e931372a83eb-log-httpd\") pod \"6f124e2e-3c52-46dc-8bb4-e931372a83eb\" (UID: \"6f124e2e-3c52-46dc-8bb4-e931372a83eb\") "
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.132066 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f124e2e-3c52-46dc-8bb4-e931372a83eb-config-data\") pod \"6f124e2e-3c52-46dc-8bb4-e931372a83eb\" (UID: \"6f124e2e-3c52-46dc-8bb4-e931372a83eb\") "
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.132107 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f124e2e-3c52-46dc-8bb4-e931372a83eb-scripts\") pod \"6f124e2e-3c52-46dc-8bb4-e931372a83eb\" (UID: \"6f124e2e-3c52-46dc-8bb4-e931372a83eb\") "
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.132138 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f124e2e-3c52-46dc-8bb4-e931372a83eb-run-httpd\") pod \"6f124e2e-3c52-46dc-8bb4-e931372a83eb\" (UID: \"6f124e2e-3c52-46dc-8bb4-e931372a83eb\") "
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.132314 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85tz5\" (UniqueName: \"kubernetes.io/projected/6f124e2e-3c52-46dc-8bb4-e931372a83eb-kube-api-access-85tz5\") pod \"6f124e2e-3c52-46dc-8bb4-e931372a83eb\" (UID: \"6f124e2e-3c52-46dc-8bb4-e931372a83eb\") "
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.133127 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f124e2e-3c52-46dc-8bb4-e931372a83eb-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6f124e2e-3c52-46dc-8bb4-e931372a83eb" (UID: "6f124e2e-3c52-46dc-8bb4-e931372a83eb"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.134242 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f124e2e-3c52-46dc-8bb4-e931372a83eb-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6f124e2e-3c52-46dc-8bb4-e931372a83eb" (UID: "6f124e2e-3c52-46dc-8bb4-e931372a83eb"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.139092 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f124e2e-3c52-46dc-8bb4-e931372a83eb-kube-api-access-85tz5" (OuterVolumeSpecName: "kube-api-access-85tz5") pod "6f124e2e-3c52-46dc-8bb4-e931372a83eb" (UID: "6f124e2e-3c52-46dc-8bb4-e931372a83eb"). InnerVolumeSpecName "kube-api-access-85tz5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.139648 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f124e2e-3c52-46dc-8bb4-e931372a83eb-scripts" (OuterVolumeSpecName: "scripts") pod "6f124e2e-3c52-46dc-8bb4-e931372a83eb" (UID: "6f124e2e-3c52-46dc-8bb4-e931372a83eb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.187218 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f124e2e-3c52-46dc-8bb4-e931372a83eb-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6f124e2e-3c52-46dc-8bb4-e931372a83eb" (UID: "6f124e2e-3c52-46dc-8bb4-e931372a83eb"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.235324 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85tz5\" (UniqueName: \"kubernetes.io/projected/6f124e2e-3c52-46dc-8bb4-e931372a83eb-kube-api-access-85tz5\") on node \"crc\" DevicePath \"\""
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.235366 4922 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6f124e2e-3c52-46dc-8bb4-e931372a83eb-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.235384 4922 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f124e2e-3c52-46dc-8bb4-e931372a83eb-log-httpd\") on node \"crc\" DevicePath \"\""
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.235399 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f124e2e-3c52-46dc-8bb4-e931372a83eb-scripts\") on node \"crc\" DevicePath \"\""
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.235415 4922 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f124e2e-3c52-46dc-8bb4-e931372a83eb-run-httpd\") on node \"crc\" DevicePath \"\""
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.238441 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f124e2e-3c52-46dc-8bb4-e931372a83eb-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "6f124e2e-3c52-46dc-8bb4-e931372a83eb" (UID: "6f124e2e-3c52-46dc-8bb4-e931372a83eb"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.245805 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f124e2e-3c52-46dc-8bb4-e931372a83eb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6f124e2e-3c52-46dc-8bb4-e931372a83eb" (UID: "6f124e2e-3c52-46dc-8bb4-e931372a83eb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.250079 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0"
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.280420 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f124e2e-3c52-46dc-8bb4-e931372a83eb-config-data" (OuterVolumeSpecName: "config-data") pod "6f124e2e-3c52-46dc-8bb4-e931372a83eb" (UID: "6f124e2e-3c52-46dc-8bb4-e931372a83eb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.336879 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f124e2e-3c52-46dc-8bb4-e931372a83eb-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.336913 4922 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f124e2e-3c52-46dc-8bb4-e931372a83eb-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.336950 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f124e2e-3c52-46dc-8bb4-e931372a83eb-config-data\") on node \"crc\" DevicePath \"\""
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.840237 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f124e2e-3c52-46dc-8bb4-e931372a83eb","Type":"ContainerDied","Data":"13a643d27145d35d6a11635063c639af3ae4e6a87ed2396a1306fc643e618a84"}
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.840290 4922 scope.go:117] "RemoveContainer" containerID="430ff11b99526bfac61a4ae87d2e4c9bfbc3b703140f4279d0b366b457e90a7c"
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.840395 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.896089 4922 scope.go:117] "RemoveContainer" containerID="80f3c6edbcae8cec2067b4a77ed102ed12782e11395cb883e59520eebfb0c11a"
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.903245 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.925566 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.935919 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Nov 22 03:43:54 crc kubenswrapper[4922]: E1122 03:43:54.936345 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f124e2e-3c52-46dc-8bb4-e931372a83eb" containerName="ceilometer-central-agent"
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.936365 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f124e2e-3c52-46dc-8bb4-e931372a83eb" containerName="ceilometer-central-agent"
Nov 22 03:43:54 crc kubenswrapper[4922]: E1122 03:43:54.936375 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f124e2e-3c52-46dc-8bb4-e931372a83eb" containerName="ceilometer-notification-agent"
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.936382 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f124e2e-3c52-46dc-8bb4-e931372a83eb" containerName="ceilometer-notification-agent"
Nov 22 03:43:54 crc kubenswrapper[4922]: E1122 03:43:54.936392 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f124e2e-3c52-46dc-8bb4-e931372a83eb" containerName="sg-core"
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.936398 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f124e2e-3c52-46dc-8bb4-e931372a83eb" containerName="sg-core"
Nov 22 03:43:54 crc kubenswrapper[4922]: E1122 03:43:54.936419 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f124e2e-3c52-46dc-8bb4-e931372a83eb" containerName="proxy-httpd"
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.936424 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f124e2e-3c52-46dc-8bb4-e931372a83eb" containerName="proxy-httpd"
Nov 22 03:43:54 crc kubenswrapper[4922]: E1122 03:43:54.936435 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68f450d1-62c7-4e64-b820-d4ee357f7403" containerName="dnsmasq-dns"
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.936444 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="68f450d1-62c7-4e64-b820-d4ee357f7403" containerName="dnsmasq-dns"
Nov 22 03:43:54 crc kubenswrapper[4922]: E1122 03:43:54.936455 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68f450d1-62c7-4e64-b820-d4ee357f7403" containerName="init"
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.936460 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="68f450d1-62c7-4e64-b820-d4ee357f7403" containerName="init"
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.936673 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="68f450d1-62c7-4e64-b820-d4ee357f7403" containerName="dnsmasq-dns"
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.936692 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f124e2e-3c52-46dc-8bb4-e931372a83eb" containerName="sg-core"
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.936702 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f124e2e-3c52-46dc-8bb4-e931372a83eb" containerName="ceilometer-central-agent"
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.936709 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f124e2e-3c52-46dc-8bb4-e931372a83eb" containerName="ceilometer-notification-agent"
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.936726 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f124e2e-3c52-46dc-8bb4-e931372a83eb" containerName="proxy-httpd"
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.938648 4922 scope.go:117] "RemoveContainer" containerID="e87fb80552e8c9f71547693bedb02798249bc7d23a2b9626811a3f726d469327"
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.938776 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.943694 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.944079 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.944386 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.944755 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.966709 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/085686e3-eda2-407d-8131-076777ea14af-scripts\") pod \"ceilometer-0\" (UID: \"085686e3-eda2-407d-8131-076777ea14af\") " pod="openstack/ceilometer-0"
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.966775 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/085686e3-eda2-407d-8131-076777ea14af-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"085686e3-eda2-407d-8131-076777ea14af\") " pod="openstack/ceilometer-0"
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.966814 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/085686e3-eda2-407d-8131-076777ea14af-log-httpd\") pod \"ceilometer-0\" (UID: \"085686e3-eda2-407d-8131-076777ea14af\") " pod="openstack/ceilometer-0"
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.966839 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/085686e3-eda2-407d-8131-076777ea14af-config-data\") pod \"ceilometer-0\" (UID: \"085686e3-eda2-407d-8131-076777ea14af\") " pod="openstack/ceilometer-0"
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.966908 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/085686e3-eda2-407d-8131-076777ea14af-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"085686e3-eda2-407d-8131-076777ea14af\") " pod="openstack/ceilometer-0"
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.966991 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/085686e3-eda2-407d-8131-076777ea14af-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"085686e3-eda2-407d-8131-076777ea14af\") " pod="openstack/ceilometer-0"
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.967023 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn6lb\" (UniqueName: \"kubernetes.io/projected/085686e3-eda2-407d-8131-076777ea14af-kube-api-access-pn6lb\") pod \"ceilometer-0\" (UID: \"085686e3-eda2-407d-8131-076777ea14af\") " pod="openstack/ceilometer-0"
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.967098 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/085686e3-eda2-407d-8131-076777ea14af-run-httpd\") pod \"ceilometer-0\" (UID: \"085686e3-eda2-407d-8131-076777ea14af\") " pod="openstack/ceilometer-0"
Nov 22 03:43:54 crc kubenswrapper[4922]: I1122 03:43:54.980194 4922 scope.go:117] "RemoveContainer" containerID="870a4b67e520d473b30a9deab7fb66faf7e9d012a26d9e12b2a023d47f6744fe"
Nov 22 03:43:55 crc kubenswrapper[4922]: E1122 03:43:55.020450 4922 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f124e2e_3c52_46dc_8bb4_e931372a83eb.slice/crio-13a643d27145d35d6a11635063c639af3ae4e6a87ed2396a1306fc643e618a84\": RecentStats: unable to find data in memory cache]"
Nov 22 03:43:55 crc kubenswrapper[4922]: I1122 03:43:55.068690 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pn6lb\" (UniqueName: \"kubernetes.io/projected/085686e3-eda2-407d-8131-076777ea14af-kube-api-access-pn6lb\") pod \"ceilometer-0\" (UID: \"085686e3-eda2-407d-8131-076777ea14af\") " pod="openstack/ceilometer-0"
Nov 22 03:43:55 crc kubenswrapper[4922]: I1122 03:43:55.069096 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/085686e3-eda2-407d-8131-076777ea14af-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"085686e3-eda2-407d-8131-076777ea14af\") " pod="openstack/ceilometer-0"
Nov 22 03:43:55 crc kubenswrapper[4922]: I1122 03:43:55.069177 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/085686e3-eda2-407d-8131-076777ea14af-run-httpd\") pod \"ceilometer-0\" (UID: \"085686e3-eda2-407d-8131-076777ea14af\") " pod="openstack/ceilometer-0"
Nov 22 03:43:55 crc kubenswrapper[4922]: I1122 03:43:55.069247 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/085686e3-eda2-407d-8131-076777ea14af-scripts\") pod \"ceilometer-0\" (UID: \"085686e3-eda2-407d-8131-076777ea14af\") " pod="openstack/ceilometer-0"
Nov 22 03:43:55 crc kubenswrapper[4922]: I1122 03:43:55.069278 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/085686e3-eda2-407d-8131-076777ea14af-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"085686e3-eda2-407d-8131-076777ea14af\") " pod="openstack/ceilometer-0"
Nov 22 03:43:55 crc kubenswrapper[4922]: I1122 03:43:55.069302 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/085686e3-eda2-407d-8131-076777ea14af-log-httpd\") pod \"ceilometer-0\" (UID: \"085686e3-eda2-407d-8131-076777ea14af\") " pod="openstack/ceilometer-0"
Nov 22 03:43:55 crc kubenswrapper[4922]: I1122 03:43:55.069329 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/085686e3-eda2-407d-8131-076777ea14af-config-data\") pod \"ceilometer-0\" (UID: \"085686e3-eda2-407d-8131-076777ea14af\") " pod="openstack/ceilometer-0"
Nov 22 03:43:55 crc kubenswrapper[4922]: I1122 03:43:55.069360 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/085686e3-eda2-407d-8131-076777ea14af-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"085686e3-eda2-407d-8131-076777ea14af\") " pod="openstack/ceilometer-0"
Nov 22 03:43:55 crc kubenswrapper[4922]: I1122 03:43:55.069649 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/085686e3-eda2-407d-8131-076777ea14af-run-httpd\") pod \"ceilometer-0\" (UID: \"085686e3-eda2-407d-8131-076777ea14af\") " pod="openstack/ceilometer-0"
Nov 22 03:43:55 crc kubenswrapper[4922]: I1122 03:43:55.070244 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/085686e3-eda2-407d-8131-076777ea14af-log-httpd\") pod \"ceilometer-0\" (UID: \"085686e3-eda2-407d-8131-076777ea14af\") " pod="openstack/ceilometer-0"
Nov 22 03:43:55 crc kubenswrapper[4922]: I1122 03:43:55.074597 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/085686e3-eda2-407d-8131-076777ea14af-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"085686e3-eda2-407d-8131-076777ea14af\") " pod="openstack/ceilometer-0"
Nov 22 03:43:55 crc kubenswrapper[4922]: I1122 03:43:55.075247 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/085686e3-eda2-407d-8131-076777ea14af-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"085686e3-eda2-407d-8131-076777ea14af\") " pod="openstack/ceilometer-0"
Nov 22 03:43:55 crc kubenswrapper[4922]: I1122 03:43:55.075621 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/085686e3-eda2-407d-8131-076777ea14af-scripts\") pod \"ceilometer-0\" (UID: \"085686e3-eda2-407d-8131-076777ea14af\") " pod="openstack/ceilometer-0"
Nov 22 03:43:55 crc kubenswrapper[4922]: I1122 03:43:55.075903 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/085686e3-eda2-407d-8131-076777ea14af-config-data\") pod \"ceilometer-0\" (UID: \"085686e3-eda2-407d-8131-076777ea14af\") " pod="openstack/ceilometer-0"
Nov 22 03:43:55 crc kubenswrapper[4922]: I1122 03:43:55.085768 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/085686e3-eda2-407d-8131-076777ea14af-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"085686e3-eda2-407d-8131-076777ea14af\") " pod="openstack/ceilometer-0"
Nov 22 03:43:55 crc kubenswrapper[4922]: I1122 03:43:55.103620 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pn6lb\" (UniqueName: \"kubernetes.io/projected/085686e3-eda2-407d-8131-076777ea14af-kube-api-access-pn6lb\") pod \"ceilometer-0\" (UID: \"085686e3-eda2-407d-8131-076777ea14af\") " pod="openstack/ceilometer-0"
Nov 22 03:43:55 crc kubenswrapper[4922]: I1122 03:43:55.278479 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 22 03:43:55 crc kubenswrapper[4922]: I1122 03:43:55.312734 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f124e2e-3c52-46dc-8bb4-e931372a83eb" path="/var/lib/kubelet/pods/6f124e2e-3c52-46dc-8bb4-e931372a83eb/volumes"
Nov 22 03:43:55 crc kubenswrapper[4922]: I1122 03:43:55.340741 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0"
Nov 22 03:43:55 crc kubenswrapper[4922]: I1122 03:43:55.444572 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"]
Nov 22 03:43:55 crc kubenswrapper[4922]: I1122 03:43:55.807152 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Nov 22 03:43:55 crc kubenswrapper[4922]: W1122 03:43:55.814950 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod085686e3_eda2_407d_8131_076777ea14af.slice/crio-aad95104fec332ce22e24c2325bd20d8a90f28b9baa46e2b92b41736e458601c WatchSource:0}: Error finding container aad95104fec332ce22e24c2325bd20d8a90f28b9baa46e2b92b41736e458601c: Status 404 returned error can't find the container with id aad95104fec332ce22e24c2325bd20d8a90f28b9baa46e2b92b41736e458601c
Nov 22 03:43:55 crc kubenswrapper[4922]: I1122 03:43:55.857543 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"085686e3-eda2-407d-8131-076777ea14af","Type":"ContainerStarted","Data":"aad95104fec332ce22e24c2325bd20d8a90f28b9baa46e2b92b41736e458601c"}
Nov 22 03:43:55 crc kubenswrapper[4922]: I1122 03:43:55.861895 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="fc47a6e0-943d-4503-b268-0ad96699e17c" containerName="manila-scheduler" containerID="cri-o://0ab6c0650b609bd4763dd96356159b183305c24f610037c1ac486a85f70b5147" gracePeriod=30
Nov 22 03:43:55 crc kubenswrapper[4922]: I1122 03:43:55.861922 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="fc47a6e0-943d-4503-b268-0ad96699e17c" containerName="probe" containerID="cri-o://a8dbfe1ed794691243dc16c8b3dce5fe48207adac4d1fce7aafac70b6185b528" gracePeriod=30
Nov 22 03:43:56 crc kubenswrapper[4922]: I1122 03:43:56.850616 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0"
Nov 22 03:43:56 crc kubenswrapper[4922]: I1122 03:43:56.876295 4922 generic.go:334] "Generic (PLEG): container finished" podID="fc47a6e0-943d-4503-b268-0ad96699e17c" containerID="a8dbfe1ed794691243dc16c8b3dce5fe48207adac4d1fce7aafac70b6185b528" exitCode=0
Nov 22 03:43:56 crc kubenswrapper[4922]: I1122 03:43:56.876349 4922 generic.go:334] "Generic (PLEG): container finished" podID="fc47a6e0-943d-4503-b268-0ad96699e17c" containerID="0ab6c0650b609bd4763dd96356159b183305c24f610037c1ac486a85f70b5147" exitCode=0
Nov 22 03:43:56 crc kubenswrapper[4922]: I1122 03:43:56.876378 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"fc47a6e0-943d-4503-b268-0ad96699e17c","Type":"ContainerDied","Data":"a8dbfe1ed794691243dc16c8b3dce5fe48207adac4d1fce7aafac70b6185b528"}
Nov 22 03:43:56 crc kubenswrapper[4922]: I1122 03:43:56.876428 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"fc47a6e0-943d-4503-b268-0ad96699e17c","Type":"ContainerDied","Data":"0ab6c0650b609bd4763dd96356159b183305c24f610037c1ac486a85f70b5147"}
Nov 22 03:43:56 crc kubenswrapper[4922]: I1122 03:43:56.876443 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"fc47a6e0-943d-4503-b268-0ad96699e17c","Type":"ContainerDied","Data":"abf6023a34668252d2d64b5564f935bc74e492effc25156853d7a6c94cf5cb9f"}
Nov 22 03:43:56 crc kubenswrapper[4922]: I1122 03:43:56.876460 4922 scope.go:117] "RemoveContainer" containerID="a8dbfe1ed794691243dc16c8b3dce5fe48207adac4d1fce7aafac70b6185b528"
Nov 22 03:43:56 crc kubenswrapper[4922]: I1122 03:43:56.876626 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0"
Nov 22 03:43:56 crc kubenswrapper[4922]: I1122 03:43:56.904254 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc47a6e0-943d-4503-b268-0ad96699e17c-combined-ca-bundle\") pod \"fc47a6e0-943d-4503-b268-0ad96699e17c\" (UID: \"fc47a6e0-943d-4503-b268-0ad96699e17c\") "
Nov 22 03:43:56 crc kubenswrapper[4922]: I1122 03:43:56.904393 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fc47a6e0-943d-4503-b268-0ad96699e17c-config-data-custom\") pod \"fc47a6e0-943d-4503-b268-0ad96699e17c\" (UID: \"fc47a6e0-943d-4503-b268-0ad96699e17c\") "
Nov 22 03:43:56 crc kubenswrapper[4922]: I1122 03:43:56.904422 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc47a6e0-943d-4503-b268-0ad96699e17c-config-data\") pod \"fc47a6e0-943d-4503-b268-0ad96699e17c\" (UID: \"fc47a6e0-943d-4503-b268-0ad96699e17c\") "
Nov 22 03:43:56 crc kubenswrapper[4922]: I1122 03:43:56.904471 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8cgv\" (UniqueName: \"kubernetes.io/projected/fc47a6e0-943d-4503-b268-0ad96699e17c-kube-api-access-m8cgv\") pod \"fc47a6e0-943d-4503-b268-0ad96699e17c\" (UID: \"fc47a6e0-943d-4503-b268-0ad96699e17c\") "
Nov 22 03:43:56 crc kubenswrapper[4922]: I1122 03:43:56.904526 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc47a6e0-943d-4503-b268-0ad96699e17c-scripts\") pod \"fc47a6e0-943d-4503-b268-0ad96699e17c\" (UID: \"fc47a6e0-943d-4503-b268-0ad96699e17c\") "
Nov 22 03:43:56 crc kubenswrapper[4922]: I1122 03:43:56.904562 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fc47a6e0-943d-4503-b268-0ad96699e17c-etc-machine-id\") pod \"fc47a6e0-943d-4503-b268-0ad96699e17c\" (UID: \"fc47a6e0-943d-4503-b268-0ad96699e17c\") "
Nov 22 03:43:56 crc kubenswrapper[4922]: I1122 03:43:56.905067 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc47a6e0-943d-4503-b268-0ad96699e17c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "fc47a6e0-943d-4503-b268-0ad96699e17c" (UID: "fc47a6e0-943d-4503-b268-0ad96699e17c"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 22 03:43:56 crc kubenswrapper[4922]: I1122 03:43:56.909174 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc47a6e0-943d-4503-b268-0ad96699e17c-kube-api-access-m8cgv" (OuterVolumeSpecName: "kube-api-access-m8cgv") pod "fc47a6e0-943d-4503-b268-0ad96699e17c" (UID: "fc47a6e0-943d-4503-b268-0ad96699e17c"). InnerVolumeSpecName "kube-api-access-m8cgv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 03:43:56 crc kubenswrapper[4922]: I1122 03:43:56.911142 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc47a6e0-943d-4503-b268-0ad96699e17c-scripts" (OuterVolumeSpecName: "scripts") pod "fc47a6e0-943d-4503-b268-0ad96699e17c" (UID: "fc47a6e0-943d-4503-b268-0ad96699e17c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:43:56 crc kubenswrapper[4922]: I1122 03:43:56.922895 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc47a6e0-943d-4503-b268-0ad96699e17c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "fc47a6e0-943d-4503-b268-0ad96699e17c" (UID: "fc47a6e0-943d-4503-b268-0ad96699e17c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:43:56 crc kubenswrapper[4922]: I1122 03:43:56.929553 4922 scope.go:117] "RemoveContainer" containerID="0ab6c0650b609bd4763dd96356159b183305c24f610037c1ac486a85f70b5147"
Nov 22 03:43:56 crc kubenswrapper[4922]: I1122 03:43:56.967359 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc47a6e0-943d-4503-b268-0ad96699e17c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fc47a6e0-943d-4503-b268-0ad96699e17c" (UID: "fc47a6e0-943d-4503-b268-0ad96699e17c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:43:57 crc kubenswrapper[4922]: I1122 03:43:57.007792 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc47a6e0-943d-4503-b268-0ad96699e17c-scripts\") on node \"crc\" DevicePath \"\""
Nov 22 03:43:57 crc kubenswrapper[4922]: I1122 03:43:57.007905 4922 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fc47a6e0-943d-4503-b268-0ad96699e17c-etc-machine-id\") on node \"crc\" DevicePath \"\""
Nov 22 03:43:57 crc kubenswrapper[4922]: I1122 03:43:57.007919 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc47a6e0-943d-4503-b268-0ad96699e17c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 03:43:57 crc kubenswrapper[4922]: I1122 03:43:57.007931 4922 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fc47a6e0-943d-4503-b268-0ad96699e17c-config-data-custom\") on node \"crc\" DevicePath \"\""
Nov 22 03:43:57 crc kubenswrapper[4922]: I1122 03:43:57.007943 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8cgv\" (UniqueName: \"kubernetes.io/projected/fc47a6e0-943d-4503-b268-0ad96699e17c-kube-api-access-m8cgv\") on node \"crc\" DevicePath \"\""
Nov 22 03:43:57 crc kubenswrapper[4922]: I1122 03:43:57.018420 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc47a6e0-943d-4503-b268-0ad96699e17c-config-data" (OuterVolumeSpecName: "config-data") pod "fc47a6e0-943d-4503-b268-0ad96699e17c" (UID: "fc47a6e0-943d-4503-b268-0ad96699e17c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:43:57 crc kubenswrapper[4922]: I1122 03:43:57.104015 4922 scope.go:117] "RemoveContainer" containerID="a8dbfe1ed794691243dc16c8b3dce5fe48207adac4d1fce7aafac70b6185b528"
Nov 22 03:43:57 crc kubenswrapper[4922]: E1122 03:43:57.104654 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8dbfe1ed794691243dc16c8b3dce5fe48207adac4d1fce7aafac70b6185b528\": container with ID starting with a8dbfe1ed794691243dc16c8b3dce5fe48207adac4d1fce7aafac70b6185b528 not found: ID does not exist" containerID="a8dbfe1ed794691243dc16c8b3dce5fe48207adac4d1fce7aafac70b6185b528"
Nov 22 03:43:57 crc kubenswrapper[4922]: I1122 03:43:57.104682 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8dbfe1ed794691243dc16c8b3dce5fe48207adac4d1fce7aafac70b6185b528"} err="failed to get container status \"a8dbfe1ed794691243dc16c8b3dce5fe48207adac4d1fce7aafac70b6185b528\": rpc error: code = NotFound desc = could not find container \"a8dbfe1ed794691243dc16c8b3dce5fe48207adac4d1fce7aafac70b6185b528\": container with ID starting with a8dbfe1ed794691243dc16c8b3dce5fe48207adac4d1fce7aafac70b6185b528 not found: ID does not exist"
Nov 22 03:43:57 crc kubenswrapper[4922]: I1122 03:43:57.104702 4922 scope.go:117] "RemoveContainer" containerID="0ab6c0650b609bd4763dd96356159b183305c24f610037c1ac486a85f70b5147"
Nov 22 03:43:57 crc kubenswrapper[4922]: E1122 03:43:57.105085 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ab6c0650b609bd4763dd96356159b183305c24f610037c1ac486a85f70b5147\": container with ID starting with 0ab6c0650b609bd4763dd96356159b183305c24f610037c1ac486a85f70b5147 not found: ID does not exist" containerID="0ab6c0650b609bd4763dd96356159b183305c24f610037c1ac486a85f70b5147"
Nov 22 03:43:57 crc kubenswrapper[4922]: I1122 03:43:57.105105 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ab6c0650b609bd4763dd96356159b183305c24f610037c1ac486a85f70b5147"} err="failed to get container status \"0ab6c0650b609bd4763dd96356159b183305c24f610037c1ac486a85f70b5147\": rpc error: code = NotFound desc = could not find container \"0ab6c0650b609bd4763dd96356159b183305c24f610037c1ac486a85f70b5147\": container with ID starting with 0ab6c0650b609bd4763dd96356159b183305c24f610037c1ac486a85f70b5147 not found: ID does not exist"
Nov 22 03:43:57 crc kubenswrapper[4922]: I1122 03:43:57.105117 4922 scope.go:117] "RemoveContainer" containerID="a8dbfe1ed794691243dc16c8b3dce5fe48207adac4d1fce7aafac70b6185b528"
Nov 22 03:43:57 crc kubenswrapper[4922]: I1122 03:43:57.105461 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8dbfe1ed794691243dc16c8b3dce5fe48207adac4d1fce7aafac70b6185b528"} err="failed to get container status \"a8dbfe1ed794691243dc16c8b3dce5fe48207adac4d1fce7aafac70b6185b528\": rpc error: code = NotFound desc = could not find container \"a8dbfe1ed794691243dc16c8b3dce5fe48207adac4d1fce7aafac70b6185b528\": container with ID starting with a8dbfe1ed794691243dc16c8b3dce5fe48207adac4d1fce7aafac70b6185b528 not found: ID does not exist"
Nov 22 03:43:57 crc kubenswrapper[4922]: I1122 03:43:57.105501 4922 scope.go:117] "RemoveContainer" containerID="0ab6c0650b609bd4763dd96356159b183305c24f610037c1ac486a85f70b5147"
Nov 22 03:43:57 crc kubenswrapper[4922]: I1122 03:43:57.105812 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ab6c0650b609bd4763dd96356159b183305c24f610037c1ac486a85f70b5147"} err="failed to get container status \"0ab6c0650b609bd4763dd96356159b183305c24f610037c1ac486a85f70b5147\": rpc error: code = NotFound desc = could not find container \"0ab6c0650b609bd4763dd96356159b183305c24f610037c1ac486a85f70b5147\": container with ID starting with 0ab6c0650b609bd4763dd96356159b183305c24f610037c1ac486a85f70b5147 not found: ID does not exist"
Nov 22 03:43:57 crc kubenswrapper[4922]: I1122 03:43:57.110083 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc47a6e0-943d-4503-b268-0ad96699e17c-config-data\") on node \"crc\" DevicePath \"\""
Nov 22 03:43:57 crc kubenswrapper[4922]: I1122 03:43:57.214677 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"]
Nov 22 03:43:57 crc kubenswrapper[4922]: I1122 03:43:57.227320 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-scheduler-0"]
Nov 22 03:43:57 crc kubenswrapper[4922]: I1122 03:43:57.247810 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"]
Nov 22 03:43:57 crc kubenswrapper[4922]: E1122 03:43:57.248227 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc47a6e0-943d-4503-b268-0ad96699e17c" containerName="probe"
Nov 22 03:43:57 crc kubenswrapper[4922]: I1122 03:43:57.248247 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc47a6e0-943d-4503-b268-0ad96699e17c" containerName="probe"
Nov 22 03:43:57 crc kubenswrapper[4922]: E1122 03:43:57.248280 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc47a6e0-943d-4503-b268-0ad96699e17c" containerName="manila-scheduler"
Nov 22 03:43:57 crc kubenswrapper[4922]: I1122 03:43:57.248287 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc47a6e0-943d-4503-b268-0ad96699e17c" containerName="manila-scheduler"
Nov 22 03:43:57 crc kubenswrapper[4922]: I1122 03:43:57.248457 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc47a6e0-943d-4503-b268-0ad96699e17c" containerName="probe"
Nov 22 03:43:57 crc kubenswrapper[4922]: I1122 03:43:57.248480 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc47a6e0-943d-4503-b268-0ad96699e17c" containerName="manila-scheduler"
Nov 22 03:43:57 crc kubenswrapper[4922]: I1122 03:43:57.249535 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0"
Nov 22 03:43:57 crc kubenswrapper[4922]: I1122 03:43:57.252180 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data"
Nov 22 03:43:57 crc kubenswrapper[4922]: I1122 03:43:57.269266 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"]
Nov 22 03:43:57 crc kubenswrapper[4922]: I1122 03:43:57.314547 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21de88bc-c66e-4f93-afbd-b9354b1d7857-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"21de88bc-c66e-4f93-afbd-b9354b1d7857\") " pod="openstack/manila-scheduler-0"
Nov 22 03:43:57 crc kubenswrapper[4922]: I1122 03:43:57.314676 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/21de88bc-c66e-4f93-afbd-b9354b1d7857-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"21de88bc-c66e-4f93-afbd-b9354b1d7857\") " pod="openstack/manila-scheduler-0"
Nov 22 03:43:57 crc kubenswrapper[4922]: I1122 03:43:57.314730 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21de88bc-c66e-4f93-afbd-b9354b1d7857-config-data\") pod \"manila-scheduler-0\" (UID: \"21de88bc-c66e-4f93-afbd-b9354b1d7857\") " pod="openstack/manila-scheduler-0"
Nov 22 03:43:57 crc kubenswrapper[4922]: I1122 03:43:57.314756 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/21de88bc-c66e-4f93-afbd-b9354b1d7857-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"21de88bc-c66e-4f93-afbd-b9354b1d7857\") " pod="openstack/manila-scheduler-0"
Nov 22 03:43:57 crc kubenswrapper[4922]: I1122 03:43:57.314804 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gfhh\" (UniqueName: \"kubernetes.io/projected/21de88bc-c66e-4f93-afbd-b9354b1d7857-kube-api-access-4gfhh\") pod \"manila-scheduler-0\" (UID: \"21de88bc-c66e-4f93-afbd-b9354b1d7857\") " pod="openstack/manila-scheduler-0"
Nov 22 03:43:57 crc kubenswrapper[4922]: I1122 03:43:57.315045 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21de88bc-c66e-4f93-afbd-b9354b1d7857-scripts\") pod \"manila-scheduler-0\" (UID: \"21de88bc-c66e-4f93-afbd-b9354b1d7857\") " pod="openstack/manila-scheduler-0"
Nov 22 03:43:57 crc kubenswrapper[4922]: I1122 03:43:57.321915 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc47a6e0-943d-4503-b268-0ad96699e17c" path="/var/lib/kubelet/pods/fc47a6e0-943d-4503-b268-0ad96699e17c/volumes"
Nov 22 03:43:57 crc kubenswrapper[4922]: I1122 03:43:57.416992 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gfhh\" (UniqueName: \"kubernetes.io/projected/21de88bc-c66e-4f93-afbd-b9354b1d7857-kube-api-access-4gfhh\") pod \"manila-scheduler-0\" (UID: \"21de88bc-c66e-4f93-afbd-b9354b1d7857\") " pod="openstack/manila-scheduler-0"
Nov 22 03:43:57 crc kubenswrapper[4922]: I1122 03:43:57.417337 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21de88bc-c66e-4f93-afbd-b9354b1d7857-scripts\") pod \"manila-scheduler-0\" (UID: \"21de88bc-c66e-4f93-afbd-b9354b1d7857\") " pod="openstack/manila-scheduler-0"
Nov 22 03:43:57 crc kubenswrapper[4922]: I1122 03:43:57.417444 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21de88bc-c66e-4f93-afbd-b9354b1d7857-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"21de88bc-c66e-4f93-afbd-b9354b1d7857\") " pod="openstack/manila-scheduler-0"
Nov 22 03:43:57 crc kubenswrapper[4922]: I1122 03:43:57.417622 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/21de88bc-c66e-4f93-afbd-b9354b1d7857-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"21de88bc-c66e-4f93-afbd-b9354b1d7857\") " pod="openstack/manila-scheduler-0"
Nov 22 03:43:57 crc kubenswrapper[4922]: I1122 03:43:57.417705 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21de88bc-c66e-4f93-afbd-b9354b1d7857-config-data\") pod \"manila-scheduler-0\" (UID: \"21de88bc-c66e-4f93-afbd-b9354b1d7857\") " pod="openstack/manila-scheduler-0"
Nov 22 03:43:57 crc kubenswrapper[4922]: I1122 03:43:57.417770 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/21de88bc-c66e-4f93-afbd-b9354b1d7857-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"21de88bc-c66e-4f93-afbd-b9354b1d7857\") " pod="openstack/manila-scheduler-0"
Nov 22 03:43:57 crc kubenswrapper[4922]: I1122 03:43:57.418128 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/21de88bc-c66e-4f93-afbd-b9354b1d7857-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"21de88bc-c66e-4f93-afbd-b9354b1d7857\") " pod="openstack/manila-scheduler-0"
Nov 22 03:43:57 crc kubenswrapper[4922]: I1122 03:43:57.422594 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/21de88bc-c66e-4f93-afbd-b9354b1d7857-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"21de88bc-c66e-4f93-afbd-b9354b1d7857\") " pod="openstack/manila-scheduler-0"
Nov 22 03:43:57 crc kubenswrapper[4922]: I1122 03:43:57.423639 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21de88bc-c66e-4f93-afbd-b9354b1d7857-config-data\") pod \"manila-scheduler-0\" (UID: \"21de88bc-c66e-4f93-afbd-b9354b1d7857\") " pod="openstack/manila-scheduler-0"
Nov 22 03:43:57 crc kubenswrapper[4922]: I1122 03:43:57.437598 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gfhh\" (UniqueName: \"kubernetes.io/projected/21de88bc-c66e-4f93-afbd-b9354b1d7857-kube-api-access-4gfhh\") pod \"manila-scheduler-0\" (UID: \"21de88bc-c66e-4f93-afbd-b9354b1d7857\") " pod="openstack/manila-scheduler-0"
Nov 22 03:43:57 crc kubenswrapper[4922]: I1122 03:43:57.438338 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21de88bc-c66e-4f93-afbd-b9354b1d7857-scripts\") pod \"manila-scheduler-0\" (UID: \"21de88bc-c66e-4f93-afbd-b9354b1d7857\") " pod="openstack/manila-scheduler-0"
Nov 22 03:43:57 crc kubenswrapper[4922]: I1122 03:43:57.448425 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21de88bc-c66e-4f93-afbd-b9354b1d7857-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"21de88bc-c66e-4f93-afbd-b9354b1d7857\") " pod="openstack/manila-scheduler-0"
Nov 22 03:43:57 crc kubenswrapper[4922]: I1122 03:43:57.579669 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0"
Nov 22 03:43:57 crc kubenswrapper[4922]: I1122 03:43:57.890381 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"085686e3-eda2-407d-8131-076777ea14af","Type":"ContainerStarted","Data":"30f00a1e18887c0446d098cf8ec9cc784e99ded6814c5607b3d59a1a6ef370fb"}
Nov 22 03:43:58 crc kubenswrapper[4922]: I1122 03:43:58.034695 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"]
Nov 22 03:43:58 crc kubenswrapper[4922]: W1122 03:43:58.050754 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21de88bc_c66e_4f93_afbd_b9354b1d7857.slice/crio-f2aca092b424476a384fbd45014a3d87508861f6345b2c3c4b50ff264e7c1e53 WatchSource:0}: Error finding container f2aca092b424476a384fbd45014a3d87508861f6345b2c3c4b50ff264e7c1e53: Status 404 returned error can't find the container with id f2aca092b424476a384fbd45014a3d87508861f6345b2c3c4b50ff264e7c1e53
Nov 22 03:43:58 crc kubenswrapper[4922]: I1122 03:43:58.908279 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"21de88bc-c66e-4f93-afbd-b9354b1d7857","Type":"ContainerStarted","Data":"52470557b9609fa07513c8c217be660782b6dd9fe2db3a24b54c336939e24936"}
Nov 22 03:43:58 crc kubenswrapper[4922]: I1122 03:43:58.908785 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"21de88bc-c66e-4f93-afbd-b9354b1d7857","Type":"ContainerStarted","Data":"f2aca092b424476a384fbd45014a3d87508861f6345b2c3c4b50ff264e7c1e53"}
Nov 22 03:43:58 crc kubenswrapper[4922]: I1122 03:43:58.912789 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"085686e3-eda2-407d-8131-076777ea14af","Type":"ContainerStarted","Data":"fcb3e860a1d9b312157085ec1029459307c328d7e6871d7522215cfabbabc34e"}
Nov 22 03:43:58 crc kubenswrapper[4922]: I1122 03:43:58.912836 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"085686e3-eda2-407d-8131-076777ea14af","Type":"ContainerStarted","Data":"0cbd8c3b3a318156526a7a5005b57acfe9abeed38d7f08c54aaf34241504e276"}
Nov 22 03:43:59 crc kubenswrapper[4922]: I1122 03:43:59.936805 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"21de88bc-c66e-4f93-afbd-b9354b1d7857","Type":"ContainerStarted","Data":"8f4db8fddf24c49a94ffa41684e7f01b1aca17c62212adfcc6ee9e716e0de9e2"}
Nov 22 03:43:59 crc kubenswrapper[4922]: I1122 03:43:59.964248 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=2.964231343 podStartE2EDuration="2.964231343s" podCreationTimestamp="2025-11-22 03:43:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:43:59.954182432 +0000 UTC m=+3075.992704334" watchObservedRunningTime="2025-11-22 03:43:59.964231343 +0000 UTC m=+3076.002753235"
Nov 22 03:44:00 crc kubenswrapper[4922]: I1122 03:44:00.955063 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"085686e3-eda2-407d-8131-076777ea14af","Type":"ContainerStarted","Data":"eac6c84dbad460f0e7241d481f11e163cc303fc5a59ab6cf00ab5b7b586b5461"}
Nov 22 03:44:00 crc kubenswrapper[4922]: I1122 03:44:00.996189 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.843339356 podStartE2EDuration="6.996140524s" podCreationTimestamp="2025-11-22 03:43:54 +0000 UTC" firstStartedPulling="2025-11-22 03:43:55.817348037 +0000 UTC m=+3071.855869929" lastFinishedPulling="2025-11-22 03:43:59.970149175 +0000 UTC m=+3076.008671097" observedRunningTime="2025-11-22 03:44:00.990517669 +0000 UTC m=+3077.029039651" watchObservedRunningTime="2025-11-22 03:44:00.996140524 +0000 UTC m=+3077.034662446"
Nov 22 03:44:01 crc kubenswrapper[4922]: I1122 03:44:01.966747 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Nov 22 03:44:02 crc kubenswrapper[4922]: I1122 03:44:02.309899 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0"
Nov 22 03:44:03 crc kubenswrapper[4922]: I1122 03:44:03.197585 4922 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7d4dbc5d5b-4ppc9" podUID="858fddfd-d272-4323-8c51-887b9e429b56" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.238:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.238:8443: connect: connection refused"
Nov 22 03:44:05 crc kubenswrapper[4922]: I1122 03:44:05.307269 4922 scope.go:117] "RemoveContainer" containerID="3902a53e42446835819551f58e9b8e408324dfa52a1818359fa0616ad95080dc"
Nov 22 03:44:05 crc kubenswrapper[4922]: E1122 03:44:05.307865 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d"
Nov 22 03:44:05 crc kubenswrapper[4922]: I1122 03:44:05.713448 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0"
Nov 22 03:44:05 crc kubenswrapper[4922]: I1122 03:44:05.779253 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"]
Nov 22 03:44:06 crc kubenswrapper[4922]: I1122 03:44:06.006349 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="77912292-bd24-4c7e-ac39-015abf688ebb" containerName="manila-share" containerID="cri-o://a35847a60651184edd346d7d61ecdf84924d894c470540dcea7062773f636f6d" gracePeriod=30
Nov 22 03:44:06 crc kubenswrapper[4922]: I1122 03:44:06.006469 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="77912292-bd24-4c7e-ac39-015abf688ebb" containerName="probe" containerID="cri-o://d71d00492c3c5911fdb6b17fb2e154224942b97911c73815d932e81477140612" gracePeriod=30
Nov 22 03:44:07 crc kubenswrapper[4922]: I1122 03:44:07.021131 4922 generic.go:334] "Generic (PLEG): container finished" podID="77912292-bd24-4c7e-ac39-015abf688ebb" containerID="d71d00492c3c5911fdb6b17fb2e154224942b97911c73815d932e81477140612" exitCode=0
Nov 22 03:44:07 crc kubenswrapper[4922]: I1122 03:44:07.021446 4922 generic.go:334] "Generic (PLEG): container finished" podID="77912292-bd24-4c7e-ac39-015abf688ebb" containerID="a35847a60651184edd346d7d61ecdf84924d894c470540dcea7062773f636f6d" exitCode=1
Nov 22 03:44:07 crc kubenswrapper[4922]: I1122 03:44:07.021475 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"77912292-bd24-4c7e-ac39-015abf688ebb","Type":"ContainerDied","Data":"d71d00492c3c5911fdb6b17fb2e154224942b97911c73815d932e81477140612"}
Nov 22 03:44:07 crc kubenswrapper[4922]: I1122 03:44:07.021510 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"77912292-bd24-4c7e-ac39-015abf688ebb","Type":"ContainerDied","Data":"a35847a60651184edd346d7d61ecdf84924d894c470540dcea7062773f636f6d"}
Nov 22 03:44:07 crc kubenswrapper[4922]: I1122 03:44:07.351529 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0"
Nov 22 03:44:07 crc kubenswrapper[4922]: I1122 03:44:07.448126 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77912292-bd24-4c7e-ac39-015abf688ebb-combined-ca-bundle\") pod \"77912292-bd24-4c7e-ac39-015abf688ebb\" (UID: \"77912292-bd24-4c7e-ac39-015abf688ebb\") "
Nov 22 03:44:07 crc kubenswrapper[4922]: I1122 03:44:07.449043 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/77912292-bd24-4c7e-ac39-015abf688ebb-ceph\") pod \"77912292-bd24-4c7e-ac39-015abf688ebb\" (UID: \"77912292-bd24-4c7e-ac39-015abf688ebb\") "
Nov 22 03:44:07 crc kubenswrapper[4922]: I1122 03:44:07.449197 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/77912292-bd24-4c7e-ac39-015abf688ebb-config-data-custom\") pod \"77912292-bd24-4c7e-ac39-015abf688ebb\" (UID: \"77912292-bd24-4c7e-ac39-015abf688ebb\") "
Nov 22 03:44:07 crc kubenswrapper[4922]: I1122 03:44:07.449223 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/77912292-bd24-4c7e-ac39-015abf688ebb-etc-machine-id\") pod \"77912292-bd24-4c7e-ac39-015abf688ebb\" (UID: \"77912292-bd24-4c7e-ac39-015abf688ebb\") "
Nov 22 03:44:07 crc kubenswrapper[4922]: I1122 03:44:07.449243 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77912292-bd24-4c7e-ac39-015abf688ebb-config-data\") pod \"77912292-bd24-4c7e-ac39-015abf688ebb\" (UID: \"77912292-bd24-4c7e-ac39-015abf688ebb\") "
Nov 22 03:44:07 crc kubenswrapper[4922]: I1122 03:44:07.449283 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/77912292-bd24-4c7e-ac39-015abf688ebb-var-lib-manila\") pod \"77912292-bd24-4c7e-ac39-015abf688ebb\" (UID: \"77912292-bd24-4c7e-ac39-015abf688ebb\") "
Nov 22 03:44:07 crc kubenswrapper[4922]: I1122 03:44:07.449428 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chkjk\" (UniqueName: \"kubernetes.io/projected/77912292-bd24-4c7e-ac39-015abf688ebb-kube-api-access-chkjk\") pod \"77912292-bd24-4c7e-ac39-015abf688ebb\" (UID: \"77912292-bd24-4c7e-ac39-015abf688ebb\") "
Nov 22 03:44:07 crc kubenswrapper[4922]: I1122 03:44:07.449457 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/77912292-bd24-4c7e-ac39-015abf688ebb-var-lib-manila" (OuterVolumeSpecName: "var-lib-manila") pod "77912292-bd24-4c7e-ac39-015abf688ebb" (UID: "77912292-bd24-4c7e-ac39-015abf688ebb"). InnerVolumeSpecName "var-lib-manila". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 22 03:44:07 crc kubenswrapper[4922]: I1122 03:44:07.449479 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77912292-bd24-4c7e-ac39-015abf688ebb-scripts\") pod \"77912292-bd24-4c7e-ac39-015abf688ebb\" (UID: \"77912292-bd24-4c7e-ac39-015abf688ebb\") "
Nov 22 03:44:07 crc kubenswrapper[4922]: I1122 03:44:07.449594 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/77912292-bd24-4c7e-ac39-015abf688ebb-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "77912292-bd24-4c7e-ac39-015abf688ebb" (UID: "77912292-bd24-4c7e-ac39-015abf688ebb"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 22 03:44:07 crc kubenswrapper[4922]: I1122 03:44:07.450088 4922 reconciler_common.go:293] "Volume detached for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/77912292-bd24-4c7e-ac39-015abf688ebb-var-lib-manila\") on node \"crc\" DevicePath \"\""
Nov 22 03:44:07 crc kubenswrapper[4922]: I1122 03:44:07.450107 4922 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/77912292-bd24-4c7e-ac39-015abf688ebb-etc-machine-id\") on node \"crc\" DevicePath \"\""
Nov 22 03:44:07 crc kubenswrapper[4922]: I1122 03:44:07.459803 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77912292-bd24-4c7e-ac39-015abf688ebb-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "77912292-bd24-4c7e-ac39-015abf688ebb" (UID: "77912292-bd24-4c7e-ac39-015abf688ebb"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 03:44:07 crc kubenswrapper[4922]: I1122 03:44:07.460369 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77912292-bd24-4c7e-ac39-015abf688ebb-kube-api-access-chkjk" (OuterVolumeSpecName: "kube-api-access-chkjk") pod "77912292-bd24-4c7e-ac39-015abf688ebb" (UID: "77912292-bd24-4c7e-ac39-015abf688ebb"). InnerVolumeSpecName "kube-api-access-chkjk".
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:44:07 crc kubenswrapper[4922]: I1122 03:44:07.461745 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77912292-bd24-4c7e-ac39-015abf688ebb-ceph" (OuterVolumeSpecName: "ceph") pod "77912292-bd24-4c7e-ac39-015abf688ebb" (UID: "77912292-bd24-4c7e-ac39-015abf688ebb"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:44:07 crc kubenswrapper[4922]: I1122 03:44:07.462132 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77912292-bd24-4c7e-ac39-015abf688ebb-scripts" (OuterVolumeSpecName: "scripts") pod "77912292-bd24-4c7e-ac39-015abf688ebb" (UID: "77912292-bd24-4c7e-ac39-015abf688ebb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:44:07 crc kubenswrapper[4922]: I1122 03:44:07.503550 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77912292-bd24-4c7e-ac39-015abf688ebb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "77912292-bd24-4c7e-ac39-015abf688ebb" (UID: "77912292-bd24-4c7e-ac39-015abf688ebb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:44:07 crc kubenswrapper[4922]: I1122 03:44:07.551862 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77912292-bd24-4c7e-ac39-015abf688ebb-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:44:07 crc kubenswrapper[4922]: I1122 03:44:07.551891 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77912292-bd24-4c7e-ac39-015abf688ebb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:44:07 crc kubenswrapper[4922]: I1122 03:44:07.551903 4922 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/77912292-bd24-4c7e-ac39-015abf688ebb-ceph\") on node \"crc\" DevicePath \"\"" Nov 22 03:44:07 crc kubenswrapper[4922]: I1122 03:44:07.551916 4922 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/77912292-bd24-4c7e-ac39-015abf688ebb-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 22 03:44:07 crc kubenswrapper[4922]: I1122 03:44:07.551927 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chkjk\" (UniqueName: \"kubernetes.io/projected/77912292-bd24-4c7e-ac39-015abf688ebb-kube-api-access-chkjk\") on node \"crc\" DevicePath \"\"" Nov 22 03:44:07 crc kubenswrapper[4922]: I1122 03:44:07.561452 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77912292-bd24-4c7e-ac39-015abf688ebb-config-data" (OuterVolumeSpecName: "config-data") pod "77912292-bd24-4c7e-ac39-015abf688ebb" (UID: "77912292-bd24-4c7e-ac39-015abf688ebb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:44:07 crc kubenswrapper[4922]: I1122 03:44:07.579868 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Nov 22 03:44:07 crc kubenswrapper[4922]: I1122 03:44:07.654280 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77912292-bd24-4c7e-ac39-015abf688ebb-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:44:08 crc kubenswrapper[4922]: I1122 03:44:08.031541 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"77912292-bd24-4c7e-ac39-015abf688ebb","Type":"ContainerDied","Data":"a2b9244b6fe342714d52cf20842311701aa5d16a3093e99f0717bcb7f44f5d6d"} Nov 22 03:44:08 crc kubenswrapper[4922]: I1122 03:44:08.031582 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Nov 22 03:44:08 crc kubenswrapper[4922]: I1122 03:44:08.031599 4922 scope.go:117] "RemoveContainer" containerID="d71d00492c3c5911fdb6b17fb2e154224942b97911c73815d932e81477140612" Nov 22 03:44:08 crc kubenswrapper[4922]: I1122 03:44:08.056951 4922 scope.go:117] "RemoveContainer" containerID="a35847a60651184edd346d7d61ecdf84924d894c470540dcea7062773f636f6d" Nov 22 03:44:08 crc kubenswrapper[4922]: I1122 03:44:08.063226 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Nov 22 03:44:08 crc kubenswrapper[4922]: I1122 03:44:08.070538 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-share-share1-0"] Nov 22 03:44:08 crc kubenswrapper[4922]: I1122 03:44:08.093593 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Nov 22 03:44:08 crc kubenswrapper[4922]: E1122 03:44:08.094049 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77912292-bd24-4c7e-ac39-015abf688ebb" containerName="probe" Nov 22 03:44:08 crc kubenswrapper[4922]: I1122 03:44:08.094067 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="77912292-bd24-4c7e-ac39-015abf688ebb" containerName="probe" Nov 22 03:44:08 crc kubenswrapper[4922]: E1122 03:44:08.094101 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77912292-bd24-4c7e-ac39-015abf688ebb" containerName="manila-share" Nov 22 03:44:08 crc kubenswrapper[4922]: I1122 03:44:08.094108 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="77912292-bd24-4c7e-ac39-015abf688ebb" containerName="manila-share" Nov 22 03:44:08 crc kubenswrapper[4922]: I1122 03:44:08.094277 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="77912292-bd24-4c7e-ac39-015abf688ebb" containerName="probe" Nov 22 03:44:08 crc kubenswrapper[4922]: I1122 03:44:08.094297 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="77912292-bd24-4c7e-ac39-015abf688ebb" containerName="manila-share" Nov 22 03:44:08 crc kubenswrapper[4922]: I1122 03:44:08.095300 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Nov 22 03:44:08 crc kubenswrapper[4922]: I1122 03:44:08.097352 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Nov 22 03:44:08 crc kubenswrapper[4922]: I1122 03:44:08.103504 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Nov 22 03:44:08 crc kubenswrapper[4922]: I1122 03:44:08.161653 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2m9h\" (UniqueName: \"kubernetes.io/projected/28375d16-6e26-4490-a1a5-e90290f09e19-kube-api-access-s2m9h\") pod \"manila-share-share1-0\" (UID: \"28375d16-6e26-4490-a1a5-e90290f09e19\") " pod="openstack/manila-share-share1-0" Nov 22 03:44:08 crc kubenswrapper[4922]: I1122 03:44:08.161729 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28375d16-6e26-4490-a1a5-e90290f09e19-scripts\") pod \"manila-share-share1-0\" (UID: \"28375d16-6e26-4490-a1a5-e90290f09e19\") " pod="openstack/manila-share-share1-0" Nov 22 03:44:08 crc kubenswrapper[4922]: I1122 03:44:08.161778 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/28375d16-6e26-4490-a1a5-e90290f09e19-ceph\") pod \"manila-share-share1-0\" (UID: \"28375d16-6e26-4490-a1a5-e90290f09e19\") " pod="openstack/manila-share-share1-0" Nov 22 03:44:08 crc kubenswrapper[4922]: I1122 03:44:08.161797 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/28375d16-6e26-4490-a1a5-e90290f09e19-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"28375d16-6e26-4490-a1a5-e90290f09e19\") " pod="openstack/manila-share-share1-0" Nov 22 03:44:08 crc kubenswrapper[4922]: I1122 03:44:08.161819 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/28375d16-6e26-4490-a1a5-e90290f09e19-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"28375d16-6e26-4490-a1a5-e90290f09e19\") " pod="openstack/manila-share-share1-0" Nov 22 03:44:08 crc kubenswrapper[4922]: I1122 03:44:08.161845 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28375d16-6e26-4490-a1a5-e90290f09e19-config-data\") pod \"manila-share-share1-0\" (UID: \"28375d16-6e26-4490-a1a5-e90290f09e19\") " pod="openstack/manila-share-share1-0" Nov 22 03:44:08 crc kubenswrapper[4922]: I1122 03:44:08.161905 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28375d16-6e26-4490-a1a5-e90290f09e19-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"28375d16-6e26-4490-a1a5-e90290f09e19\") " pod="openstack/manila-share-share1-0" Nov 22 03:44:08 crc kubenswrapper[4922]: I1122 03:44:08.161944 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28375d16-6e26-4490-a1a5-e90290f09e19-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"28375d16-6e26-4490-a1a5-e90290f09e19\") " pod="openstack/manila-share-share1-0" Nov 22 03:44:08 crc 
kubenswrapper[4922]: I1122 03:44:08.263395 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2m9h\" (UniqueName: \"kubernetes.io/projected/28375d16-6e26-4490-a1a5-e90290f09e19-kube-api-access-s2m9h\") pod \"manila-share-share1-0\" (UID: \"28375d16-6e26-4490-a1a5-e90290f09e19\") " pod="openstack/manila-share-share1-0" Nov 22 03:44:08 crc kubenswrapper[4922]: I1122 03:44:08.263497 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28375d16-6e26-4490-a1a5-e90290f09e19-scripts\") pod \"manila-share-share1-0\" (UID: \"28375d16-6e26-4490-a1a5-e90290f09e19\") " pod="openstack/manila-share-share1-0" Nov 22 03:44:08 crc kubenswrapper[4922]: I1122 03:44:08.263548 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/28375d16-6e26-4490-a1a5-e90290f09e19-ceph\") pod \"manila-share-share1-0\" (UID: \"28375d16-6e26-4490-a1a5-e90290f09e19\") " pod="openstack/manila-share-share1-0" Nov 22 03:44:08 crc kubenswrapper[4922]: I1122 03:44:08.263569 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/28375d16-6e26-4490-a1a5-e90290f09e19-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"28375d16-6e26-4490-a1a5-e90290f09e19\") " pod="openstack/manila-share-share1-0" Nov 22 03:44:08 crc kubenswrapper[4922]: I1122 03:44:08.263593 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/28375d16-6e26-4490-a1a5-e90290f09e19-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"28375d16-6e26-4490-a1a5-e90290f09e19\") " pod="openstack/manila-share-share1-0" Nov 22 03:44:08 crc kubenswrapper[4922]: I1122 03:44:08.263609 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28375d16-6e26-4490-a1a5-e90290f09e19-config-data\") pod \"manila-share-share1-0\" (UID: \"28375d16-6e26-4490-a1a5-e90290f09e19\") " pod="openstack/manila-share-share1-0" Nov 22 03:44:08 crc kubenswrapper[4922]: I1122 03:44:08.263639 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28375d16-6e26-4490-a1a5-e90290f09e19-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"28375d16-6e26-4490-a1a5-e90290f09e19\") " pod="openstack/manila-share-share1-0" Nov 22 03:44:08 crc kubenswrapper[4922]: I1122 03:44:08.263682 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28375d16-6e26-4490-a1a5-e90290f09e19-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"28375d16-6e26-4490-a1a5-e90290f09e19\") " pod="openstack/manila-share-share1-0" Nov 22 03:44:08 crc kubenswrapper[4922]: I1122 03:44:08.264486 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/28375d16-6e26-4490-a1a5-e90290f09e19-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"28375d16-6e26-4490-a1a5-e90290f09e19\") " pod="openstack/manila-share-share1-0" Nov 22 03:44:08 crc kubenswrapper[4922]: I1122 03:44:08.264822 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: 
\"kubernetes.io/host-path/28375d16-6e26-4490-a1a5-e90290f09e19-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"28375d16-6e26-4490-a1a5-e90290f09e19\") " pod="openstack/manila-share-share1-0" Nov 22 03:44:08 crc kubenswrapper[4922]: I1122 03:44:08.269144 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28375d16-6e26-4490-a1a5-e90290f09e19-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"28375d16-6e26-4490-a1a5-e90290f09e19\") " pod="openstack/manila-share-share1-0" Nov 22 03:44:08 crc kubenswrapper[4922]: I1122 03:44:08.270011 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/28375d16-6e26-4490-a1a5-e90290f09e19-ceph\") pod \"manila-share-share1-0\" (UID: \"28375d16-6e26-4490-a1a5-e90290f09e19\") " pod="openstack/manila-share-share1-0" Nov 22 03:44:08 crc kubenswrapper[4922]: I1122 03:44:08.270144 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28375d16-6e26-4490-a1a5-e90290f09e19-config-data\") pod \"manila-share-share1-0\" (UID: \"28375d16-6e26-4490-a1a5-e90290f09e19\") " pod="openstack/manila-share-share1-0" Nov 22 03:44:08 crc kubenswrapper[4922]: I1122 03:44:08.272534 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28375d16-6e26-4490-a1a5-e90290f09e19-scripts\") pod \"manila-share-share1-0\" (UID: \"28375d16-6e26-4490-a1a5-e90290f09e19\") " pod="openstack/manila-share-share1-0" Nov 22 03:44:08 crc kubenswrapper[4922]: I1122 03:44:08.276501 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28375d16-6e26-4490-a1a5-e90290f09e19-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"28375d16-6e26-4490-a1a5-e90290f09e19\") " pod="openstack/manila-share-share1-0" Nov 22 03:44:08 crc kubenswrapper[4922]: I1122 03:44:08.287379 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2m9h\" (UniqueName: \"kubernetes.io/projected/28375d16-6e26-4490-a1a5-e90290f09e19-kube-api-access-s2m9h\") pod \"manila-share-share1-0\" (UID: \"28375d16-6e26-4490-a1a5-e90290f09e19\") " pod="openstack/manila-share-share1-0" Nov 22 03:44:08 crc kubenswrapper[4922]: I1122 03:44:08.415285 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Nov 22 03:44:09 crc kubenswrapper[4922]: I1122 03:44:09.008529 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Nov 22 03:44:09 crc kubenswrapper[4922]: I1122 03:44:09.042603 4922 generic.go:334] "Generic (PLEG): container finished" podID="858fddfd-d272-4323-8c51-887b9e429b56" containerID="3de2ae7d25f3a188f43d026c82c62adb0e8b1e72d6c7f472ec2a56b7a417f58a" exitCode=137 Nov 22 03:44:09 crc kubenswrapper[4922]: I1122 03:44:09.042672 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d4dbc5d5b-4ppc9" event={"ID":"858fddfd-d272-4323-8c51-887b9e429b56","Type":"ContainerDied","Data":"3de2ae7d25f3a188f43d026c82c62adb0e8b1e72d6c7f472ec2a56b7a417f58a"} Nov 22 03:44:09 crc kubenswrapper[4922]: I1122 03:44:09.046894 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"28375d16-6e26-4490-a1a5-e90290f09e19","Type":"ContainerStarted","Data":"cdd38095f69a9d95f9f48311c5b93b77d5e030e6ff6f190d6f3bf82021d48831"} Nov 22 03:44:09 crc kubenswrapper[4922]: I1122 03:44:09.201914 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7d4dbc5d5b-4ppc9" Nov 22 03:44:09 crc kubenswrapper[4922]: I1122 03:44:09.287480 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chqmx\" (UniqueName: \"kubernetes.io/projected/858fddfd-d272-4323-8c51-887b9e429b56-kube-api-access-chqmx\") pod \"858fddfd-d272-4323-8c51-887b9e429b56\" (UID: \"858fddfd-d272-4323-8c51-887b9e429b56\") " Nov 22 03:44:09 crc kubenswrapper[4922]: I1122 03:44:09.287629 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/858fddfd-d272-4323-8c51-887b9e429b56-horizon-tls-certs\") pod \"858fddfd-d272-4323-8c51-887b9e429b56\" (UID: \"858fddfd-d272-4323-8c51-887b9e429b56\") " Nov 22 03:44:09 crc kubenswrapper[4922]: I1122 03:44:09.287664 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/858fddfd-d272-4323-8c51-887b9e429b56-config-data\") pod \"858fddfd-d272-4323-8c51-887b9e429b56\" (UID: \"858fddfd-d272-4323-8c51-887b9e429b56\") " Nov 22 03:44:09 crc kubenswrapper[4922]: I1122 03:44:09.287693 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/858fddfd-d272-4323-8c51-887b9e429b56-scripts\") pod \"858fddfd-d272-4323-8c51-887b9e429b56\" (UID: \"858fddfd-d272-4323-8c51-887b9e429b56\") " Nov 22 03:44:09 crc kubenswrapper[4922]: I1122 03:44:09.287750 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/858fddfd-d272-4323-8c51-887b9e429b56-logs\") pod \"858fddfd-d272-4323-8c51-887b9e429b56\" (UID: \"858fddfd-d272-4323-8c51-887b9e429b56\") " Nov 22 03:44:09 crc kubenswrapper[4922]: I1122 03:44:09.287799 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/858fddfd-d272-4323-8c51-887b9e429b56-horizon-secret-key\") pod \"858fddfd-d272-4323-8c51-887b9e429b56\" (UID: \"858fddfd-d272-4323-8c51-887b9e429b56\") " Nov 22 03:44:09 crc kubenswrapper[4922]: I1122 03:44:09.287872 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/858fddfd-d272-4323-8c51-887b9e429b56-combined-ca-bundle\") pod \"858fddfd-d272-4323-8c51-887b9e429b56\" (UID: \"858fddfd-d272-4323-8c51-887b9e429b56\") " Nov 22 03:44:09 crc kubenswrapper[4922]: I1122 03:44:09.288655 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/858fddfd-d272-4323-8c51-887b9e429b56-logs" (OuterVolumeSpecName: "logs") pod "858fddfd-d272-4323-8c51-887b9e429b56" (UID: "858fddfd-d272-4323-8c51-887b9e429b56"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:44:09 crc kubenswrapper[4922]: I1122 03:44:09.291479 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/858fddfd-d272-4323-8c51-887b9e429b56-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "858fddfd-d272-4323-8c51-887b9e429b56" (UID: "858fddfd-d272-4323-8c51-887b9e429b56"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:44:09 crc kubenswrapper[4922]: I1122 03:44:09.291533 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/858fddfd-d272-4323-8c51-887b9e429b56-kube-api-access-chqmx" (OuterVolumeSpecName: "kube-api-access-chqmx") pod "858fddfd-d272-4323-8c51-887b9e429b56" (UID: "858fddfd-d272-4323-8c51-887b9e429b56"). InnerVolumeSpecName "kube-api-access-chqmx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:44:09 crc kubenswrapper[4922]: I1122 03:44:09.310498 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/858fddfd-d272-4323-8c51-887b9e429b56-scripts" (OuterVolumeSpecName: "scripts") pod "858fddfd-d272-4323-8c51-887b9e429b56" (UID: "858fddfd-d272-4323-8c51-887b9e429b56"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:44:09 crc kubenswrapper[4922]: I1122 03:44:09.312655 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/858fddfd-d272-4323-8c51-887b9e429b56-config-data" (OuterVolumeSpecName: "config-data") pod "858fddfd-d272-4323-8c51-887b9e429b56" (UID: "858fddfd-d272-4323-8c51-887b9e429b56"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:44:09 crc kubenswrapper[4922]: I1122 03:44:09.315297 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77912292-bd24-4c7e-ac39-015abf688ebb" path="/var/lib/kubelet/pods/77912292-bd24-4c7e-ac39-015abf688ebb/volumes" Nov 22 03:44:09 crc kubenswrapper[4922]: I1122 03:44:09.320361 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/858fddfd-d272-4323-8c51-887b9e429b56-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "858fddfd-d272-4323-8c51-887b9e429b56" (UID: "858fddfd-d272-4323-8c51-887b9e429b56"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:44:09 crc kubenswrapper[4922]: I1122 03:44:09.333435 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/858fddfd-d272-4323-8c51-887b9e429b56-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "858fddfd-d272-4323-8c51-887b9e429b56" (UID: "858fddfd-d272-4323-8c51-887b9e429b56"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:44:09 crc kubenswrapper[4922]: I1122 03:44:09.392378 4922 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/858fddfd-d272-4323-8c51-887b9e429b56-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Nov 22 03:44:09 crc kubenswrapper[4922]: I1122 03:44:09.392413 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/858fddfd-d272-4323-8c51-887b9e429b56-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 03:44:09 crc kubenswrapper[4922]: I1122 03:44:09.392426 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chqmx\" (UniqueName: \"kubernetes.io/projected/858fddfd-d272-4323-8c51-887b9e429b56-kube-api-access-chqmx\") on node \"crc\" DevicePath \"\"" Nov 22 03:44:09 crc kubenswrapper[4922]: I1122 03:44:09.392439 4922 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/858fddfd-d272-4323-8c51-887b9e429b56-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 03:44:09 crc kubenswrapper[4922]: I1122 03:44:09.392451 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/858fddfd-d272-4323-8c51-887b9e429b56-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 03:44:09 crc kubenswrapper[4922]: I1122 03:44:09.392463 4922 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/858fddfd-d272-4323-8c51-887b9e429b56-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 03:44:09 crc kubenswrapper[4922]: I1122 03:44:09.392473 4922 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/858fddfd-d272-4323-8c51-887b9e429b56-logs\") on node \"crc\" DevicePath \"\"" Nov 22 03:44:10 crc kubenswrapper[4922]: I1122 03:44:10.055829 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"28375d16-6e26-4490-a1a5-e90290f09e19","Type":"ContainerStarted","Data":"e1ac6e689523051ad99da6a34fd66ba7facfba09cdbebe34c562fbce0bdd5b7a"} Nov 22 03:44:10 crc kubenswrapper[4922]: I1122 03:44:10.057022 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"28375d16-6e26-4490-a1a5-e90290f09e19","Type":"ContainerStarted","Data":"4b39233a707c7c8ddea78b44ba02a4d8caac387b92603e4dbfba0a7894016044"} Nov 22 03:44:10 crc kubenswrapper[4922]: I1122 03:44:10.060085 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d4dbc5d5b-4ppc9" event={"ID":"858fddfd-d272-4323-8c51-887b9e429b56","Type":"ContainerDied","Data":"b352ffe10e5e8f5f403fc935e016af5db55ec547fafe1ac0fad8bcda8a52d60a"} Nov 22 03:44:10 crc kubenswrapper[4922]: I1122 03:44:10.060270 4922 scope.go:117] "RemoveContainer" containerID="4c971541c2188d2f1e9d94e917b91931e33e5da1345a7f71a81e45b051b7161d" Nov 22 03:44:10 crc kubenswrapper[4922]: I1122 03:44:10.060241 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7d4dbc5d5b-4ppc9" Nov 22 03:44:10 crc kubenswrapper[4922]: I1122 03:44:10.084384 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=2.084365604 podStartE2EDuration="2.084365604s" podCreationTimestamp="2025-11-22 03:44:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 03:44:10.084323324 +0000 UTC m=+3086.122845216" watchObservedRunningTime="2025-11-22 03:44:10.084365604 +0000 UTC m=+3086.122887496" Nov 22 03:44:10 crc kubenswrapper[4922]: I1122 03:44:10.107828 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7d4dbc5d5b-4ppc9"] Nov 22 03:44:10 crc kubenswrapper[4922]: I1122 03:44:10.116697 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7d4dbc5d5b-4ppc9"] Nov 22 03:44:10 crc kubenswrapper[4922]: I1122 03:44:10.252947 4922 scope.go:117] "RemoveContainer" containerID="3de2ae7d25f3a188f43d026c82c62adb0e8b1e72d6c7f472ec2a56b7a417f58a" Nov 22 03:44:11 crc kubenswrapper[4922]: I1122 03:44:11.323132 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="858fddfd-d272-4323-8c51-887b9e429b56" path="/var/lib/kubelet/pods/858fddfd-d272-4323-8c51-887b9e429b56/volumes" Nov 22 03:44:18 crc kubenswrapper[4922]: I1122 03:44:18.416448 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Nov 22 03:44:18 crc kubenswrapper[4922]: I1122 03:44:18.975238 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Nov 22 03:44:19 crc kubenswrapper[4922]: I1122 03:44:19.186631 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-r8548"] Nov 22 03:44:19 crc kubenswrapper[4922]: E1122 03:44:19.187105 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="858fddfd-d272-4323-8c51-887b9e429b56" containerName="horizon" Nov 22 03:44:19 crc kubenswrapper[4922]: I1122 03:44:19.187120 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="858fddfd-d272-4323-8c51-887b9e429b56" containerName="horizon" Nov 22 03:44:19 crc kubenswrapper[4922]: E1122 03:44:19.187147 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="858fddfd-d272-4323-8c51-887b9e429b56" containerName="horizon-log" Nov 22 03:44:19 crc kubenswrapper[4922]: I1122 03:44:19.187155 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="858fddfd-d272-4323-8c51-887b9e429b56" containerName="horizon-log" Nov 22 03:44:19 crc kubenswrapper[4922]: I1122 03:44:19.187329 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="858fddfd-d272-4323-8c51-887b9e429b56" containerName="horizon" Nov 22 03:44:19 crc kubenswrapper[4922]: I1122 03:44:19.187346 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="858fddfd-d272-4323-8c51-887b9e429b56" containerName="horizon-log" Nov 22 03:44:19 crc kubenswrapper[4922]: I1122 03:44:19.188708 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-r8548" Nov 22 03:44:19 crc kubenswrapper[4922]: I1122 03:44:19.205918 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r8548"] Nov 22 03:44:19 crc kubenswrapper[4922]: I1122 03:44:19.294481 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqd78\" (UniqueName: \"kubernetes.io/projected/ed40e0c2-2091-4474-8e58-733bbb5cb740-kube-api-access-xqd78\") pod \"certified-operators-r8548\" (UID: \"ed40e0c2-2091-4474-8e58-733bbb5cb740\") " pod="openshift-marketplace/certified-operators-r8548" Nov 22 03:44:19 crc kubenswrapper[4922]: I1122 03:44:19.295261 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed40e0c2-2091-4474-8e58-733bbb5cb740-utilities\") pod \"certified-operators-r8548\" (UID: \"ed40e0c2-2091-4474-8e58-733bbb5cb740\") " pod="openshift-marketplace/certified-operators-r8548" Nov 22 03:44:19 crc kubenswrapper[4922]: I1122 03:44:19.295698 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed40e0c2-2091-4474-8e58-733bbb5cb740-catalog-content\") pod \"certified-operators-r8548\" (UID: \"ed40e0c2-2091-4474-8e58-733bbb5cb740\") " pod="openshift-marketplace/certified-operators-r8548" Nov 22 03:44:19 crc kubenswrapper[4922]: I1122 03:44:19.397953 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqd78\" (UniqueName: \"kubernetes.io/projected/ed40e0c2-2091-4474-8e58-733bbb5cb740-kube-api-access-xqd78\") pod \"certified-operators-r8548\" (UID: \"ed40e0c2-2091-4474-8e58-733bbb5cb740\") " pod="openshift-marketplace/certified-operators-r8548" Nov 22 03:44:19 crc kubenswrapper[4922]: I1122 03:44:19.398021 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed40e0c2-2091-4474-8e58-733bbb5cb740-utilities\") pod \"certified-operators-r8548\" (UID: \"ed40e0c2-2091-4474-8e58-733bbb5cb740\") " pod="openshift-marketplace/certified-operators-r8548" Nov 22 03:44:19 crc kubenswrapper[4922]: I1122 03:44:19.398700 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed40e0c2-2091-4474-8e58-733bbb5cb740-catalog-content\") pod \"certified-operators-r8548\" (UID: \"ed40e0c2-2091-4474-8e58-733bbb5cb740\") " pod="openshift-marketplace/certified-operators-r8548" Nov 22 03:44:19 crc kubenswrapper[4922]: I1122 03:44:19.398891 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed40e0c2-2091-4474-8e58-733bbb5cb740-utilities\") pod \"certified-operators-r8548\" (UID: \"ed40e0c2-2091-4474-8e58-733bbb5cb740\") " pod="openshift-marketplace/certified-operators-r8548" Nov 22 03:44:19 crc kubenswrapper[4922]: I1122 03:44:19.399313 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed40e0c2-2091-4474-8e58-733bbb5cb740-catalog-content\") pod \"certified-operators-r8548\" (UID: \"ed40e0c2-2091-4474-8e58-733bbb5cb740\") " pod="openshift-marketplace/certified-operators-r8548" Nov 22 03:44:19 crc kubenswrapper[4922]: I1122 03:44:19.424109 4922 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-xqd78\" (UniqueName: \"kubernetes.io/projected/ed40e0c2-2091-4474-8e58-733bbb5cb740-kube-api-access-xqd78\") pod \"certified-operators-r8548\" (UID: \"ed40e0c2-2091-4474-8e58-733bbb5cb740\") " pod="openshift-marketplace/certified-operators-r8548" Nov 22 03:44:19 crc kubenswrapper[4922]: I1122 03:44:19.525155 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r8548" Nov 22 03:44:20 crc kubenswrapper[4922]: I1122 03:44:20.025220 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r8548"] Nov 22 03:44:20 crc kubenswrapper[4922]: W1122 03:44:20.027818 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded40e0c2_2091_4474_8e58_733bbb5cb740.slice/crio-279676d0a1955d3ea29d09e97ac6d68ad3fcd00172ad7bbc6e4ca5afec73f905 WatchSource:0}: Error finding container 279676d0a1955d3ea29d09e97ac6d68ad3fcd00172ad7bbc6e4ca5afec73f905: Status 404 returned error can't find the container with id 279676d0a1955d3ea29d09e97ac6d68ad3fcd00172ad7bbc6e4ca5afec73f905 Nov 22 03:44:20 crc kubenswrapper[4922]: I1122 03:44:20.164850 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8548" event={"ID":"ed40e0c2-2091-4474-8e58-733bbb5cb740","Type":"ContainerStarted","Data":"279676d0a1955d3ea29d09e97ac6d68ad3fcd00172ad7bbc6e4ca5afec73f905"} Nov 22 03:44:20 crc kubenswrapper[4922]: I1122 03:44:20.301236 4922 scope.go:117] "RemoveContainer" containerID="3902a53e42446835819551f58e9b8e408324dfa52a1818359fa0616ad95080dc" Nov 22 03:44:20 crc kubenswrapper[4922]: E1122 03:44:20.301530 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:44:21 crc kubenswrapper[4922]: I1122 03:44:21.187208 4922 generic.go:334] "Generic (PLEG): container finished" podID="ed40e0c2-2091-4474-8e58-733bbb5cb740" containerID="90d426e2d5599b55b5ef53a1d2e1fcb81ce0277c555c3ec04ec1def602c8abeb" exitCode=0 Nov 22 03:44:21 crc kubenswrapper[4922]: I1122 03:44:21.187271 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8548" event={"ID":"ed40e0c2-2091-4474-8e58-733bbb5cb740","Type":"ContainerDied","Data":"90d426e2d5599b55b5ef53a1d2e1fcb81ce0277c555c3ec04ec1def602c8abeb"} Nov 22 03:44:21 crc kubenswrapper[4922]: I1122 03:44:21.190113 4922 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 03:44:22 crc kubenswrapper[4922]: I1122 03:44:22.201520 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8548" event={"ID":"ed40e0c2-2091-4474-8e58-733bbb5cb740","Type":"ContainerStarted","Data":"eea04af7b2f5fc7f2edae9ae119daa9ac52c54fb78959227a833474341617166"} Nov 22 03:44:23 crc kubenswrapper[4922]: I1122 03:44:23.220324 4922 generic.go:334] "Generic (PLEG): container finished" podID="ed40e0c2-2091-4474-8e58-733bbb5cb740" containerID="eea04af7b2f5fc7f2edae9ae119daa9ac52c54fb78959227a833474341617166" exitCode=0 Nov 22 03:44:23 crc 
kubenswrapper[4922]: I1122 03:44:23.220474 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8548" event={"ID":"ed40e0c2-2091-4474-8e58-733bbb5cb740","Type":"ContainerDied","Data":"eea04af7b2f5fc7f2edae9ae119daa9ac52c54fb78959227a833474341617166"} Nov 22 03:44:24 crc kubenswrapper[4922]: I1122 03:44:24.233233 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8548" event={"ID":"ed40e0c2-2091-4474-8e58-733bbb5cb740","Type":"ContainerStarted","Data":"5b1f4b063af40b28e6cf90adb99a6ea794ad5e918a5214148aede502974eada6"} Nov 22 03:44:24 crc kubenswrapper[4922]: I1122 03:44:24.254947 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-r8548" podStartSLOduration=2.834986013 podStartE2EDuration="5.254924166s" podCreationTimestamp="2025-11-22 03:44:19 +0000 UTC" firstStartedPulling="2025-11-22 03:44:21.189799108 +0000 UTC m=+3097.228321000" lastFinishedPulling="2025-11-22 03:44:23.609737231 +0000 UTC m=+3099.648259153" observedRunningTime="2025-11-22 03:44:24.252499487 +0000 UTC m=+3100.291021399" watchObservedRunningTime="2025-11-22 03:44:24.254924166 +0000 UTC m=+3100.293446068" Nov 22 03:44:25 crc kubenswrapper[4922]: I1122 03:44:25.293616 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Nov 22 03:44:29 crc kubenswrapper[4922]: I1122 03:44:29.525390 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-r8548" Nov 22 03:44:29 crc kubenswrapper[4922]: I1122 03:44:29.526174 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-r8548" Nov 22 03:44:29 crc kubenswrapper[4922]: I1122 03:44:29.595044 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-r8548" Nov 22 03:44:29 crc kubenswrapper[4922]: I1122 03:44:29.907621 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Nov 22 03:44:30 crc kubenswrapper[4922]: I1122 03:44:30.343440 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-r8548" Nov 22 03:44:30 crc kubenswrapper[4922]: I1122 03:44:30.404563 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r8548"] Nov 22 03:44:32 crc kubenswrapper[4922]: I1122 03:44:32.322138 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-r8548" podUID="ed40e0c2-2091-4474-8e58-733bbb5cb740" containerName="registry-server" containerID="cri-o://5b1f4b063af40b28e6cf90adb99a6ea794ad5e918a5214148aede502974eada6" gracePeriod=2 Nov 22 03:44:32 crc kubenswrapper[4922]: I1122 03:44:32.828543 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-r8548" Nov 22 03:44:32 crc kubenswrapper[4922]: I1122 03:44:32.998394 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed40e0c2-2091-4474-8e58-733bbb5cb740-utilities\") pod \"ed40e0c2-2091-4474-8e58-733bbb5cb740\" (UID: \"ed40e0c2-2091-4474-8e58-733bbb5cb740\") " Nov 22 03:44:32 crc kubenswrapper[4922]: I1122 03:44:32.998930 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqd78\" (UniqueName: \"kubernetes.io/projected/ed40e0c2-2091-4474-8e58-733bbb5cb740-kube-api-access-xqd78\") pod \"ed40e0c2-2091-4474-8e58-733bbb5cb740\" (UID: \"ed40e0c2-2091-4474-8e58-733bbb5cb740\") " Nov 22 03:44:32 crc kubenswrapper[4922]: I1122 03:44:32.998994 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed40e0c2-2091-4474-8e58-733bbb5cb740-catalog-content\") pod \"ed40e0c2-2091-4474-8e58-733bbb5cb740\" (UID: \"ed40e0c2-2091-4474-8e58-733bbb5cb740\") " Nov 22 03:44:32 crc kubenswrapper[4922]: I1122 03:44:32.999594 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed40e0c2-2091-4474-8e58-733bbb5cb740-utilities" (OuterVolumeSpecName: "utilities") pod "ed40e0c2-2091-4474-8e58-733bbb5cb740" (UID: "ed40e0c2-2091-4474-8e58-733bbb5cb740"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:44:33 crc kubenswrapper[4922]: I1122 03:44:33.006063 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed40e0c2-2091-4474-8e58-733bbb5cb740-kube-api-access-xqd78" (OuterVolumeSpecName: "kube-api-access-xqd78") pod "ed40e0c2-2091-4474-8e58-733bbb5cb740" (UID: "ed40e0c2-2091-4474-8e58-733bbb5cb740"). InnerVolumeSpecName "kube-api-access-xqd78". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:44:33 crc kubenswrapper[4922]: I1122 03:44:33.047255 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed40e0c2-2091-4474-8e58-733bbb5cb740-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ed40e0c2-2091-4474-8e58-733bbb5cb740" (UID: "ed40e0c2-2091-4474-8e58-733bbb5cb740"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:44:33 crc kubenswrapper[4922]: I1122 03:44:33.102486 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqd78\" (UniqueName: \"kubernetes.io/projected/ed40e0c2-2091-4474-8e58-733bbb5cb740-kube-api-access-xqd78\") on node \"crc\" DevicePath \"\"" Nov 22 03:44:33 crc kubenswrapper[4922]: I1122 03:44:33.102871 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed40e0c2-2091-4474-8e58-733bbb5cb740-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 03:44:33 crc kubenswrapper[4922]: I1122 03:44:33.103105 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed40e0c2-2091-4474-8e58-733bbb5cb740-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 03:44:33 crc kubenswrapper[4922]: I1122 03:44:33.301258 4922 scope.go:117] "RemoveContainer" containerID="3902a53e42446835819551f58e9b8e408324dfa52a1818359fa0616ad95080dc" Nov 22 03:44:33 crc kubenswrapper[4922]: E1122 03:44:33.302024 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:44:33 crc kubenswrapper[4922]: I1122 03:44:33.337953 4922 generic.go:334] "Generic (PLEG): container finished" podID="ed40e0c2-2091-4474-8e58-733bbb5cb740" containerID="5b1f4b063af40b28e6cf90adb99a6ea794ad5e918a5214148aede502974eada6" exitCode=0 Nov 22 03:44:33 crc kubenswrapper[4922]: I1122 03:44:33.338007 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8548" event={"ID":"ed40e0c2-2091-4474-8e58-733bbb5cb740","Type":"ContainerDied","Data":"5b1f4b063af40b28e6cf90adb99a6ea794ad5e918a5214148aede502974eada6"} Nov 22 03:44:33 crc kubenswrapper[4922]: I1122 03:44:33.338722 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8548" event={"ID":"ed40e0c2-2091-4474-8e58-733bbb5cb740","Type":"ContainerDied","Data":"279676d0a1955d3ea29d09e97ac6d68ad3fcd00172ad7bbc6e4ca5afec73f905"} Nov 22 03:44:33 crc kubenswrapper[4922]: I1122 03:44:33.338055 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-r8548" Nov 22 03:44:33 crc kubenswrapper[4922]: I1122 03:44:33.338741 4922 scope.go:117] "RemoveContainer" containerID="5b1f4b063af40b28e6cf90adb99a6ea794ad5e918a5214148aede502974eada6" Nov 22 03:44:33 crc kubenswrapper[4922]: I1122 03:44:33.367295 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r8548"] Nov 22 03:44:33 crc kubenswrapper[4922]: I1122 03:44:33.373695 4922 scope.go:117] "RemoveContainer" containerID="eea04af7b2f5fc7f2edae9ae119daa9ac52c54fb78959227a833474341617166" Nov 22 03:44:33 crc kubenswrapper[4922]: I1122 03:44:33.376549 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-r8548"] Nov 22 03:44:33 crc kubenswrapper[4922]: I1122 03:44:33.406138 4922 scope.go:117] "RemoveContainer" containerID="90d426e2d5599b55b5ef53a1d2e1fcb81ce0277c555c3ec04ec1def602c8abeb" Nov 22 03:44:33 crc kubenswrapper[4922]: I1122 03:44:33.438579 4922 scope.go:117] "RemoveContainer" containerID="5b1f4b063af40b28e6cf90adb99a6ea794ad5e918a5214148aede502974eada6" Nov 22 03:44:33 crc kubenswrapper[4922]: E1122 03:44:33.439041 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b1f4b063af40b28e6cf90adb99a6ea794ad5e918a5214148aede502974eada6\": container with ID starting with 5b1f4b063af40b28e6cf90adb99a6ea794ad5e918a5214148aede502974eada6 not found: ID does not exist" containerID="5b1f4b063af40b28e6cf90adb99a6ea794ad5e918a5214148aede502974eada6" Nov 22 03:44:33 crc kubenswrapper[4922]: I1122 03:44:33.439086 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b1f4b063af40b28e6cf90adb99a6ea794ad5e918a5214148aede502974eada6"} err="failed to get container status \"5b1f4b063af40b28e6cf90adb99a6ea794ad5e918a5214148aede502974eada6\": rpc error: code = NotFound desc = could not find container \"5b1f4b063af40b28e6cf90adb99a6ea794ad5e918a5214148aede502974eada6\": container with ID starting with 5b1f4b063af40b28e6cf90adb99a6ea794ad5e918a5214148aede502974eada6 not found: ID does not exist" Nov 22 03:44:33 crc kubenswrapper[4922]: I1122 03:44:33.439116 4922 scope.go:117] "RemoveContainer" containerID="eea04af7b2f5fc7f2edae9ae119daa9ac52c54fb78959227a833474341617166" Nov 22 03:44:33 crc kubenswrapper[4922]: E1122 03:44:33.439769 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eea04af7b2f5fc7f2edae9ae119daa9ac52c54fb78959227a833474341617166\": container with ID starting with eea04af7b2f5fc7f2edae9ae119daa9ac52c54fb78959227a833474341617166 not found: ID does not exist" containerID="eea04af7b2f5fc7f2edae9ae119daa9ac52c54fb78959227a833474341617166" Nov 22 03:44:33 crc kubenswrapper[4922]: I1122 03:44:33.439808 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eea04af7b2f5fc7f2edae9ae119daa9ac52c54fb78959227a833474341617166"} err="failed to get container status \"eea04af7b2f5fc7f2edae9ae119daa9ac52c54fb78959227a833474341617166\": rpc error: code = NotFound desc = could not find container \"eea04af7b2f5fc7f2edae9ae119daa9ac52c54fb78959227a833474341617166\": container with ID starting with eea04af7b2f5fc7f2edae9ae119daa9ac52c54fb78959227a833474341617166 not found: ID does not exist" Nov 22 03:44:33 crc kubenswrapper[4922]: I1122 03:44:33.439837 4922 scope.go:117] "RemoveContainer" 
containerID="90d426e2d5599b55b5ef53a1d2e1fcb81ce0277c555c3ec04ec1def602c8abeb" Nov 22 03:44:33 crc kubenswrapper[4922]: E1122 03:44:33.440219 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90d426e2d5599b55b5ef53a1d2e1fcb81ce0277c555c3ec04ec1def602c8abeb\": container with ID starting with 90d426e2d5599b55b5ef53a1d2e1fcb81ce0277c555c3ec04ec1def602c8abeb not found: ID does not exist" containerID="90d426e2d5599b55b5ef53a1d2e1fcb81ce0277c555c3ec04ec1def602c8abeb" Nov 22 03:44:33 crc kubenswrapper[4922]: I1122 03:44:33.440257 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90d426e2d5599b55b5ef53a1d2e1fcb81ce0277c555c3ec04ec1def602c8abeb"} err="failed to get container status \"90d426e2d5599b55b5ef53a1d2e1fcb81ce0277c555c3ec04ec1def602c8abeb\": rpc error: code = NotFound desc = could not find container \"90d426e2d5599b55b5ef53a1d2e1fcb81ce0277c555c3ec04ec1def602c8abeb\": container with ID starting with 90d426e2d5599b55b5ef53a1d2e1fcb81ce0277c555c3ec04ec1def602c8abeb not found: ID does not exist" Nov 22 03:44:35 crc kubenswrapper[4922]: I1122 03:44:35.311839 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed40e0c2-2091-4474-8e58-733bbb5cb740" path="/var/lib/kubelet/pods/ed40e0c2-2091-4474-8e58-733bbb5cb740/volumes" Nov 22 03:44:44 crc kubenswrapper[4922]: I1122 03:44:44.301215 4922 scope.go:117] "RemoveContainer" containerID="3902a53e42446835819551f58e9b8e408324dfa52a1818359fa0616ad95080dc" Nov 22 03:44:44 crc kubenswrapper[4922]: E1122 03:44:44.302334 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:44:59 crc kubenswrapper[4922]: I1122 03:44:59.302066 4922 scope.go:117] "RemoveContainer" containerID="3902a53e42446835819551f58e9b8e408324dfa52a1818359fa0616ad95080dc" Nov 22 03:44:59 crc kubenswrapper[4922]: E1122 03:44:59.303272 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:45:00 crc kubenswrapper[4922]: I1122 03:45:00.178965 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396385-sllcz"] Nov 22 03:45:00 crc kubenswrapper[4922]: E1122 03:45:00.179493 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed40e0c2-2091-4474-8e58-733bbb5cb740" containerName="extract-content" Nov 22 03:45:00 crc kubenswrapper[4922]: I1122 03:45:00.179518 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed40e0c2-2091-4474-8e58-733bbb5cb740" containerName="extract-content" Nov 22 03:45:00 crc kubenswrapper[4922]: E1122 03:45:00.179573 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed40e0c2-2091-4474-8e58-733bbb5cb740" containerName="extract-utilities" Nov 22 03:45:00 crc 
kubenswrapper[4922]: I1122 03:45:00.179584 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed40e0c2-2091-4474-8e58-733bbb5cb740" containerName="extract-utilities" Nov 22 03:45:00 crc kubenswrapper[4922]: E1122 03:45:00.179615 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed40e0c2-2091-4474-8e58-733bbb5cb740" containerName="registry-server" Nov 22 03:45:00 crc kubenswrapper[4922]: I1122 03:45:00.179624 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed40e0c2-2091-4474-8e58-733bbb5cb740" containerName="registry-server" Nov 22 03:45:00 crc kubenswrapper[4922]: I1122 03:45:00.179897 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed40e0c2-2091-4474-8e58-733bbb5cb740" containerName="registry-server" Nov 22 03:45:00 crc kubenswrapper[4922]: I1122 03:45:00.180658 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396385-sllcz" Nov 22 03:45:00 crc kubenswrapper[4922]: I1122 03:45:00.182883 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 22 03:45:00 crc kubenswrapper[4922]: I1122 03:45:00.183438 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 22 03:45:00 crc kubenswrapper[4922]: I1122 03:45:00.187733 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396385-sllcz"] Nov 22 03:45:00 crc kubenswrapper[4922]: I1122 03:45:00.324058 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rckkz\" (UniqueName: \"kubernetes.io/projected/2b6ecc81-9e3b-4565-9e48-121e72c75330-kube-api-access-rckkz\") pod \"collect-profiles-29396385-sllcz\" (UID: \"2b6ecc81-9e3b-4565-9e48-121e72c75330\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396385-sllcz" Nov 22 03:45:00 crc kubenswrapper[4922]: I1122 03:45:00.324187 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2b6ecc81-9e3b-4565-9e48-121e72c75330-secret-volume\") pod \"collect-profiles-29396385-sllcz\" (UID: \"2b6ecc81-9e3b-4565-9e48-121e72c75330\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396385-sllcz" Nov 22 03:45:00 crc kubenswrapper[4922]: I1122 03:45:00.324257 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2b6ecc81-9e3b-4565-9e48-121e72c75330-config-volume\") pod \"collect-profiles-29396385-sllcz\" (UID: \"2b6ecc81-9e3b-4565-9e48-121e72c75330\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396385-sllcz" Nov 22 03:45:00 crc kubenswrapper[4922]: I1122 03:45:00.426993 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rckkz\" (UniqueName: \"kubernetes.io/projected/2b6ecc81-9e3b-4565-9e48-121e72c75330-kube-api-access-rckkz\") pod \"collect-profiles-29396385-sllcz\" (UID: \"2b6ecc81-9e3b-4565-9e48-121e72c75330\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396385-sllcz" Nov 22 03:45:00 crc kubenswrapper[4922]: I1122 03:45:00.427090 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/2b6ecc81-9e3b-4565-9e48-121e72c75330-secret-volume\") pod \"collect-profiles-29396385-sllcz\" (UID: \"2b6ecc81-9e3b-4565-9e48-121e72c75330\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396385-sllcz" Nov 22 03:45:00 crc kubenswrapper[4922]: I1122 03:45:00.427185 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2b6ecc81-9e3b-4565-9e48-121e72c75330-config-volume\") pod \"collect-profiles-29396385-sllcz\" (UID: \"2b6ecc81-9e3b-4565-9e48-121e72c75330\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396385-sllcz" Nov 22 03:45:00 crc kubenswrapper[4922]: I1122 03:45:00.435949 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2b6ecc81-9e3b-4565-9e48-121e72c75330-config-volume\") pod \"collect-profiles-29396385-sllcz\" (UID: \"2b6ecc81-9e3b-4565-9e48-121e72c75330\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396385-sllcz" Nov 22 03:45:00 crc kubenswrapper[4922]: I1122 03:45:00.445269 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2b6ecc81-9e3b-4565-9e48-121e72c75330-secret-volume\") pod \"collect-profiles-29396385-sllcz\" (UID: \"2b6ecc81-9e3b-4565-9e48-121e72c75330\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396385-sllcz" Nov 22 03:45:00 crc kubenswrapper[4922]: I1122 03:45:00.459321 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rckkz\" (UniqueName: \"kubernetes.io/projected/2b6ecc81-9e3b-4565-9e48-121e72c75330-kube-api-access-rckkz\") pod \"collect-profiles-29396385-sllcz\" (UID: \"2b6ecc81-9e3b-4565-9e48-121e72c75330\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396385-sllcz" Nov 22 03:45:00 crc kubenswrapper[4922]: I1122 03:45:00.508950 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396385-sllcz" Nov 22 03:45:00 crc kubenswrapper[4922]: I1122 03:45:00.977229 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396385-sllcz"] Nov 22 03:45:01 crc kubenswrapper[4922]: I1122 03:45:01.668595 4922 generic.go:334] "Generic (PLEG): container finished" podID="2b6ecc81-9e3b-4565-9e48-121e72c75330" containerID="a916b611ca943e59d8578a65928fc27176644c8a956ca4378e9c29d8624d2e83" exitCode=0 Nov 22 03:45:01 crc kubenswrapper[4922]: I1122 03:45:01.669116 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396385-sllcz" event={"ID":"2b6ecc81-9e3b-4565-9e48-121e72c75330","Type":"ContainerDied","Data":"a916b611ca943e59d8578a65928fc27176644c8a956ca4378e9c29d8624d2e83"} Nov 22 03:45:01 crc kubenswrapper[4922]: I1122 03:45:01.669168 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396385-sllcz" event={"ID":"2b6ecc81-9e3b-4565-9e48-121e72c75330","Type":"ContainerStarted","Data":"0fb43751b39df530c0330ba45c0a5b049345cc02adb7dfde0b7a9447065a6230"} Nov 22 03:45:02 crc kubenswrapper[4922]: I1122 03:45:02.993279 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396385-sllcz" Nov 22 03:45:03 crc kubenswrapper[4922]: I1122 03:45:03.088705 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2b6ecc81-9e3b-4565-9e48-121e72c75330-config-volume\") pod \"2b6ecc81-9e3b-4565-9e48-121e72c75330\" (UID: \"2b6ecc81-9e3b-4565-9e48-121e72c75330\") " Nov 22 03:45:03 crc kubenswrapper[4922]: I1122 03:45:03.088758 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2b6ecc81-9e3b-4565-9e48-121e72c75330-secret-volume\") pod \"2b6ecc81-9e3b-4565-9e48-121e72c75330\" (UID: \"2b6ecc81-9e3b-4565-9e48-121e72c75330\") " Nov 22 03:45:03 crc kubenswrapper[4922]: I1122 03:45:03.089141 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rckkz\" (UniqueName: \"kubernetes.io/projected/2b6ecc81-9e3b-4565-9e48-121e72c75330-kube-api-access-rckkz\") pod \"2b6ecc81-9e3b-4565-9e48-121e72c75330\" (UID: \"2b6ecc81-9e3b-4565-9e48-121e72c75330\") " Nov 22 03:45:03 crc kubenswrapper[4922]: I1122 03:45:03.089557 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b6ecc81-9e3b-4565-9e48-121e72c75330-config-volume" (OuterVolumeSpecName: "config-volume") pod "2b6ecc81-9e3b-4565-9e48-121e72c75330" (UID: "2b6ecc81-9e3b-4565-9e48-121e72c75330"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 03:45:03 crc kubenswrapper[4922]: I1122 03:45:03.089820 4922 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2b6ecc81-9e3b-4565-9e48-121e72c75330-config-volume\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:03 crc kubenswrapper[4922]: I1122 03:45:03.096807 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b6ecc81-9e3b-4565-9e48-121e72c75330-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2b6ecc81-9e3b-4565-9e48-121e72c75330" (UID: "2b6ecc81-9e3b-4565-9e48-121e72c75330"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 03:45:03 crc kubenswrapper[4922]: I1122 03:45:03.098274 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b6ecc81-9e3b-4565-9e48-121e72c75330-kube-api-access-rckkz" (OuterVolumeSpecName: "kube-api-access-rckkz") pod "2b6ecc81-9e3b-4565-9e48-121e72c75330" (UID: "2b6ecc81-9e3b-4565-9e48-121e72c75330"). InnerVolumeSpecName "kube-api-access-rckkz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:45:03 crc kubenswrapper[4922]: I1122 03:45:03.191494 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rckkz\" (UniqueName: \"kubernetes.io/projected/2b6ecc81-9e3b-4565-9e48-121e72c75330-kube-api-access-rckkz\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:03 crc kubenswrapper[4922]: I1122 03:45:03.191795 4922 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2b6ecc81-9e3b-4565-9e48-121e72c75330-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:03 crc kubenswrapper[4922]: I1122 03:45:03.695611 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396385-sllcz" event={"ID":"2b6ecc81-9e3b-4565-9e48-121e72c75330","Type":"ContainerDied","Data":"0fb43751b39df530c0330ba45c0a5b049345cc02adb7dfde0b7a9447065a6230"} Nov 22 03:45:03 crc kubenswrapper[4922]: I1122 03:45:03.696231 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0fb43751b39df530c0330ba45c0a5b049345cc02adb7dfde0b7a9447065a6230" Nov 22 03:45:03 crc kubenswrapper[4922]: I1122 03:45:03.695944 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396385-sllcz" Nov 22 03:45:04 crc kubenswrapper[4922]: I1122 03:45:04.071525 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396340-k9lhq"] Nov 22 03:45:04 crc kubenswrapper[4922]: I1122 03:45:04.083123 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396340-k9lhq"] Nov 22 03:45:05 crc kubenswrapper[4922]: I1122 03:45:05.350518 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c756ee80-4d4a-4877-8849-7f5cc170ecf9" path="/var/lib/kubelet/pods/c756ee80-4d4a-4877-8849-7f5cc170ecf9/volumes" Nov 22 03:45:12 crc kubenswrapper[4922]: I1122 03:45:12.302105 4922 scope.go:117] "RemoveContainer" containerID="3902a53e42446835819551f58e9b8e408324dfa52a1818359fa0616ad95080dc" Nov 22 03:45:12 crc kubenswrapper[4922]: E1122 03:45:12.302833 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:45:23 crc kubenswrapper[4922]: I1122 03:45:23.250498 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-m5pcd"] Nov 22 03:45:23 crc kubenswrapper[4922]: E1122 03:45:23.252185 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b6ecc81-9e3b-4565-9e48-121e72c75330" containerName="collect-profiles" Nov 22 03:45:23 crc kubenswrapper[4922]: I1122 03:45:23.252206 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b6ecc81-9e3b-4565-9e48-121e72c75330" containerName="collect-profiles" Nov 22 03:45:23 crc kubenswrapper[4922]: I1122 03:45:23.252527 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b6ecc81-9e3b-4565-9e48-121e72c75330" containerName="collect-profiles" Nov 22 03:45:23 crc kubenswrapper[4922]: I1122 03:45:23.254836 4922 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m5pcd" Nov 22 03:45:23 crc kubenswrapper[4922]: I1122 03:45:23.271917 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m5pcd"] Nov 22 03:45:23 crc kubenswrapper[4922]: I1122 03:45:23.377240 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6ef9572-baf9-41cb-9e96-6ca05bff93d9-catalog-content\") pod \"redhat-marketplace-m5pcd\" (UID: \"a6ef9572-baf9-41cb-9e96-6ca05bff93d9\") " pod="openshift-marketplace/redhat-marketplace-m5pcd" Nov 22 03:45:23 crc kubenswrapper[4922]: I1122 03:45:23.377507 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7qj8\" (UniqueName: \"kubernetes.io/projected/a6ef9572-baf9-41cb-9e96-6ca05bff93d9-kube-api-access-r7qj8\") pod \"redhat-marketplace-m5pcd\" (UID: \"a6ef9572-baf9-41cb-9e96-6ca05bff93d9\") " pod="openshift-marketplace/redhat-marketplace-m5pcd" Nov 22 03:45:23 crc kubenswrapper[4922]: I1122 03:45:23.377709 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6ef9572-baf9-41cb-9e96-6ca05bff93d9-utilities\") pod \"redhat-marketplace-m5pcd\" (UID: \"a6ef9572-baf9-41cb-9e96-6ca05bff93d9\") " pod="openshift-marketplace/redhat-marketplace-m5pcd" Nov 22 03:45:23 crc kubenswrapper[4922]: I1122 03:45:23.479945 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6ef9572-baf9-41cb-9e96-6ca05bff93d9-catalog-content\") pod \"redhat-marketplace-m5pcd\" (UID: \"a6ef9572-baf9-41cb-9e96-6ca05bff93d9\") " pod="openshift-marketplace/redhat-marketplace-m5pcd" Nov 22 03:45:23 crc kubenswrapper[4922]: I1122 03:45:23.480466 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6ef9572-baf9-41cb-9e96-6ca05bff93d9-catalog-content\") pod \"redhat-marketplace-m5pcd\" (UID: \"a6ef9572-baf9-41cb-9e96-6ca05bff93d9\") " pod="openshift-marketplace/redhat-marketplace-m5pcd" Nov 22 03:45:23 crc kubenswrapper[4922]: I1122 03:45:23.480594 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7qj8\" (UniqueName: \"kubernetes.io/projected/a6ef9572-baf9-41cb-9e96-6ca05bff93d9-kube-api-access-r7qj8\") pod \"redhat-marketplace-m5pcd\" (UID: \"a6ef9572-baf9-41cb-9e96-6ca05bff93d9\") " pod="openshift-marketplace/redhat-marketplace-m5pcd" Nov 22 03:45:23 crc kubenswrapper[4922]: I1122 03:45:23.480667 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6ef9572-baf9-41cb-9e96-6ca05bff93d9-utilities\") pod \"redhat-marketplace-m5pcd\" (UID: \"a6ef9572-baf9-41cb-9e96-6ca05bff93d9\") " pod="openshift-marketplace/redhat-marketplace-m5pcd" Nov 22 03:45:23 crc kubenswrapper[4922]: I1122 03:45:23.482689 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6ef9572-baf9-41cb-9e96-6ca05bff93d9-utilities\") pod \"redhat-marketplace-m5pcd\" (UID: \"a6ef9572-baf9-41cb-9e96-6ca05bff93d9\") " pod="openshift-marketplace/redhat-marketplace-m5pcd" Nov 22 03:45:23 crc kubenswrapper[4922]: I1122 03:45:23.508409 4922 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7qj8\" (UniqueName: \"kubernetes.io/projected/a6ef9572-baf9-41cb-9e96-6ca05bff93d9-kube-api-access-r7qj8\") pod \"redhat-marketplace-m5pcd\" (UID: \"a6ef9572-baf9-41cb-9e96-6ca05bff93d9\") " pod="openshift-marketplace/redhat-marketplace-m5pcd" Nov 22 03:45:23 crc kubenswrapper[4922]: I1122 03:45:23.612957 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m5pcd" Nov 22 03:45:24 crc kubenswrapper[4922]: I1122 03:45:24.092073 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m5pcd"] Nov 22 03:45:24 crc kubenswrapper[4922]: I1122 03:45:24.929252 4922 generic.go:334] "Generic (PLEG): container finished" podID="a6ef9572-baf9-41cb-9e96-6ca05bff93d9" containerID="13b5c7ee31b734092b943769f917eb7aa09e6e46e6f8e51b51461e73a916df7a" exitCode=0 Nov 22 03:45:24 crc kubenswrapper[4922]: I1122 03:45:24.929320 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m5pcd" event={"ID":"a6ef9572-baf9-41cb-9e96-6ca05bff93d9","Type":"ContainerDied","Data":"13b5c7ee31b734092b943769f917eb7aa09e6e46e6f8e51b51461e73a916df7a"} Nov 22 03:45:24 crc kubenswrapper[4922]: I1122 03:45:24.929517 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m5pcd" event={"ID":"a6ef9572-baf9-41cb-9e96-6ca05bff93d9","Type":"ContainerStarted","Data":"853aedc817c5af06f257aa4e13cecb08a1a54609dd8155611df13a166b955171"} Nov 22 03:45:25 crc kubenswrapper[4922]: I1122 03:45:25.313542 4922 scope.go:117] "RemoveContainer" containerID="3902a53e42446835819551f58e9b8e408324dfa52a1818359fa0616ad95080dc" Nov 22 03:45:25 crc kubenswrapper[4922]: E1122 03:45:25.314020 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:45:26 crc kubenswrapper[4922]: I1122 03:45:26.956538 4922 generic.go:334] "Generic (PLEG): container finished" podID="a6ef9572-baf9-41cb-9e96-6ca05bff93d9" containerID="0caf7a36713a312d73c6c2c6efe2ad64fa9f83b398f7afa39f029a2fb8d0c3ed" exitCode=0 Nov 22 03:45:26 crc kubenswrapper[4922]: I1122 03:45:26.956669 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m5pcd" event={"ID":"a6ef9572-baf9-41cb-9e96-6ca05bff93d9","Type":"ContainerDied","Data":"0caf7a36713a312d73c6c2c6efe2ad64fa9f83b398f7afa39f029a2fb8d0c3ed"} Nov 22 03:45:27 crc kubenswrapper[4922]: I1122 03:45:27.048950 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-l42l6"] Nov 22 03:45:27 crc kubenswrapper[4922]: I1122 03:45:27.051715 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l42l6" Nov 22 03:45:27 crc kubenswrapper[4922]: I1122 03:45:27.059959 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l42l6"] Nov 22 03:45:27 crc kubenswrapper[4922]: I1122 03:45:27.060092 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1df9d696-c4c0-4c45-88fa-65229fe7fd5c-utilities\") pod \"redhat-operators-l42l6\" (UID: \"1df9d696-c4c0-4c45-88fa-65229fe7fd5c\") " pod="openshift-marketplace/redhat-operators-l42l6" Nov 22 03:45:27 crc kubenswrapper[4922]: I1122 03:45:27.060204 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggn8m\" (UniqueName: \"kubernetes.io/projected/1df9d696-c4c0-4c45-88fa-65229fe7fd5c-kube-api-access-ggn8m\") pod \"redhat-operators-l42l6\" (UID: \"1df9d696-c4c0-4c45-88fa-65229fe7fd5c\") " pod="openshift-marketplace/redhat-operators-l42l6" Nov 22 03:45:27 crc kubenswrapper[4922]: I1122 03:45:27.060279 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1df9d696-c4c0-4c45-88fa-65229fe7fd5c-catalog-content\") pod \"redhat-operators-l42l6\" (UID: \"1df9d696-c4c0-4c45-88fa-65229fe7fd5c\") " pod="openshift-marketplace/redhat-operators-l42l6" Nov 22 03:45:27 crc kubenswrapper[4922]: I1122 03:45:27.163121 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1df9d696-c4c0-4c45-88fa-65229fe7fd5c-utilities\") pod \"redhat-operators-l42l6\" (UID: \"1df9d696-c4c0-4c45-88fa-65229fe7fd5c\") " pod="openshift-marketplace/redhat-operators-l42l6" Nov 22 03:45:27 crc kubenswrapper[4922]: I1122 03:45:27.163699 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggn8m\" (UniqueName: \"kubernetes.io/projected/1df9d696-c4c0-4c45-88fa-65229fe7fd5c-kube-api-access-ggn8m\") pod \"redhat-operators-l42l6\" (UID: \"1df9d696-c4c0-4c45-88fa-65229fe7fd5c\") " pod="openshift-marketplace/redhat-operators-l42l6" Nov 22 03:45:27 crc kubenswrapper[4922]: I1122 03:45:27.163828 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1df9d696-c4c0-4c45-88fa-65229fe7fd5c-catalog-content\") pod \"redhat-operators-l42l6\" (UID: \"1df9d696-c4c0-4c45-88fa-65229fe7fd5c\") " pod="openshift-marketplace/redhat-operators-l42l6" Nov 22 03:45:27 crc kubenswrapper[4922]: I1122 03:45:27.164011 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1df9d696-c4c0-4c45-88fa-65229fe7fd5c-utilities\") pod \"redhat-operators-l42l6\" (UID: \"1df9d696-c4c0-4c45-88fa-65229fe7fd5c\") " pod="openshift-marketplace/redhat-operators-l42l6" Nov 22 03:45:27 crc kubenswrapper[4922]: I1122 03:45:27.165372 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1df9d696-c4c0-4c45-88fa-65229fe7fd5c-catalog-content\") pod \"redhat-operators-l42l6\" (UID: \"1df9d696-c4c0-4c45-88fa-65229fe7fd5c\") " pod="openshift-marketplace/redhat-operators-l42l6" Nov 22 03:45:27 crc kubenswrapper[4922]: I1122 03:45:27.184050 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-ggn8m\" (UniqueName: \"kubernetes.io/projected/1df9d696-c4c0-4c45-88fa-65229fe7fd5c-kube-api-access-ggn8m\") pod \"redhat-operators-l42l6\" (UID: \"1df9d696-c4c0-4c45-88fa-65229fe7fd5c\") " pod="openshift-marketplace/redhat-operators-l42l6" Nov 22 03:45:27 crc kubenswrapper[4922]: I1122 03:45:27.385459 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l42l6" Nov 22 03:45:27 crc kubenswrapper[4922]: I1122 03:45:27.917415 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l42l6"] Nov 22 03:45:27 crc kubenswrapper[4922]: I1122 03:45:27.975746 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m5pcd" event={"ID":"a6ef9572-baf9-41cb-9e96-6ca05bff93d9","Type":"ContainerStarted","Data":"6a6c98c4f98b2e2dd901e937bc4bf30975f2d057f721d5570deced853e702ae2"} Nov 22 03:45:27 crc kubenswrapper[4922]: I1122 03:45:27.977586 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l42l6" event={"ID":"1df9d696-c4c0-4c45-88fa-65229fe7fd5c","Type":"ContainerStarted","Data":"6458cd3916fa762f284c34350d3c06cb3133ab194b91fcd9c8e1ba21b1331e6f"} Nov 22 03:45:28 crc kubenswrapper[4922]: I1122 03:45:28.005205 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-m5pcd" podStartSLOduration=2.517497764 podStartE2EDuration="5.005190336s" podCreationTimestamp="2025-11-22 03:45:23 +0000 UTC" firstStartedPulling="2025-11-22 03:45:24.933251574 +0000 UTC m=+3160.971773466" lastFinishedPulling="2025-11-22 03:45:27.420944146 +0000 UTC m=+3163.459466038" observedRunningTime="2025-11-22 03:45:28.003809143 +0000 UTC m=+3164.042331045" watchObservedRunningTime="2025-11-22 03:45:28.005190336 +0000 UTC m=+3164.043712228" Nov 22 03:45:29 crc kubenswrapper[4922]: I1122 03:45:29.000787 4922 generic.go:334] "Generic (PLEG): container finished" podID="1df9d696-c4c0-4c45-88fa-65229fe7fd5c" containerID="2851578a3234b1bc4fb8291e24855cf394ce2d60a25afead401020d3fcac4fe0" exitCode=0 Nov 22 03:45:29 crc kubenswrapper[4922]: I1122 03:45:29.001030 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l42l6" event={"ID":"1df9d696-c4c0-4c45-88fa-65229fe7fd5c","Type":"ContainerDied","Data":"2851578a3234b1bc4fb8291e24855cf394ce2d60a25afead401020d3fcac4fe0"} Nov 22 03:45:32 crc kubenswrapper[4922]: I1122 03:45:32.047664 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l42l6" event={"ID":"1df9d696-c4c0-4c45-88fa-65229fe7fd5c","Type":"ContainerStarted","Data":"6412f3f3ca8c8d8aecbfd1c1fa062423e06f90be2ac92723c17541c326fb6990"} Nov 22 03:45:33 crc kubenswrapper[4922]: I1122 03:45:33.608983 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Nov 22 03:45:33 crc kubenswrapper[4922]: I1122 03:45:33.611359 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Nov 22 03:45:33 crc kubenswrapper[4922]: I1122 03:45:33.613434 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-m5pcd" Nov 22 03:45:33 crc kubenswrapper[4922]: I1122 03:45:33.613545 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-m5pcd" Nov 22 03:45:33 crc kubenswrapper[4922]: I1122 03:45:33.616719 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Nov 22 03:45:33 crc kubenswrapper[4922]: I1122 03:45:33.617823 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Nov 22 03:45:33 crc kubenswrapper[4922]: I1122 03:45:33.618383 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-wdc5v" Nov 22 03:45:33 crc kubenswrapper[4922]: I1122 03:45:33.621712 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Nov 22 03:45:33 crc kubenswrapper[4922]: I1122 03:45:33.628413 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Nov 22 03:45:33 crc kubenswrapper[4922]: I1122 03:45:33.631901 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/65f03af2-87a1-4f4f-b09c-00fe2a3d4943-config-data\") pod \"tempest-tests-tempest\" (UID: \"65f03af2-87a1-4f4f-b09c-00fe2a3d4943\") " pod="openstack/tempest-tests-tempest" Nov 22 03:45:33 crc kubenswrapper[4922]: I1122 03:45:33.632472 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/65f03af2-87a1-4f4f-b09c-00fe2a3d4943-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"65f03af2-87a1-4f4f-b09c-00fe2a3d4943\") " pod="openstack/tempest-tests-tempest" Nov 22 03:45:33 crc kubenswrapper[4922]: I1122 03:45:33.632639 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/65f03af2-87a1-4f4f-b09c-00fe2a3d4943-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"65f03af2-87a1-4f4f-b09c-00fe2a3d4943\") " pod="openstack/tempest-tests-tempest" Nov 22 03:45:33 crc kubenswrapper[4922]: I1122 03:45:33.689611 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-m5pcd" Nov 22 03:45:33 crc kubenswrapper[4922]: I1122 03:45:33.734475 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/65f03af2-87a1-4f4f-b09c-00fe2a3d4943-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"65f03af2-87a1-4f4f-b09c-00fe2a3d4943\") " pod="openstack/tempest-tests-tempest" Nov 22 03:45:33 crc kubenswrapper[4922]: I1122 03:45:33.734580 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"tempest-tests-tempest\" (UID: \"65f03af2-87a1-4f4f-b09c-00fe2a3d4943\") " pod="openstack/tempest-tests-tempest" Nov 22 03:45:33 crc kubenswrapper[4922]: I1122 03:45:33.734631 4922 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/65f03af2-87a1-4f4f-b09c-00fe2a3d4943-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"65f03af2-87a1-4f4f-b09c-00fe2a3d4943\") " pod="openstack/tempest-tests-tempest" Nov 22 03:45:33 crc kubenswrapper[4922]: I1122 03:45:33.734795 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/65f03af2-87a1-4f4f-b09c-00fe2a3d4943-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"65f03af2-87a1-4f4f-b09c-00fe2a3d4943\") " pod="openstack/tempest-tests-tempest" Nov 22 03:45:33 crc kubenswrapper[4922]: I1122 03:45:33.734892 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/65f03af2-87a1-4f4f-b09c-00fe2a3d4943-config-data\") pod \"tempest-tests-tempest\" (UID: \"65f03af2-87a1-4f4f-b09c-00fe2a3d4943\") " pod="openstack/tempest-tests-tempest" Nov 22 03:45:33 crc kubenswrapper[4922]: I1122 03:45:33.734924 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9c7lc\" (UniqueName: \"kubernetes.io/projected/65f03af2-87a1-4f4f-b09c-00fe2a3d4943-kube-api-access-9c7lc\") pod \"tempest-tests-tempest\" (UID: \"65f03af2-87a1-4f4f-b09c-00fe2a3d4943\") " pod="openstack/tempest-tests-tempest" Nov 22 03:45:33 crc kubenswrapper[4922]: I1122 03:45:33.734998 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/65f03af2-87a1-4f4f-b09c-00fe2a3d4943-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"65f03af2-87a1-4f4f-b09c-00fe2a3d4943\") " pod="openstack/tempest-tests-tempest" Nov 22 03:45:33 crc kubenswrapper[4922]: I1122 03:45:33.735246 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/65f03af2-87a1-4f4f-b09c-00fe2a3d4943-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"65f03af2-87a1-4f4f-b09c-00fe2a3d4943\") " pod="openstack/tempest-tests-tempest" Nov 22 03:45:33 crc kubenswrapper[4922]: I1122 03:45:33.735284 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/65f03af2-87a1-4f4f-b09c-00fe2a3d4943-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"65f03af2-87a1-4f4f-b09c-00fe2a3d4943\") " pod="openstack/tempest-tests-tempest" Nov 22 03:45:33 crc kubenswrapper[4922]: I1122 03:45:33.737412 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/65f03af2-87a1-4f4f-b09c-00fe2a3d4943-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"65f03af2-87a1-4f4f-b09c-00fe2a3d4943\") " pod="openstack/tempest-tests-tempest" Nov 22 03:45:33 crc kubenswrapper[4922]: I1122 03:45:33.738829 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/65f03af2-87a1-4f4f-b09c-00fe2a3d4943-config-data\") pod \"tempest-tests-tempest\" (UID: \"65f03af2-87a1-4f4f-b09c-00fe2a3d4943\") " pod="openstack/tempest-tests-tempest" Nov 22 03:45:33 crc kubenswrapper[4922]: I1122 03:45:33.743701 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/65f03af2-87a1-4f4f-b09c-00fe2a3d4943-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"65f03af2-87a1-4f4f-b09c-00fe2a3d4943\") " pod="openstack/tempest-tests-tempest" Nov 22 03:45:33 crc kubenswrapper[4922]: I1122 03:45:33.838193 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9c7lc\" (UniqueName: \"kubernetes.io/projected/65f03af2-87a1-4f4f-b09c-00fe2a3d4943-kube-api-access-9c7lc\") pod \"tempest-tests-tempest\" (UID: \"65f03af2-87a1-4f4f-b09c-00fe2a3d4943\") " pod="openstack/tempest-tests-tempest" Nov 22 03:45:33 crc kubenswrapper[4922]: I1122 03:45:33.838253 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/65f03af2-87a1-4f4f-b09c-00fe2a3d4943-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"65f03af2-87a1-4f4f-b09c-00fe2a3d4943\") " pod="openstack/tempest-tests-tempest" Nov 22 03:45:33 crc kubenswrapper[4922]: I1122 03:45:33.838385 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/65f03af2-87a1-4f4f-b09c-00fe2a3d4943-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"65f03af2-87a1-4f4f-b09c-00fe2a3d4943\") " pod="openstack/tempest-tests-tempest" Nov 22 03:45:33 crc kubenswrapper[4922]: I1122 03:45:33.838486 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"tempest-tests-tempest\" (UID: \"65f03af2-87a1-4f4f-b09c-00fe2a3d4943\") " pod="openstack/tempest-tests-tempest" Nov 22 03:45:33 crc kubenswrapper[4922]: I1122 03:45:33.838518 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/65f03af2-87a1-4f4f-b09c-00fe2a3d4943-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"65f03af2-87a1-4f4f-b09c-00fe2a3d4943\") " pod="openstack/tempest-tests-tempest" Nov 22 03:45:33 crc kubenswrapper[4922]: I1122 03:45:33.838610 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/65f03af2-87a1-4f4f-b09c-00fe2a3d4943-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"65f03af2-87a1-4f4f-b09c-00fe2a3d4943\") " pod="openstack/tempest-tests-tempest" Nov 22 03:45:33 crc kubenswrapper[4922]: I1122 03:45:33.839959 4922 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"tempest-tests-tempest\" (UID: \"65f03af2-87a1-4f4f-b09c-00fe2a3d4943\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/tempest-tests-tempest" Nov 22 03:45:34 crc kubenswrapper[4922]: I1122 03:45:34.075819 4922 generic.go:334] "Generic (PLEG): container finished" podID="1df9d696-c4c0-4c45-88fa-65229fe7fd5c" containerID="6412f3f3ca8c8d8aecbfd1c1fa062423e06f90be2ac92723c17541c326fb6990" exitCode=0 Nov 22 03:45:34 crc kubenswrapper[4922]: I1122 03:45:34.076310 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l42l6" event={"ID":"1df9d696-c4c0-4c45-88fa-65229fe7fd5c","Type":"ContainerDied","Data":"6412f3f3ca8c8d8aecbfd1c1fa062423e06f90be2ac92723c17541c326fb6990"} Nov 22 03:45:34 crc kubenswrapper[4922]: I1122 
03:45:34.149950 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-m5pcd" Nov 22 03:45:34 crc kubenswrapper[4922]: I1122 03:45:34.295000 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/65f03af2-87a1-4f4f-b09c-00fe2a3d4943-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"65f03af2-87a1-4f4f-b09c-00fe2a3d4943\") " pod="openstack/tempest-tests-tempest" Nov 22 03:45:34 crc kubenswrapper[4922]: I1122 03:45:34.303648 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"tempest-tests-tempest\" (UID: \"65f03af2-87a1-4f4f-b09c-00fe2a3d4943\") " pod="openstack/tempest-tests-tempest" Nov 22 03:45:34 crc kubenswrapper[4922]: I1122 03:45:34.588180 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/65f03af2-87a1-4f4f-b09c-00fe2a3d4943-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"65f03af2-87a1-4f4f-b09c-00fe2a3d4943\") " pod="openstack/tempest-tests-tempest" Nov 22 03:45:34 crc kubenswrapper[4922]: I1122 03:45:34.589207 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/65f03af2-87a1-4f4f-b09c-00fe2a3d4943-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"65f03af2-87a1-4f4f-b09c-00fe2a3d4943\") " pod="openstack/tempest-tests-tempest" Nov 22 03:45:34 crc kubenswrapper[4922]: I1122 03:45:34.589650 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9c7lc\" (UniqueName: \"kubernetes.io/projected/65f03af2-87a1-4f4f-b09c-00fe2a3d4943-kube-api-access-9c7lc\") pod \"tempest-tests-tempest\" (UID: \"65f03af2-87a1-4f4f-b09c-00fe2a3d4943\") " pod="openstack/tempest-tests-tempest" Nov 22 03:45:34 crc kubenswrapper[4922]: I1122 03:45:34.594985 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/65f03af2-87a1-4f4f-b09c-00fe2a3d4943-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"65f03af2-87a1-4f4f-b09c-00fe2a3d4943\") " pod="openstack/tempest-tests-tempest" Nov 22 03:45:34 crc kubenswrapper[4922]: I1122 03:45:34.852375 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Nov 22 03:45:35 crc kubenswrapper[4922]: I1122 03:45:35.384973 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Nov 22 03:45:35 crc kubenswrapper[4922]: W1122 03:45:35.394408 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65f03af2_87a1_4f4f_b09c_00fe2a3d4943.slice/crio-0735a707b83760c5e430b1e6905d424c0256ab866269415101899b0204eeca1b WatchSource:0}: Error finding container 0735a707b83760c5e430b1e6905d424c0256ab866269415101899b0204eeca1b: Status 404 returned error can't find the container with id 0735a707b83760c5e430b1e6905d424c0256ab866269415101899b0204eeca1b Nov 22 03:45:36 crc kubenswrapper[4922]: I1122 03:45:36.043296 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m5pcd"] Nov 22 03:45:36 crc kubenswrapper[4922]: I1122 03:45:36.107659 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"65f03af2-87a1-4f4f-b09c-00fe2a3d4943","Type":"ContainerStarted","Data":"0735a707b83760c5e430b1e6905d424c0256ab866269415101899b0204eeca1b"} Nov 22 03:45:36 crc kubenswrapper[4922]: I1122 03:45:36.109294 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-m5pcd" podUID="a6ef9572-baf9-41cb-9e96-6ca05bff93d9" containerName="registry-server" containerID="cri-o://6a6c98c4f98b2e2dd901e937bc4bf30975f2d057f721d5570deced853e702ae2" gracePeriod=2 Nov 22 03:45:38 crc kubenswrapper[4922]: I1122 03:45:38.148546 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l42l6" event={"ID":"1df9d696-c4c0-4c45-88fa-65229fe7fd5c","Type":"ContainerStarted","Data":"2c6ae5dd084deb330428a60e739eb7c64448d192a2c88035b45e13727594b62c"} Nov 22 03:45:38 crc kubenswrapper[4922]: I1122 03:45:38.151451 4922 generic.go:334] "Generic (PLEG): container finished" podID="a6ef9572-baf9-41cb-9e96-6ca05bff93d9" containerID="6a6c98c4f98b2e2dd901e937bc4bf30975f2d057f721d5570deced853e702ae2" exitCode=0 Nov 22 03:45:38 crc kubenswrapper[4922]: I1122 03:45:38.151477 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m5pcd" event={"ID":"a6ef9572-baf9-41cb-9e96-6ca05bff93d9","Type":"ContainerDied","Data":"6a6c98c4f98b2e2dd901e937bc4bf30975f2d057f721d5570deced853e702ae2"} Nov 22 03:45:38 crc kubenswrapper[4922]: I1122 03:45:38.301214 4922 scope.go:117] "RemoveContainer" containerID="3902a53e42446835819551f58e9b8e408324dfa52a1818359fa0616ad95080dc" Nov 22 03:45:38 crc kubenswrapper[4922]: E1122 03:45:38.301701 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:45:38 crc kubenswrapper[4922]: I1122 03:45:38.713041 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m5pcd" Nov 22 03:45:38 crc kubenswrapper[4922]: I1122 03:45:38.859270 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6ef9572-baf9-41cb-9e96-6ca05bff93d9-utilities\") pod \"a6ef9572-baf9-41cb-9e96-6ca05bff93d9\" (UID: \"a6ef9572-baf9-41cb-9e96-6ca05bff93d9\") " Nov 22 03:45:38 crc kubenswrapper[4922]: I1122 03:45:38.860256 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6ef9572-baf9-41cb-9e96-6ca05bff93d9-utilities" (OuterVolumeSpecName: "utilities") pod "a6ef9572-baf9-41cb-9e96-6ca05bff93d9" (UID: "a6ef9572-baf9-41cb-9e96-6ca05bff93d9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:45:38 crc kubenswrapper[4922]: I1122 03:45:38.860509 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6ef9572-baf9-41cb-9e96-6ca05bff93d9-catalog-content\") pod \"a6ef9572-baf9-41cb-9e96-6ca05bff93d9\" (UID: \"a6ef9572-baf9-41cb-9e96-6ca05bff93d9\") " Nov 22 03:45:38 crc kubenswrapper[4922]: I1122 03:45:38.861038 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7qj8\" (UniqueName: \"kubernetes.io/projected/a6ef9572-baf9-41cb-9e96-6ca05bff93d9-kube-api-access-r7qj8\") pod \"a6ef9572-baf9-41cb-9e96-6ca05bff93d9\" (UID: \"a6ef9572-baf9-41cb-9e96-6ca05bff93d9\") " Nov 22 03:45:38 crc kubenswrapper[4922]: I1122 03:45:38.862150 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6ef9572-baf9-41cb-9e96-6ca05bff93d9-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:38 crc kubenswrapper[4922]: I1122 03:45:38.876117 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6ef9572-baf9-41cb-9e96-6ca05bff93d9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a6ef9572-baf9-41cb-9e96-6ca05bff93d9" (UID: "a6ef9572-baf9-41cb-9e96-6ca05bff93d9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:45:38 crc kubenswrapper[4922]: I1122 03:45:38.876654 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6ef9572-baf9-41cb-9e96-6ca05bff93d9-kube-api-access-r7qj8" (OuterVolumeSpecName: "kube-api-access-r7qj8") pod "a6ef9572-baf9-41cb-9e96-6ca05bff93d9" (UID: "a6ef9572-baf9-41cb-9e96-6ca05bff93d9"). InnerVolumeSpecName "kube-api-access-r7qj8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:45:38 crc kubenswrapper[4922]: I1122 03:45:38.964302 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7qj8\" (UniqueName: \"kubernetes.io/projected/a6ef9572-baf9-41cb-9e96-6ca05bff93d9-kube-api-access-r7qj8\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:38 crc kubenswrapper[4922]: I1122 03:45:38.964333 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6ef9572-baf9-41cb-9e96-6ca05bff93d9-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 03:45:39 crc kubenswrapper[4922]: I1122 03:45:39.174406 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m5pcd" Nov 22 03:45:39 crc kubenswrapper[4922]: I1122 03:45:39.174406 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m5pcd" event={"ID":"a6ef9572-baf9-41cb-9e96-6ca05bff93d9","Type":"ContainerDied","Data":"853aedc817c5af06f257aa4e13cecb08a1a54609dd8155611df13a166b955171"} Nov 22 03:45:39 crc kubenswrapper[4922]: I1122 03:45:39.174485 4922 scope.go:117] "RemoveContainer" containerID="6a6c98c4f98b2e2dd901e937bc4bf30975f2d057f721d5570deced853e702ae2" Nov 22 03:45:39 crc kubenswrapper[4922]: I1122 03:45:39.210773 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-l42l6" podStartSLOduration=3.39228666 podStartE2EDuration="12.210747544s" podCreationTimestamp="2025-11-22 03:45:27 +0000 UTC" firstStartedPulling="2025-11-22 03:45:29.003600062 +0000 UTC m=+3165.042121954" lastFinishedPulling="2025-11-22 03:45:37.822060946 +0000 UTC m=+3173.860582838" observedRunningTime="2025-11-22 03:45:39.193742626 +0000 UTC m=+3175.232264518" watchObservedRunningTime="2025-11-22 03:45:39.210747544 +0000 UTC m=+3175.249269436" Nov 22 03:45:39 crc kubenswrapper[4922]: I1122 03:45:39.221291 4922 scope.go:117] "RemoveContainer" containerID="0caf7a36713a312d73c6c2c6efe2ad64fa9f83b398f7afa39f029a2fb8d0c3ed" Nov 22 03:45:39 crc kubenswrapper[4922]: I1122 03:45:39.224179 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m5pcd"] Nov 22 03:45:39 crc kubenswrapper[4922]: I1122 03:45:39.235022 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-m5pcd"] Nov 22 03:45:39 crc kubenswrapper[4922]: I1122 03:45:39.250066 4922 scope.go:117] "RemoveContainer" containerID="13b5c7ee31b734092b943769f917eb7aa09e6e46e6f8e51b51461e73a916df7a" Nov 22 03:45:39 crc kubenswrapper[4922]: I1122 03:45:39.322925 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6ef9572-baf9-41cb-9e96-6ca05bff93d9" path="/var/lib/kubelet/pods/a6ef9572-baf9-41cb-9e96-6ca05bff93d9/volumes" Nov 22 03:45:47 crc kubenswrapper[4922]: I1122 03:45:47.386387 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-l42l6" Nov 22 03:45:47 crc kubenswrapper[4922]: I1122 03:45:47.387111 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-l42l6" Nov 22 03:45:48 crc kubenswrapper[4922]: I1122 03:45:48.431009 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l42l6" podUID="1df9d696-c4c0-4c45-88fa-65229fe7fd5c" containerName="registry-server" probeResult="failure" output=< Nov 22 03:45:48 crc kubenswrapper[4922]: timeout: failed to connect service ":50051" within 1s Nov 22 03:45:48 crc kubenswrapper[4922]: > Nov 22 03:45:49 crc kubenswrapper[4922]: I1122 03:45:49.300470 4922 scope.go:117] "RemoveContainer" containerID="3902a53e42446835819551f58e9b8e408324dfa52a1818359fa0616ad95080dc" Nov 22 03:45:58 crc kubenswrapper[4922]: I1122 03:45:58.432458 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l42l6" podUID="1df9d696-c4c0-4c45-88fa-65229fe7fd5c" containerName="registry-server" probeResult="failure" output=< Nov 22 03:45:58 crc kubenswrapper[4922]: timeout: failed to connect service ":50051" within 1s Nov 22 03:45:58 crc kubenswrapper[4922]: > Nov 22 
03:46:02 crc kubenswrapper[4922]: I1122 03:46:02.788991 4922 scope.go:117] "RemoveContainer" containerID="0f35149bf9822ee3e83f466c6f558c730a8bf202250ee0aa16a8401eccac4f43"
Nov 22 03:46:07 crc kubenswrapper[4922]: E1122 03:46:07.205173 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified"
Nov 22 03:46:07 crc kubenswrapper[4922]: E1122 03:46:07.205687 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9c7lc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(65f03af2-87a1-4f4f-b09c-00fe2a3d4943): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Nov 22 03:46:07 crc kubenswrapper[4922]: E1122 03:46:07.207083 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="65f03af2-87a1-4f4f-b09c-00fe2a3d4943"
Nov 22 03:46:07 crc kubenswrapper[4922]: E1122 03:46:07.523385 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="65f03af2-87a1-4f4f-b09c-00fe2a3d4943"
Nov 22 03:46:08 crc kubenswrapper[4922]: I1122 03:46:08.428738 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l42l6" podUID="1df9d696-c4c0-4c45-88fa-65229fe7fd5c" containerName="registry-server" probeResult="failure" output=<
Nov 22 03:46:08 crc kubenswrapper[4922]: timeout: failed to connect service ":50051" within 1s
Nov 22 03:46:08 crc kubenswrapper[4922]: >
Nov 22 03:46:11 crc kubenswrapper[4922]: I1122 03:46:11.566109 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" event={"ID":"402683b1-a29f-4a79-a36c-daf6e8068d0d","Type":"ContainerStarted","Data":"997b77b83cbbb54ae217d6919fdc8a3576683b7c25d5fcc3a2d2ad19d864e43a"}
Nov 22 03:46:18 crc kubenswrapper[4922]: I1122 03:46:18.434407 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l42l6" podUID="1df9d696-c4c0-4c45-88fa-65229fe7fd5c" containerName="registry-server" probeResult="failure" output=<
Nov 22 03:46:18 crc kubenswrapper[4922]: timeout: failed to connect service ":50051" within 1s
Nov 22 03:46:18 crc kubenswrapper[4922]: >
Nov 22 03:46:22 crc kubenswrapper[4922]: I1122 03:46:22.959698 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0"
Nov 22 03:46:24 crc kubenswrapper[4922]: I1122 03:46:24.708121 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"65f03af2-87a1-4f4f-b09c-00fe2a3d4943","Type":"ContainerStarted","Data":"61527639731a9b825273f57e13396f3ce5387671b2a8e33ae65427ba02f4840d"}
Nov 22 03:46:24 crc kubenswrapper[4922]: I1122 03:46:24.731250 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=5.869820203 podStartE2EDuration="52.731226613s" podCreationTimestamp="2025-11-22 03:45:32 +0000 UTC" firstStartedPulling="2025-11-22 03:45:36.095705988 +0000 UTC m=+3172.134227920" lastFinishedPulling="2025-11-22 03:46:22.957112428 +0000 UTC m=+3218.995634330" observedRunningTime="2025-11-22 03:46:24.729786058 +0000 UTC m=+3220.768307950" watchObservedRunningTime="2025-11-22 03:46:24.731226613 +0000 UTC m=+3220.769748525"
Nov 22 03:46:28 crc kubenswrapper[4922]: I1122 03:46:28.462796 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l42l6" podUID="1df9d696-c4c0-4c45-88fa-65229fe7fd5c" containerName="registry-server" probeResult="failure" output=<
Nov 22 03:46:28 crc kubenswrapper[4922]: timeout: failed to connect service ":50051" within 1s
Nov 22 03:46:28 crc kubenswrapper[4922]: >
Nov 22 03:46:38 crc kubenswrapper[4922]: I1122 03:46:38.447264 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l42l6" podUID="1df9d696-c4c0-4c45-88fa-65229fe7fd5c" containerName="registry-server" probeResult="failure" output=<
Nov 22 03:46:38 crc kubenswrapper[4922]: timeout: failed to connect service ":50051" within 1s
Nov 22 03:46:38 crc kubenswrapper[4922]: >
Nov 22 03:46:48 crc kubenswrapper[4922]: I1122 03:46:48.467238 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l42l6" podUID="1df9d696-c4c0-4c45-88fa-65229fe7fd5c" containerName="registry-server" probeResult="failure" output=<
Nov 22 03:46:48 crc kubenswrapper[4922]: timeout: failed to connect service ":50051" within 1s
Nov 22 03:46:48 crc kubenswrapper[4922]: >
Nov 22 03:46:58 crc kubenswrapper[4922]: I1122 03:46:58.445733 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l42l6" podUID="1df9d696-c4c0-4c45-88fa-65229fe7fd5c" containerName="registry-server" probeResult="failure" output=<
Nov 22 03:46:58 crc kubenswrapper[4922]: timeout: failed to connect service ":50051" within 1s
Nov 22 03:46:58 crc kubenswrapper[4922]: >
Nov 22 03:47:08 crc kubenswrapper[4922]: I1122 03:47:08.466507 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l42l6" podUID="1df9d696-c4c0-4c45-88fa-65229fe7fd5c" containerName="registry-server" probeResult="failure" output=<
Nov 22 03:47:08 crc kubenswrapper[4922]: timeout: failed to connect service ":50051" within 1s
Nov 22 03:47:08 crc kubenswrapper[4922]: >
Nov 22 03:47:18 crc kubenswrapper[4922]: I1122 03:47:18.430679 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l42l6" podUID="1df9d696-c4c0-4c45-88fa-65229fe7fd5c" containerName="registry-server" probeResult="failure" output=<
Nov 22 03:47:18 crc kubenswrapper[4922]: timeout: failed to connect service ":50051" within 1s
Nov 22 03:47:18 crc kubenswrapper[4922]: >
Nov 22 03:47:18 crc kubenswrapper[4922]: I1122 03:47:18.431346 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-l42l6"
Nov 22 03:47:18 crc kubenswrapper[4922]: I1122 03:47:18.432236 4922 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="registry-server" containerStatusID={"Type":"cri-o","ID":"2c6ae5dd084deb330428a60e739eb7c64448d192a2c88035b45e13727594b62c"} pod="openshift-marketplace/redhat-operators-l42l6" containerMessage="Container registry-server failed startup probe, will be restarted"
Nov 22 03:47:18 crc kubenswrapper[4922]: I1122 03:47:18.432268 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-l42l6" podUID="1df9d696-c4c0-4c45-88fa-65229fe7fd5c" containerName="registry-server" containerID="cri-o://2c6ae5dd084deb330428a60e739eb7c64448d192a2c88035b45e13727594b62c" gracePeriod=30
Nov 22 03:47:22 crc kubenswrapper[4922]: I1122 03:47:22.357923 4922 generic.go:334] "Generic (PLEG): container finished" podID="1df9d696-c4c0-4c45-88fa-65229fe7fd5c" containerID="2c6ae5dd084deb330428a60e739eb7c64448d192a2c88035b45e13727594b62c" exitCode=0
Nov 22 03:47:22 crc kubenswrapper[4922]: I1122 03:47:22.357995 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l42l6" event={"ID":"1df9d696-c4c0-4c45-88fa-65229fe7fd5c","Type":"ContainerDied","Data":"2c6ae5dd084deb330428a60e739eb7c64448d192a2c88035b45e13727594b62c"}
Nov 22 03:47:23 crc kubenswrapper[4922]: I1122 03:47:23.369753 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l42l6" event={"ID":"1df9d696-c4c0-4c45-88fa-65229fe7fd5c","Type":"ContainerStarted","Data":"de59976fa149367a9ca4bc50e4d879d16e1bc984b71a80accd855ef908745a75"}
Nov 22 03:47:27 crc kubenswrapper[4922]: I1122 03:47:27.385795 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-l42l6"
Nov 22 03:47:27 crc kubenswrapper[4922]: I1122 03:47:27.386824 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-l42l6"
Nov 22 03:47:28 crc kubenswrapper[4922]: I1122 03:47:28.446353 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l42l6" podUID="1df9d696-c4c0-4c45-88fa-65229fe7fd5c" containerName="registry-server" probeResult="failure" output=<
Nov 22 03:47:28 crc kubenswrapper[4922]: timeout: failed to connect service ":50051" within 1s
Nov 22 03:47:28 crc kubenswrapper[4922]: >
Nov 22 03:47:37 crc kubenswrapper[4922]: I1122 03:47:37.437207 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-l42l6"
Nov 22 03:47:37 crc kubenswrapper[4922]: I1122 03:47:37.509587 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-l42l6"
Nov 22 03:47:37 crc kubenswrapper[4922]: I1122 03:47:37.684334 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l42l6"]
Nov 22 03:47:38 crc kubenswrapper[4922]: I1122 03:47:38.553340 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-l42l6" podUID="1df9d696-c4c0-4c45-88fa-65229fe7fd5c" containerName="registry-server" containerID="cri-o://de59976fa149367a9ca4bc50e4d879d16e1bc984b71a80accd855ef908745a75" gracePeriod=2
Nov 22 03:47:39 crc kubenswrapper[4922]: I1122 03:47:39.136657 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l42l6"
Nov 22 03:47:39 crc kubenswrapper[4922]: I1122 03:47:39.237161 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggn8m\" (UniqueName: \"kubernetes.io/projected/1df9d696-c4c0-4c45-88fa-65229fe7fd5c-kube-api-access-ggn8m\") pod \"1df9d696-c4c0-4c45-88fa-65229fe7fd5c\" (UID: \"1df9d696-c4c0-4c45-88fa-65229fe7fd5c\") "
Nov 22 03:47:39 crc kubenswrapper[4922]: I1122 03:47:39.237272 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1df9d696-c4c0-4c45-88fa-65229fe7fd5c-utilities\") pod \"1df9d696-c4c0-4c45-88fa-65229fe7fd5c\" (UID: \"1df9d696-c4c0-4c45-88fa-65229fe7fd5c\") "
Nov 22 03:47:39 crc kubenswrapper[4922]: I1122 03:47:39.237375 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1df9d696-c4c0-4c45-88fa-65229fe7fd5c-catalog-content\") pod \"1df9d696-c4c0-4c45-88fa-65229fe7fd5c\" (UID: \"1df9d696-c4c0-4c45-88fa-65229fe7fd5c\") "
Nov 22 03:47:39 crc kubenswrapper[4922]: I1122 03:47:39.238446 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1df9d696-c4c0-4c45-88fa-65229fe7fd5c-utilities" (OuterVolumeSpecName: "utilities") pod "1df9d696-c4c0-4c45-88fa-65229fe7fd5c" (UID: "1df9d696-c4c0-4c45-88fa-65229fe7fd5c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 03:47:39 crc kubenswrapper[4922]: I1122 03:47:39.251772 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1df9d696-c4c0-4c45-88fa-65229fe7fd5c-kube-api-access-ggn8m" (OuterVolumeSpecName: "kube-api-access-ggn8m") pod "1df9d696-c4c0-4c45-88fa-65229fe7fd5c" (UID: "1df9d696-c4c0-4c45-88fa-65229fe7fd5c"). InnerVolumeSpecName "kube-api-access-ggn8m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 03:47:39 crc kubenswrapper[4922]: I1122 03:47:39.340395 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggn8m\" (UniqueName: \"kubernetes.io/projected/1df9d696-c4c0-4c45-88fa-65229fe7fd5c-kube-api-access-ggn8m\") on node \"crc\" DevicePath \"\""
Nov 22 03:47:39 crc kubenswrapper[4922]: I1122 03:47:39.340735 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1df9d696-c4c0-4c45-88fa-65229fe7fd5c-utilities\") on node \"crc\" DevicePath \"\""
Nov 22 03:47:39 crc kubenswrapper[4922]: I1122 03:47:39.350804 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1df9d696-c4c0-4c45-88fa-65229fe7fd5c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1df9d696-c4c0-4c45-88fa-65229fe7fd5c" (UID: "1df9d696-c4c0-4c45-88fa-65229fe7fd5c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 03:47:39 crc kubenswrapper[4922]: I1122 03:47:39.442160 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1df9d696-c4c0-4c45-88fa-65229fe7fd5c-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 22 03:47:39 crc kubenswrapper[4922]: I1122 03:47:39.567315 4922 generic.go:334] "Generic (PLEG): container finished" podID="1df9d696-c4c0-4c45-88fa-65229fe7fd5c" containerID="de59976fa149367a9ca4bc50e4d879d16e1bc984b71a80accd855ef908745a75" exitCode=0
Nov 22 03:47:39 crc kubenswrapper[4922]: I1122 03:47:39.567382 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l42l6" event={"ID":"1df9d696-c4c0-4c45-88fa-65229fe7fd5c","Type":"ContainerDied","Data":"de59976fa149367a9ca4bc50e4d879d16e1bc984b71a80accd855ef908745a75"}
Nov 22 03:47:39 crc kubenswrapper[4922]: I1122 03:47:39.567428 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l42l6"
Nov 22 03:47:39 crc kubenswrapper[4922]: I1122 03:47:39.567513 4922 scope.go:117] "RemoveContainer" containerID="de59976fa149367a9ca4bc50e4d879d16e1bc984b71a80accd855ef908745a75"
Nov 22 03:47:39 crc kubenswrapper[4922]: I1122 03:47:39.567491 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l42l6" event={"ID":"1df9d696-c4c0-4c45-88fa-65229fe7fd5c","Type":"ContainerDied","Data":"6458cd3916fa762f284c34350d3c06cb3133ab194b91fcd9c8e1ba21b1331e6f"}
Nov 22 03:47:39 crc kubenswrapper[4922]: I1122 03:47:39.601035 4922 scope.go:117] "RemoveContainer" containerID="2c6ae5dd084deb330428a60e739eb7c64448d192a2c88035b45e13727594b62c"
Nov 22 03:47:39 crc kubenswrapper[4922]: I1122 03:47:39.608180 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l42l6"]
Nov 22 03:47:39 crc kubenswrapper[4922]: I1122 03:47:39.620440 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-l42l6"]
Nov 22 03:47:39 crc kubenswrapper[4922]: I1122 03:47:39.624728 4922 scope.go:117] "RemoveContainer" containerID="6412f3f3ca8c8d8aecbfd1c1fa062423e06f90be2ac92723c17541c326fb6990"
Nov 22 03:47:39 crc kubenswrapper[4922]: I1122 03:47:39.648806 4922 scope.go:117] "RemoveContainer" containerID="2851578a3234b1bc4fb8291e24855cf394ce2d60a25afead401020d3fcac4fe0"
Nov 22 03:47:39 crc kubenswrapper[4922]: I1122 03:47:39.695609 4922 scope.go:117] "RemoveContainer" containerID="de59976fa149367a9ca4bc50e4d879d16e1bc984b71a80accd855ef908745a75"
Nov 22 03:47:39 crc kubenswrapper[4922]: E1122 03:47:39.696130 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de59976fa149367a9ca4bc50e4d879d16e1bc984b71a80accd855ef908745a75\": container with ID starting with de59976fa149367a9ca4bc50e4d879d16e1bc984b71a80accd855ef908745a75 not found: ID does not exist" containerID="de59976fa149367a9ca4bc50e4d879d16e1bc984b71a80accd855ef908745a75"
Nov 22 03:47:39 crc kubenswrapper[4922]: I1122 03:47:39.696175 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de59976fa149367a9ca4bc50e4d879d16e1bc984b71a80accd855ef908745a75"} err="failed to get container status \"de59976fa149367a9ca4bc50e4d879d16e1bc984b71a80accd855ef908745a75\": rpc error: code = NotFound desc = could not find container \"de59976fa149367a9ca4bc50e4d879d16e1bc984b71a80accd855ef908745a75\": container with ID starting with de59976fa149367a9ca4bc50e4d879d16e1bc984b71a80accd855ef908745a75 not found: ID does not exist"
Nov 22 03:47:39 crc kubenswrapper[4922]: I1122 03:47:39.696211 4922 scope.go:117] "RemoveContainer" containerID="2c6ae5dd084deb330428a60e739eb7c64448d192a2c88035b45e13727594b62c"
Nov 22 03:47:39 crc kubenswrapper[4922]: E1122 03:47:39.696741 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c6ae5dd084deb330428a60e739eb7c64448d192a2c88035b45e13727594b62c\": container with ID starting with 2c6ae5dd084deb330428a60e739eb7c64448d192a2c88035b45e13727594b62c not found: ID does not exist" containerID="2c6ae5dd084deb330428a60e739eb7c64448d192a2c88035b45e13727594b62c"
Nov 22 03:47:39 crc kubenswrapper[4922]: I1122 03:47:39.696774 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c6ae5dd084deb330428a60e739eb7c64448d192a2c88035b45e13727594b62c"} err="failed to get container status \"2c6ae5dd084deb330428a60e739eb7c64448d192a2c88035b45e13727594b62c\": rpc error: code = NotFound desc = could not find container \"2c6ae5dd084deb330428a60e739eb7c64448d192a2c88035b45e13727594b62c\": container with ID starting with 2c6ae5dd084deb330428a60e739eb7c64448d192a2c88035b45e13727594b62c not found: ID does not exist"
Nov 22 03:47:39 crc kubenswrapper[4922]: I1122 03:47:39.696797 4922 scope.go:117] "RemoveContainer" containerID="6412f3f3ca8c8d8aecbfd1c1fa062423e06f90be2ac92723c17541c326fb6990"
Nov 22 03:47:39 crc kubenswrapper[4922]: E1122 03:47:39.697093 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6412f3f3ca8c8d8aecbfd1c1fa062423e06f90be2ac92723c17541c326fb6990\": container with ID starting with 6412f3f3ca8c8d8aecbfd1c1fa062423e06f90be2ac92723c17541c326fb6990 not found: ID does not exist" containerID="6412f3f3ca8c8d8aecbfd1c1fa062423e06f90be2ac92723c17541c326fb6990"
Nov 22 03:47:39 crc kubenswrapper[4922]: I1122 03:47:39.697159 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6412f3f3ca8c8d8aecbfd1c1fa062423e06f90be2ac92723c17541c326fb6990"} err="failed to get container status \"6412f3f3ca8c8d8aecbfd1c1fa062423e06f90be2ac92723c17541c326fb6990\": rpc error: code = NotFound desc = could not find container \"6412f3f3ca8c8d8aecbfd1c1fa062423e06f90be2ac92723c17541c326fb6990\": container with ID starting with 6412f3f3ca8c8d8aecbfd1c1fa062423e06f90be2ac92723c17541c326fb6990 not found: ID does not exist"
Nov 22 03:47:39 crc kubenswrapper[4922]: I1122 03:47:39.697200 4922 scope.go:117] "RemoveContainer" containerID="2851578a3234b1bc4fb8291e24855cf394ce2d60a25afead401020d3fcac4fe0"
Nov 22 03:47:39 crc kubenswrapper[4922]: E1122 03:47:39.698083 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2851578a3234b1bc4fb8291e24855cf394ce2d60a25afead401020d3fcac4fe0\": container with ID starting with 2851578a3234b1bc4fb8291e24855cf394ce2d60a25afead401020d3fcac4fe0 not found: ID does not exist" containerID="2851578a3234b1bc4fb8291e24855cf394ce2d60a25afead401020d3fcac4fe0"
Nov 22 03:47:39 crc kubenswrapper[4922]: I1122 03:47:39.698121 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2851578a3234b1bc4fb8291e24855cf394ce2d60a25afead401020d3fcac4fe0"} err="failed to get container status \"2851578a3234b1bc4fb8291e24855cf394ce2d60a25afead401020d3fcac4fe0\": rpc error: code = NotFound desc = could not find container \"2851578a3234b1bc4fb8291e24855cf394ce2d60a25afead401020d3fcac4fe0\": container with ID starting with 2851578a3234b1bc4fb8291e24855cf394ce2d60a25afead401020d3fcac4fe0 not found: ID does not exist"
Nov 22 03:47:41 crc kubenswrapper[4922]: I1122 03:47:41.320927 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1df9d696-c4c0-4c45-88fa-65229fe7fd5c" path="/var/lib/kubelet/pods/1df9d696-c4c0-4c45-88fa-65229fe7fd5c/volumes"
Nov 22 03:48:07 crc kubenswrapper[4922]: I1122 03:48:07.315764 4922 scope.go:117] "RemoveContainer" containerID="d17a013062b65b71ca65905caeabdf9630b11f66837abd7ae7aa2f7ff4991f57"
Nov 22 03:48:07 crc kubenswrapper[4922]: I1122 03:48:07.362163 4922 scope.go:117] "RemoveContainer" containerID="e4172101e6df99d08c4768b3c771214ed1362d1a4e040d9a25cc6d5a6358e886"
Nov 22 03:48:07 crc kubenswrapper[4922]: I1122 03:48:07.424068 4922 scope.go:117] "RemoveContainer" containerID="b58ae2adeec2759ff9ac634d556f92e74c00074535cfac8e428037131b2fe970"
Nov 22 03:48:11 crc kubenswrapper[4922]: I1122 03:48:11.109538 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 03:48:11 crc kubenswrapper[4922]: I1122 03:48:11.110256 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 03:48:41 crc kubenswrapper[4922]: I1122 03:48:41.110141 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 03:48:41 crc kubenswrapper[4922]: I1122 03:48:41.110638 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 03:49:11 crc kubenswrapper[4922]: I1122 03:49:11.109457 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 03:49:11 crc kubenswrapper[4922]: I1122 03:49:11.110214 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 03:49:11 crc kubenswrapper[4922]: I1122 03:49:11.110280 4922 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n"
Nov 22 03:49:11 crc kubenswrapper[4922]: I1122 03:49:11.111261 4922 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"997b77b83cbbb54ae217d6919fdc8a3576683b7c25d5fcc3a2d2ad19d864e43a"} pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 22 03:49:11 crc kubenswrapper[4922]: I1122 03:49:11.111351 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" containerID="cri-o://997b77b83cbbb54ae217d6919fdc8a3576683b7c25d5fcc3a2d2ad19d864e43a" gracePeriod=600
Nov 22 03:49:11 crc kubenswrapper[4922]: I1122 03:49:11.885937 4922 generic.go:334] "Generic (PLEG): container finished" podID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerID="997b77b83cbbb54ae217d6919fdc8a3576683b7c25d5fcc3a2d2ad19d864e43a" exitCode=0
Nov 22 03:49:11 crc kubenswrapper[4922]: I1122 03:49:11.886047 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" event={"ID":"402683b1-a29f-4a79-a36c-daf6e8068d0d","Type":"ContainerDied","Data":"997b77b83cbbb54ae217d6919fdc8a3576683b7c25d5fcc3a2d2ad19d864e43a"}
Nov 22 03:49:11 crc kubenswrapper[4922]: I1122 03:49:11.886881 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" event={"ID":"402683b1-a29f-4a79-a36c-daf6e8068d0d","Type":"ContainerStarted","Data":"e24886a4c30052c6f5e85a3aef80b4e9d1b9b35cdb2711ab03550c54058c1855"}
Nov 22 03:49:11 crc kubenswrapper[4922]: I1122 03:49:11.886923 4922 scope.go:117] "RemoveContainer" containerID="3902a53e42446835819551f58e9b8e408324dfa52a1818359fa0616ad95080dc"
Nov 22 03:51:11 crc kubenswrapper[4922]: I1122 03:51:11.110364 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 03:51:11 crc kubenswrapper[4922]: I1122 03:51:11.111077 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 03:51:41 crc kubenswrapper[4922]: I1122 03:51:41.109132 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 03:51:41 crc kubenswrapper[4922]: I1122 03:51:41.109765 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 03:52:11 crc kubenswrapper[4922]: I1122 03:52:11.109113 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 03:52:11 crc kubenswrapper[4922]: I1122 03:52:11.109740 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 03:52:11 crc kubenswrapper[4922]: I1122 03:52:11.109801 4922 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n"
Nov 22 03:52:11 crc kubenswrapper[4922]: I1122 03:52:11.110805 4922 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e24886a4c30052c6f5e85a3aef80b4e9d1b9b35cdb2711ab03550c54058c1855"} pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 22 03:52:11 crc kubenswrapper[4922]: I1122 03:52:11.110932 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" containerID="cri-o://e24886a4c30052c6f5e85a3aef80b4e9d1b9b35cdb2711ab03550c54058c1855" gracePeriod=600
Nov 22 03:52:11 crc kubenswrapper[4922]: E1122 03:52:11.769558 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d"
Nov 22 03:52:11 crc kubenswrapper[4922]: I1122 03:52:11.784235 4922 generic.go:334] "Generic (PLEG): container finished" podID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerID="e24886a4c30052c6f5e85a3aef80b4e9d1b9b35cdb2711ab03550c54058c1855" exitCode=0
Nov 22 03:52:11 crc kubenswrapper[4922]: I1122 03:52:11.784277 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" event={"ID":"402683b1-a29f-4a79-a36c-daf6e8068d0d","Type":"ContainerDied","Data":"e24886a4c30052c6f5e85a3aef80b4e9d1b9b35cdb2711ab03550c54058c1855"}
Nov 22 03:52:11 crc kubenswrapper[4922]: I1122 03:52:11.784314 4922 scope.go:117] "RemoveContainer" containerID="997b77b83cbbb54ae217d6919fdc8a3576683b7c25d5fcc3a2d2ad19d864e43a"
Nov 22 03:52:11 crc kubenswrapper[4922]: I1122 03:52:11.785335 4922 scope.go:117] "RemoveContainer" containerID="e24886a4c30052c6f5e85a3aef80b4e9d1b9b35cdb2711ab03550c54058c1855"
Nov 22 03:52:11 crc kubenswrapper[4922]: E1122 03:52:11.786667 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d"
Nov 22 03:52:22 crc kubenswrapper[4922]: I1122 03:52:22.301752 4922 scope.go:117] "RemoveContainer" containerID="e24886a4c30052c6f5e85a3aef80b4e9d1b9b35cdb2711ab03550c54058c1855"
Nov 22 03:52:22 crc kubenswrapper[4922]: E1122 03:52:22.302916 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d"
Nov 22 03:52:36 crc kubenswrapper[4922]: I1122 03:52:36.301608 4922 scope.go:117] "RemoveContainer" containerID="e24886a4c30052c6f5e85a3aef80b4e9d1b9b35cdb2711ab03550c54058c1855"
Nov 22 03:52:36 crc kubenswrapper[4922]: E1122 03:52:36.302301 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d"
Nov 22 03:52:49 crc kubenswrapper[4922]: I1122 03:52:49.971290 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nbrkc"]
Nov 22 03:52:49 crc kubenswrapper[4922]: E1122 03:52:49.973246 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1df9d696-c4c0-4c45-88fa-65229fe7fd5c" containerName="extract-content"
Nov 22 03:52:49 crc kubenswrapper[4922]: I1122 03:52:49.973267 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="1df9d696-c4c0-4c45-88fa-65229fe7fd5c" containerName="extract-content"
Nov 22 03:52:49 crc kubenswrapper[4922]: E1122 03:52:49.973290 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1df9d696-c4c0-4c45-88fa-65229fe7fd5c" containerName="extract-utilities"
Nov 22 03:52:49 crc kubenswrapper[4922]: I1122 03:52:49.973298 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="1df9d696-c4c0-4c45-88fa-65229fe7fd5c" containerName="extract-utilities"
Nov 22 03:52:49 crc kubenswrapper[4922]: E1122 03:52:49.973306 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1df9d696-c4c0-4c45-88fa-65229fe7fd5c" containerName="registry-server"
Nov 22 03:52:49 crc kubenswrapper[4922]: I1122 03:52:49.973314 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="1df9d696-c4c0-4c45-88fa-65229fe7fd5c" containerName="registry-server"
Nov 22 03:52:49 crc kubenswrapper[4922]: E1122 03:52:49.973335 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1df9d696-c4c0-4c45-88fa-65229fe7fd5c" containerName="registry-server"
Nov 22 03:52:49 crc kubenswrapper[4922]: I1122 03:52:49.973343 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="1df9d696-c4c0-4c45-88fa-65229fe7fd5c" containerName="registry-server"
Nov 22 03:52:49 crc kubenswrapper[4922]: E1122 03:52:49.973364 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6ef9572-baf9-41cb-9e96-6ca05bff93d9" containerName="registry-server"
Nov 22 03:52:49 crc kubenswrapper[4922]: I1122 03:52:49.973371 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6ef9572-baf9-41cb-9e96-6ca05bff93d9" containerName="registry-server"
Nov 22 03:52:49 crc kubenswrapper[4922]: E1122 03:52:49.973380 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6ef9572-baf9-41cb-9e96-6ca05bff93d9" containerName="extract-content"
Nov 22 03:52:49 crc kubenswrapper[4922]: I1122 03:52:49.973388 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6ef9572-baf9-41cb-9e96-6ca05bff93d9" containerName="extract-content"
Nov 22 03:52:49 crc kubenswrapper[4922]: E1122 03:52:49.973409 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6ef9572-baf9-41cb-9e96-6ca05bff93d9" containerName="extract-utilities"
Nov 22 03:52:49 crc kubenswrapper[4922]: I1122 03:52:49.973416 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6ef9572-baf9-41cb-9e96-6ca05bff93d9" containerName="extract-utilities"
Nov 22 03:52:49 crc kubenswrapper[4922]: I1122 03:52:49.973639 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="1df9d696-c4c0-4c45-88fa-65229fe7fd5c" containerName="registry-server"
Nov 22 03:52:49 crc kubenswrapper[4922]: I1122 03:52:49.973657 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="1df9d696-c4c0-4c45-88fa-65229fe7fd5c" containerName="registry-server"
Nov 22 03:52:49 crc kubenswrapper[4922]: I1122 03:52:49.973671 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6ef9572-baf9-41cb-9e96-6ca05bff93d9" containerName="registry-server"
Nov 22 03:52:49 crc kubenswrapper[4922]: I1122 03:52:49.975380 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nbrkc"
Nov 22 03:52:49 crc kubenswrapper[4922]: I1122 03:52:49.987933 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nbrkc"]
Nov 22 03:52:50 crc kubenswrapper[4922]: I1122 03:52:50.012379 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/059bc9c2-5ddd-4511-a74c-234277bb3b63-catalog-content\") pod \"community-operators-nbrkc\" (UID: \"059bc9c2-5ddd-4511-a74c-234277bb3b63\") " pod="openshift-marketplace/community-operators-nbrkc"
Nov 22 03:52:50 crc kubenswrapper[4922]: I1122 03:52:50.012563 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zxgs\" (UniqueName: \"kubernetes.io/projected/059bc9c2-5ddd-4511-a74c-234277bb3b63-kube-api-access-7zxgs\") pod \"community-operators-nbrkc\" (UID: \"059bc9c2-5ddd-4511-a74c-234277bb3b63\") " pod="openshift-marketplace/community-operators-nbrkc"
Nov 22 03:52:50 crc kubenswrapper[4922]: I1122 03:52:50.012672 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/059bc9c2-5ddd-4511-a74c-234277bb3b63-utilities\") pod \"community-operators-nbrkc\" (UID: \"059bc9c2-5ddd-4511-a74c-234277bb3b63\") " pod="openshift-marketplace/community-operators-nbrkc"
Nov 22 03:52:50 crc kubenswrapper[4922]: I1122 03:52:50.114975 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/059bc9c2-5ddd-4511-a74c-234277bb3b63-utilities\") pod \"community-operators-nbrkc\" (UID: \"059bc9c2-5ddd-4511-a74c-234277bb3b63\") " pod="openshift-marketplace/community-operators-nbrkc"
Nov 22 03:52:50 crc kubenswrapper[4922]: I1122 03:52:50.115338 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/059bc9c2-5ddd-4511-a74c-234277bb3b63-catalog-content\") pod \"community-operators-nbrkc\" (UID: \"059bc9c2-5ddd-4511-a74c-234277bb3b63\") " pod="openshift-marketplace/community-operators-nbrkc"
Nov 22 03:52:50 crc kubenswrapper[4922]: I1122 03:52:50.115398 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zxgs\" (UniqueName: \"kubernetes.io/projected/059bc9c2-5ddd-4511-a74c-234277bb3b63-kube-api-access-7zxgs\") pod \"community-operators-nbrkc\" (UID: \"059bc9c2-5ddd-4511-a74c-234277bb3b63\") " pod="openshift-marketplace/community-operators-nbrkc"
Nov 22 03:52:50 crc kubenswrapper[4922]: I1122 03:52:50.115570 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/059bc9c2-5ddd-4511-a74c-234277bb3b63-utilities\") pod \"community-operators-nbrkc\" (UID: \"059bc9c2-5ddd-4511-a74c-234277bb3b63\") " pod="openshift-marketplace/community-operators-nbrkc"
Nov 22 03:52:50 crc kubenswrapper[4922]: I1122 03:52:50.115641 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/059bc9c2-5ddd-4511-a74c-234277bb3b63-catalog-content\") pod \"community-operators-nbrkc\" (UID: \"059bc9c2-5ddd-4511-a74c-234277bb3b63\") " pod="openshift-marketplace/community-operators-nbrkc"
Nov 22 03:52:50 crc kubenswrapper[4922]: I1122 03:52:50.134797 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zxgs\" (UniqueName: \"kubernetes.io/projected/059bc9c2-5ddd-4511-a74c-234277bb3b63-kube-api-access-7zxgs\") pod \"community-operators-nbrkc\" (UID: \"059bc9c2-5ddd-4511-a74c-234277bb3b63\") " pod="openshift-marketplace/community-operators-nbrkc"
Nov 22 03:52:50 crc kubenswrapper[4922]: I1122 03:52:50.296113 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nbrkc"
Nov 22 03:52:50 crc kubenswrapper[4922]: I1122 03:52:50.894115 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nbrkc"]
Nov 22 03:52:51 crc kubenswrapper[4922]: I1122 03:52:51.180314 4922 generic.go:334] "Generic (PLEG): container finished" podID="059bc9c2-5ddd-4511-a74c-234277bb3b63" containerID="6c77b35d992487c08a5112a351dc9936b5630592cae557d6bb29b5c2c3304084" exitCode=0
Nov 22 03:52:51 crc kubenswrapper[4922]: I1122 03:52:51.180602 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nbrkc" event={"ID":"059bc9c2-5ddd-4511-a74c-234277bb3b63","Type":"ContainerDied","Data":"6c77b35d992487c08a5112a351dc9936b5630592cae557d6bb29b5c2c3304084"}
Nov 22 03:52:51 crc kubenswrapper[4922]: I1122 03:52:51.180627 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nbrkc" event={"ID":"059bc9c2-5ddd-4511-a74c-234277bb3b63","Type":"ContainerStarted","Data":"1de71aa1255dfff9b980b04572aaf686eeeda124d8dba573c0270ed536ddb3f2"}
Nov 22 03:52:51 crc kubenswrapper[4922]: I1122 03:52:51.182696 4922 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Nov 22 03:52:51 crc kubenswrapper[4922]: I1122 03:52:51.301009 4922 scope.go:117] "RemoveContainer" containerID="e24886a4c30052c6f5e85a3aef80b4e9d1b9b35cdb2711ab03550c54058c1855"
Nov 22 03:52:51 crc kubenswrapper[4922]: E1122 03:52:51.301247 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d"
Nov 22 03:52:52 crc kubenswrapper[4922]: I1122 03:52:52.195140 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nbrkc" event={"ID":"059bc9c2-5ddd-4511-a74c-234277bb3b63","Type":"ContainerStarted","Data":"69062bce168824ead59d60c392d0c0794745e627c6188762ba78e08be046107e"}
Nov 22 03:52:53 crc kubenswrapper[4922]: I1122 03:52:53.205645 4922 generic.go:334] "Generic (PLEG): container finished" podID="059bc9c2-5ddd-4511-a74c-234277bb3b63" containerID="69062bce168824ead59d60c392d0c0794745e627c6188762ba78e08be046107e" exitCode=0
Nov 22 03:52:53 crc kubenswrapper[4922]: I1122 03:52:53.207170 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nbrkc" event={"ID":"059bc9c2-5ddd-4511-a74c-234277bb3b63","Type":"ContainerDied","Data":"69062bce168824ead59d60c392d0c0794745e627c6188762ba78e08be046107e"}
Nov 22 03:52:54 crc kubenswrapper[4922]: I1122 03:52:54.043125 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-gtphg"]
Nov 22 03:52:54 crc kubenswrapper[4922]: I1122 03:52:54.078283 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-gtphg"]
Nov 22 03:52:54 crc kubenswrapper[4922]: I1122 03:52:54.219612 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nbrkc" event={"ID":"059bc9c2-5ddd-4511-a74c-234277bb3b63","Type":"ContainerStarted","Data":"2bbfd4d4e99ee2b5b1069c8df31329521e3fcee791789c13777d34f8872459e5"}
Nov 22 03:52:54 crc kubenswrapper[4922]: I1122 03:52:54.246103 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nbrkc" podStartSLOduration=2.705036997 podStartE2EDuration="5.246078708s" podCreationTimestamp="2025-11-22 03:52:49 +0000 UTC" firstStartedPulling="2025-11-22 03:52:51.182499439 +0000 UTC m=+3607.221021331" lastFinishedPulling="2025-11-22 03:52:53.72354109 +0000 UTC m=+3609.762063042" observedRunningTime="2025-11-22 03:52:54.239431558 +0000 UTC m=+3610.277953450" watchObservedRunningTime="2025-11-22 03:52:54.246078708 +0000 UTC m=+3610.284600600"
Nov 22 03:52:55 crc kubenswrapper[4922]: I1122 03:52:55.317305 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c570feea-6723-4626-a849-fc67db11ee3e" path="/var/lib/kubelet/pods/c570feea-6723-4626-a849-fc67db11ee3e/volumes"
Nov 22 03:53:00 crc kubenswrapper[4922]: I1122 03:53:00.296544 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nbrkc"
Nov 22 03:53:00 crc kubenswrapper[4922]: I1122 03:53:00.297927 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nbrkc"
Nov 22 03:53:00 crc kubenswrapper[4922]: I1122 03:53:00.348084 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nbrkc"
Nov 22 03:53:01 crc kubenswrapper[4922]: I1122 03:53:01.339003 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nbrkc"
Nov 22 03:53:01 crc kubenswrapper[4922]: I1122 03:53:01.394860 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nbrkc"]
Nov 22 03:53:03 crc kubenswrapper[4922]: I1122 03:53:03.300371 4922 scope.go:117] "RemoveContainer" containerID="e24886a4c30052c6f5e85a3aef80b4e9d1b9b35cdb2711ab03550c54058c1855"
Nov 22 03:53:03 crc kubenswrapper[4922]: E1122 03:53:03.300896 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d"
Nov 22 03:53:03 crc kubenswrapper[4922]: I1122 03:53:03.312683 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nbrkc" podUID="059bc9c2-5ddd-4511-a74c-234277bb3b63" containerName="registry-server" containerID="cri-o://2bbfd4d4e99ee2b5b1069c8df31329521e3fcee791789c13777d34f8872459e5" gracePeriod=2
Nov 22 03:53:03 crc kubenswrapper[4922]: I1122 03:53:03.964865 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nbrkc"
Nov 22 03:53:04 crc kubenswrapper[4922]: I1122 03:53:04.055971 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zxgs\" (UniqueName: \"kubernetes.io/projected/059bc9c2-5ddd-4511-a74c-234277bb3b63-kube-api-access-7zxgs\") pod \"059bc9c2-5ddd-4511-a74c-234277bb3b63\" (UID: \"059bc9c2-5ddd-4511-a74c-234277bb3b63\") "
Nov 22 03:53:04 crc kubenswrapper[4922]: I1122 03:53:04.056037 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/059bc9c2-5ddd-4511-a74c-234277bb3b63-catalog-content\") pod \"059bc9c2-5ddd-4511-a74c-234277bb3b63\" (UID: \"059bc9c2-5ddd-4511-a74c-234277bb3b63\") "
Nov 22 03:53:04 crc kubenswrapper[4922]: I1122 03:53:04.056164 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/059bc9c2-5ddd-4511-a74c-234277bb3b63-utilities\") pod \"059bc9c2-5ddd-4511-a74c-234277bb3b63\" (UID: \"059bc9c2-5ddd-4511-a74c-234277bb3b63\") "
Nov 22 03:53:04 crc kubenswrapper[4922]: I1122 03:53:04.056738 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/059bc9c2-5ddd-4511-a74c-234277bb3b63-utilities" (OuterVolumeSpecName: "utilities") pod "059bc9c2-5ddd-4511-a74c-234277bb3b63" (UID: "059bc9c2-5ddd-4511-a74c-234277bb3b63"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 03:53:04 crc kubenswrapper[4922]: I1122 03:53:04.061825 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/059bc9c2-5ddd-4511-a74c-234277bb3b63-kube-api-access-7zxgs" (OuterVolumeSpecName: "kube-api-access-7zxgs") pod "059bc9c2-5ddd-4511-a74c-234277bb3b63" (UID: "059bc9c2-5ddd-4511-a74c-234277bb3b63"). InnerVolumeSpecName "kube-api-access-7zxgs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 03:53:04 crc kubenswrapper[4922]: I1122 03:53:04.103653 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/059bc9c2-5ddd-4511-a74c-234277bb3b63-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "059bc9c2-5ddd-4511-a74c-234277bb3b63" (UID: "059bc9c2-5ddd-4511-a74c-234277bb3b63"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 03:53:04 crc kubenswrapper[4922]: I1122 03:53:04.159000 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/059bc9c2-5ddd-4511-a74c-234277bb3b63-utilities\") on node \"crc\" DevicePath \"\""
Nov 22 03:53:04 crc kubenswrapper[4922]: I1122 03:53:04.159031 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zxgs\" (UniqueName: \"kubernetes.io/projected/059bc9c2-5ddd-4511-a74c-234277bb3b63-kube-api-access-7zxgs\") on node \"crc\" DevicePath \"\""
Nov 22 03:53:04 crc kubenswrapper[4922]: I1122 03:53:04.159044 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/059bc9c2-5ddd-4511-a74c-234277bb3b63-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 22 03:53:04 crc kubenswrapper[4922]: I1122 03:53:04.324926 4922 generic.go:334] "Generic (PLEG): container finished" podID="059bc9c2-5ddd-4511-a74c-234277bb3b63" containerID="2bbfd4d4e99ee2b5b1069c8df31329521e3fcee791789c13777d34f8872459e5" exitCode=0
Nov 22 03:53:04 crc kubenswrapper[4922]: I1122 03:53:04.324994 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nbrkc"
Nov 22 03:53:04 crc kubenswrapper[4922]: I1122 03:53:04.325013 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nbrkc" event={"ID":"059bc9c2-5ddd-4511-a74c-234277bb3b63","Type":"ContainerDied","Data":"2bbfd4d4e99ee2b5b1069c8df31329521e3fcee791789c13777d34f8872459e5"}
Nov 22 03:53:04 crc kubenswrapper[4922]: I1122 03:53:04.325437 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nbrkc" event={"ID":"059bc9c2-5ddd-4511-a74c-234277bb3b63","Type":"ContainerDied","Data":"1de71aa1255dfff9b980b04572aaf686eeeda124d8dba573c0270ed536ddb3f2"}
Nov 22 03:53:04 crc kubenswrapper[4922]: I1122 03:53:04.325462 4922 scope.go:117] "RemoveContainer" containerID="2bbfd4d4e99ee2b5b1069c8df31329521e3fcee791789c13777d34f8872459e5"
Nov 22 03:53:04 crc kubenswrapper[4922]: I1122 03:53:04.359779 4922 scope.go:117] "RemoveContainer" containerID="69062bce168824ead59d60c392d0c0794745e627c6188762ba78e08be046107e"
Nov 22 03:53:04 crc kubenswrapper[4922]: I1122 03:53:04.372442 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nbrkc"]
Nov 22 03:53:04 crc kubenswrapper[4922]: I1122 03:53:04.382882 4922 scope.go:117] "RemoveContainer" containerID="6c77b35d992487c08a5112a351dc9936b5630592cae557d6bb29b5c2c3304084"
Nov 22 03:53:04 crc kubenswrapper[4922]: I1122 03:53:04.387122 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nbrkc"]
Nov 22 03:53:04 crc kubenswrapper[4922]: I1122 03:53:04.451412 4922 scope.go:117] "RemoveContainer" containerID="2bbfd4d4e99ee2b5b1069c8df31329521e3fcee791789c13777d34f8872459e5"
Nov 22 03:53:04 crc kubenswrapper[4922]: E1122 03:53:04.452424 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bbfd4d4e99ee2b5b1069c8df31329521e3fcee791789c13777d34f8872459e5\": container with ID starting with 2bbfd4d4e99ee2b5b1069c8df31329521e3fcee791789c13777d34f8872459e5 not found: ID does not exist" containerID="2bbfd4d4e99ee2b5b1069c8df31329521e3fcee791789c13777d34f8872459e5"
Nov 22 03:53:04 crc kubenswrapper[4922]: I1122 03:53:04.452474 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bbfd4d4e99ee2b5b1069c8df31329521e3fcee791789c13777d34f8872459e5"} err="failed to get container status \"2bbfd4d4e99ee2b5b1069c8df31329521e3fcee791789c13777d34f8872459e5\": rpc error: code = NotFound desc = could not find container \"2bbfd4d4e99ee2b5b1069c8df31329521e3fcee791789c13777d34f8872459e5\": container with ID starting with 2bbfd4d4e99ee2b5b1069c8df31329521e3fcee791789c13777d34f8872459e5 not found: ID does not exist"
Nov 22 03:53:04 crc kubenswrapper[4922]: I1122 03:53:04.452504 4922 scope.go:117] "RemoveContainer" containerID="69062bce168824ead59d60c392d0c0794745e627c6188762ba78e08be046107e"
Nov 22 03:53:04 crc kubenswrapper[4922]: E1122 03:53:04.452951 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69062bce168824ead59d60c392d0c0794745e627c6188762ba78e08be046107e\": container with ID starting with 69062bce168824ead59d60c392d0c0794745e627c6188762ba78e08be046107e not found: ID does not exist" containerID="69062bce168824ead59d60c392d0c0794745e627c6188762ba78e08be046107e"
Nov 22 03:53:04 crc kubenswrapper[4922]: I1122 03:53:04.452985 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69062bce168824ead59d60c392d0c0794745e627c6188762ba78e08be046107e"} err="failed to get container status \"69062bce168824ead59d60c392d0c0794745e627c6188762ba78e08be046107e\": rpc error: code = NotFound desc = could not find container \"69062bce168824ead59d60c392d0c0794745e627c6188762ba78e08be046107e\": container with ID starting with 69062bce168824ead59d60c392d0c0794745e627c6188762ba78e08be046107e not found: ID does not exist"
Nov 22 03:53:04 crc kubenswrapper[4922]: I1122 03:53:04.453003 4922 scope.go:117] "RemoveContainer" containerID="6c77b35d992487c08a5112a351dc9936b5630592cae557d6bb29b5c2c3304084"
Nov 22 03:53:04 crc kubenswrapper[4922]: E1122 03:53:04.453612 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c77b35d992487c08a5112a351dc9936b5630592cae557d6bb29b5c2c3304084\": container with ID starting with 6c77b35d992487c08a5112a351dc9936b5630592cae557d6bb29b5c2c3304084 not found: ID does not exist" containerID="6c77b35d992487c08a5112a351dc9936b5630592cae557d6bb29b5c2c3304084"
Nov 22 03:53:04 crc kubenswrapper[4922]: I1122 03:53:04.453656 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c77b35d992487c08a5112a351dc9936b5630592cae557d6bb29b5c2c3304084"} err="failed to get container status \"6c77b35d992487c08a5112a351dc9936b5630592cae557d6bb29b5c2c3304084\": rpc error: code = NotFound desc = could not find container \"6c77b35d992487c08a5112a351dc9936b5630592cae557d6bb29b5c2c3304084\": container with ID starting with 6c77b35d992487c08a5112a351dc9936b5630592cae557d6bb29b5c2c3304084 not found: ID does not exist"
Nov 22 03:53:05 crc kubenswrapper[4922]: I1122 03:53:05.313593 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="059bc9c2-5ddd-4511-a74c-234277bb3b63" path="/var/lib/kubelet/pods/059bc9c2-5ddd-4511-a74c-234277bb3b63/volumes"
Nov 22 03:53:06 crc kubenswrapper[4922]: I1122 03:53:06.037612 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-13c6-account-create-2dlsc"]
Nov 22 03:53:06 crc kubenswrapper[4922]: I1122 03:53:06.043714 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-13c6-account-create-2dlsc"]
Nov 22 03:53:07 crc kubenswrapper[4922]: I1122 03:53:07.325591 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da26cef4-8297-4b18-a465-3d69e4ae01f8" path="/var/lib/kubelet/pods/da26cef4-8297-4b18-a465-3d69e4ae01f8/volumes"
Nov 22 03:53:07 crc kubenswrapper[4922]: I1122 03:53:07.590411 4922 scope.go:117] "RemoveContainer" containerID="55bb4c3b28ff64699506229b173e8dd86d6b8f70b8799297ce860d83a5152a1f"
Nov 22 03:53:07 crc kubenswrapper[4922]: I1122 03:53:07.663930 4922 scope.go:117] "RemoveContainer" containerID="a575d9415469463145ffa9f96922a683786ebd802710278e2df33bcf76d590e6"
Nov 22 03:53:17 crc kubenswrapper[4922]: I1122 03:53:17.312320 4922 scope.go:117] "RemoveContainer" containerID="e24886a4c30052c6f5e85a3aef80b4e9d1b9b35cdb2711ab03550c54058c1855"
Nov 22 03:53:17 crc kubenswrapper[4922]: E1122 03:53:17.315387 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d"
Nov 22 03:53:30 crc kubenswrapper[4922]: I1122 03:53:30.300455 4922 scope.go:117] "RemoveContainer" containerID="e24886a4c30052c6f5e85a3aef80b4e9d1b9b35cdb2711ab03550c54058c1855"
Nov 22 03:53:30 crc kubenswrapper[4922]: E1122 03:53:30.301355 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d"
Nov 22 03:53:33 crc kubenswrapper[4922]: I1122 03:53:33.040791 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-wfl57"]
Nov 22 03:53:33 crc kubenswrapper[4922]: I1122 03:53:33.057272 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-wfl57"]
Nov 22 03:53:33 crc kubenswrapper[4922]: I1122 03:53:33.310074 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7904f33-eb73-4d09-b13f-4faf6ec90328" path="/var/lib/kubelet/pods/f7904f33-eb73-4d09-b13f-4faf6ec90328/volumes"
Nov 22 03:53:45 crc kubenswrapper[4922]: I1122 03:53:45.327632 4922 scope.go:117] "RemoveContainer" containerID="e24886a4c30052c6f5e85a3aef80b4e9d1b9b35cdb2711ab03550c54058c1855"
Nov 22 03:53:45 crc kubenswrapper[4922]: E1122 03:53:45.328733 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d"
Nov 22 03:53:58 crc kubenswrapper[4922]: I1122 03:53:58.300289 4922 scope.go:117] "RemoveContainer" containerID="e24886a4c30052c6f5e85a3aef80b4e9d1b9b35cdb2711ab03550c54058c1855"
Nov 22 03:53:58 crc kubenswrapper[4922]: E1122 03:53:58.301187 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d"
Nov 22 03:54:07 crc kubenswrapper[4922]: I1122 03:54:07.804241 4922 scope.go:117] "RemoveContainer" containerID="dfc1b5b594bee7f44fe372cacbf5ee614e5400f7ae9f7868f07414417010e878"
Nov 22 03:54:11 crc kubenswrapper[4922]: I1122 03:54:11.307363 4922 scope.go:117] "RemoveContainer" containerID="e24886a4c30052c6f5e85a3aef80b4e9d1b9b35cdb2711ab03550c54058c1855"
Nov 22 03:54:11 crc kubenswrapper[4922]: E1122 03:54:11.309596 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d"
Nov 22 03:54:22 crc kubenswrapper[4922]: I1122 03:54:22.300563 4922 scope.go:117] "RemoveContainer" containerID="e24886a4c30052c6f5e85a3aef80b4e9d1b9b35cdb2711ab03550c54058c1855"
Nov 22 03:54:22 crc kubenswrapper[4922]: E1122 03:54:22.301327 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d"
Nov 22 03:54:34 crc kubenswrapper[4922]: I1122 03:54:34.301166 4922 scope.go:117] "RemoveContainer" containerID="e24886a4c30052c6f5e85a3aef80b4e9d1b9b35cdb2711ab03550c54058c1855"
Nov 22 03:54:34 crc kubenswrapper[4922]: E1122 03:54:34.302414 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d"
Nov 22 03:54:40 crc kubenswrapper[4922]: I1122 03:54:40.745241 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rpstp"]
Nov 22 03:54:40 crc kubenswrapper[4922]: E1122 03:54:40.746504 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="059bc9c2-5ddd-4511-a74c-234277bb3b63" containerName="registry-server"
Nov 22 03:54:40 crc kubenswrapper[4922]: I1122 03:54:40.746522 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="059bc9c2-5ddd-4511-a74c-234277bb3b63" containerName="registry-server"
Nov 22 03:54:40 crc kubenswrapper[4922]: E1122 03:54:40.746551 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="059bc9c2-5ddd-4511-a74c-234277bb3b63" containerName="extract-content"
Nov 22 03:54:40 crc kubenswrapper[4922]: I1122 03:54:40.746559 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="059bc9c2-5ddd-4511-a74c-234277bb3b63" containerName="extract-content"
Nov 22 03:54:40 crc kubenswrapper[4922]: E1122 03:54:40.746579 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="059bc9c2-5ddd-4511-a74c-234277bb3b63" containerName="extract-utilities"
Nov 22 03:54:40 crc kubenswrapper[4922]: I1122 03:54:40.746586 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="059bc9c2-5ddd-4511-a74c-234277bb3b63" containerName="extract-utilities"
Nov 22 03:54:40 crc kubenswrapper[4922]: I1122 03:54:40.746837 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="059bc9c2-5ddd-4511-a74c-234277bb3b63" containerName="registry-server"
Nov 22 03:54:40 crc kubenswrapper[4922]: I1122 03:54:40.751430 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rpstp"
Nov 22 03:54:40 crc kubenswrapper[4922]: I1122 03:54:40.773205 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rpstp"]
Nov 22 03:54:40 crc kubenswrapper[4922]: I1122 03:54:40.816861 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e359541-9b94-4bb8-8726-527157f87923-utilities\") pod \"certified-operators-rpstp\" (UID: \"7e359541-9b94-4bb8-8726-527157f87923\") " pod="openshift-marketplace/certified-operators-rpstp"
Nov 22 03:54:40 crc kubenswrapper[4922]: I1122 03:54:40.816907 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e359541-9b94-4bb8-8726-527157f87923-catalog-content\") pod \"certified-operators-rpstp\" (UID: \"7e359541-9b94-4bb8-8726-527157f87923\") " pod="openshift-marketplace/certified-operators-rpstp"
Nov 22 03:54:40 crc kubenswrapper[4922]: I1122 03:54:40.816956 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrv4w\" (UniqueName: \"kubernetes.io/projected/7e359541-9b94-4bb8-8726-527157f87923-kube-api-access-lrv4w\") pod \"certified-operators-rpstp\" (UID: \"7e359541-9b94-4bb8-8726-527157f87923\") " pod="openshift-marketplace/certified-operators-rpstp"
Nov 22 03:54:40 crc kubenswrapper[4922]: I1122 03:54:40.918667 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e359541-9b94-4bb8-8726-527157f87923-utilities\") pod \"certified-operators-rpstp\" (UID: \"7e359541-9b94-4bb8-8726-527157f87923\") " pod="openshift-marketplace/certified-operators-rpstp"
Nov 22 03:54:40 crc kubenswrapper[4922]: I1122 03:54:40.919086 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e359541-9b94-4bb8-8726-527157f87923-catalog-content\") pod \"certified-operators-rpstp\" (UID: \"7e359541-9b94-4bb8-8726-527157f87923\") " pod="openshift-marketplace/certified-operators-rpstp"
Nov 22 03:54:40 crc kubenswrapper[4922]: I1122 03:54:40.919184 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrv4w\" (UniqueName: \"kubernetes.io/projected/7e359541-9b94-4bb8-8726-527157f87923-kube-api-access-lrv4w\") pod \"certified-operators-rpstp\" (UID: \"7e359541-9b94-4bb8-8726-527157f87923\") " pod="openshift-marketplace/certified-operators-rpstp"
Nov 22 03:54:40 crc kubenswrapper[4922]: I1122 03:54:40.919784 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e359541-9b94-4bb8-8726-527157f87923-catalog-content\") pod \"certified-operators-rpstp\" (UID: \"7e359541-9b94-4bb8-8726-527157f87923\") " pod="openshift-marketplace/certified-operators-rpstp"
Nov 22 03:54:40 crc kubenswrapper[4922]: I1122 03:54:40.920395 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e359541-9b94-4bb8-8726-527157f87923-utilities\") pod \"certified-operators-rpstp\" (UID: \"7e359541-9b94-4bb8-8726-527157f87923\") " pod="openshift-marketplace/certified-operators-rpstp"
Nov 22 03:54:40 crc kubenswrapper[4922]: I1122 03:54:40.945817 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrv4w\" (UniqueName: \"kubernetes.io/projected/7e359541-9b94-4bb8-8726-527157f87923-kube-api-access-lrv4w\") pod \"certified-operators-rpstp\" (UID: \"7e359541-9b94-4bb8-8726-527157f87923\") " pod="openshift-marketplace/certified-operators-rpstp"
Nov 22 03:54:41 crc kubenswrapper[4922]: I1122 03:54:41.099946 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rpstp"
Nov 22 03:54:41 crc kubenswrapper[4922]: I1122 03:54:41.613577 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rpstp"]
Nov 22 03:54:42 crc kubenswrapper[4922]: I1122 03:54:42.349260 4922 generic.go:334] "Generic (PLEG): container finished" podID="7e359541-9b94-4bb8-8726-527157f87923" containerID="22c37e7f2369cfd88a2077a9daf0e6f96b3695802bfdb8bfeaff64441a661be5" exitCode=0
Nov 22 03:54:42 crc kubenswrapper[4922]: I1122 03:54:42.349507 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rpstp" event={"ID":"7e359541-9b94-4bb8-8726-527157f87923","Type":"ContainerDied","Data":"22c37e7f2369cfd88a2077a9daf0e6f96b3695802bfdb8bfeaff64441a661be5"}
Nov 22 03:54:42 crc kubenswrapper[4922]: I1122 03:54:42.349998 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rpstp" event={"ID":"7e359541-9b94-4bb8-8726-527157f87923","Type":"ContainerStarted","Data":"3a546dec43a9b8e5e420cb47dc442a8c113bbc9d1c8263760d54477847a1ef17"}
Nov 22 03:54:43 crc kubenswrapper[4922]: I1122 03:54:43.360405 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rpstp" event={"ID":"7e359541-9b94-4bb8-8726-527157f87923","Type":"ContainerStarted","Data":"3bf226a38abe91f86fc0e6cb59229085e79e8252c775b0d8e7b29bd3918cb0e7"}
Nov 22 03:54:44 crc kubenswrapper[4922]: I1122 03:54:44.368423 4922 generic.go:334] "Generic (PLEG): container finished" podID="7e359541-9b94-4bb8-8726-527157f87923" containerID="3bf226a38abe91f86fc0e6cb59229085e79e8252c775b0d8e7b29bd3918cb0e7" exitCode=0
Nov 22 03:54:44 crc kubenswrapper[4922]: I1122 03:54:44.368626 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rpstp" event={"ID":"7e359541-9b94-4bb8-8726-527157f87923","Type":"ContainerDied","Data":"3bf226a38abe91f86fc0e6cb59229085e79e8252c775b0d8e7b29bd3918cb0e7"}
Nov 22 03:54:47 crc kubenswrapper[4922]: I1122 03:54:47.402382 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rpstp" event={"ID":"7e359541-9b94-4bb8-8726-527157f87923","Type":"ContainerStarted","Data":"ef15aadc184d331832267b4fc7241c002c59b4605629857518e0b5bda454e85d"}
Nov 22 03:54:47 crc kubenswrapper[4922]: I1122
03:54:47.433555 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rpstp" podStartSLOduration=3.366540806 podStartE2EDuration="7.433521591s" podCreationTimestamp="2025-11-22 03:54:40 +0000 UTC" firstStartedPulling="2025-11-22 03:54:42.351189353 +0000 UTC m=+3718.389711245" lastFinishedPulling="2025-11-22 03:54:46.418170108 +0000 UTC m=+3722.456692030" observedRunningTime="2025-11-22 03:54:47.43056584 +0000 UTC m=+3723.469087722" watchObservedRunningTime="2025-11-22 03:54:47.433521591 +0000 UTC m=+3723.472043523" Nov 22 03:54:49 crc kubenswrapper[4922]: I1122 03:54:49.300974 4922 scope.go:117] "RemoveContainer" containerID="e24886a4c30052c6f5e85a3aef80b4e9d1b9b35cdb2711ab03550c54058c1855" Nov 22 03:54:49 crc kubenswrapper[4922]: E1122 03:54:49.302919 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:54:51 crc kubenswrapper[4922]: I1122 03:54:51.101040 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rpstp" Nov 22 03:54:51 crc kubenswrapper[4922]: I1122 03:54:51.101548 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rpstp" Nov 22 03:54:51 crc kubenswrapper[4922]: I1122 03:54:51.195178 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rpstp" Nov 22 03:54:51 crc kubenswrapper[4922]: I1122 03:54:51.517822 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rpstp" Nov 22 03:54:51 crc kubenswrapper[4922]: I1122 03:54:51.565506 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rpstp"] Nov 22 03:54:53 crc kubenswrapper[4922]: I1122 03:54:53.467496 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rpstp" podUID="7e359541-9b94-4bb8-8726-527157f87923" containerName="registry-server" containerID="cri-o://ef15aadc184d331832267b4fc7241c002c59b4605629857518e0b5bda454e85d" gracePeriod=2 Nov 22 03:54:54 crc kubenswrapper[4922]: I1122 03:54:54.114688 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rpstp" Nov 22 03:54:54 crc kubenswrapper[4922]: I1122 03:54:54.213383 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrv4w\" (UniqueName: \"kubernetes.io/projected/7e359541-9b94-4bb8-8726-527157f87923-kube-api-access-lrv4w\") pod \"7e359541-9b94-4bb8-8726-527157f87923\" (UID: \"7e359541-9b94-4bb8-8726-527157f87923\") " Nov 22 03:54:54 crc kubenswrapper[4922]: I1122 03:54:54.213472 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e359541-9b94-4bb8-8726-527157f87923-catalog-content\") pod \"7e359541-9b94-4bb8-8726-527157f87923\" (UID: \"7e359541-9b94-4bb8-8726-527157f87923\") " Nov 22 03:54:54 crc kubenswrapper[4922]: I1122 03:54:54.213718 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e359541-9b94-4bb8-8726-527157f87923-utilities\") pod \"7e359541-9b94-4bb8-8726-527157f87923\" (UID: \"7e359541-9b94-4bb8-8726-527157f87923\") " Nov 22 03:54:54 crc kubenswrapper[4922]: I1122 03:54:54.215056 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e359541-9b94-4bb8-8726-527157f87923-utilities" (OuterVolumeSpecName: "utilities") pod "7e359541-9b94-4bb8-8726-527157f87923" (UID: "7e359541-9b94-4bb8-8726-527157f87923"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:54:54 crc kubenswrapper[4922]: I1122 03:54:54.231579 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e359541-9b94-4bb8-8726-527157f87923-kube-api-access-lrv4w" (OuterVolumeSpecName: "kube-api-access-lrv4w") pod "7e359541-9b94-4bb8-8726-527157f87923" (UID: "7e359541-9b94-4bb8-8726-527157f87923"). InnerVolumeSpecName "kube-api-access-lrv4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:54:54 crc kubenswrapper[4922]: I1122 03:54:54.284917 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e359541-9b94-4bb8-8726-527157f87923-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7e359541-9b94-4bb8-8726-527157f87923" (UID: "7e359541-9b94-4bb8-8726-527157f87923"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:54:54 crc kubenswrapper[4922]: I1122 03:54:54.317502 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrv4w\" (UniqueName: \"kubernetes.io/projected/7e359541-9b94-4bb8-8726-527157f87923-kube-api-access-lrv4w\") on node \"crc\" DevicePath \"\"" Nov 22 03:54:54 crc kubenswrapper[4922]: I1122 03:54:54.317565 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e359541-9b94-4bb8-8726-527157f87923-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 03:54:54 crc kubenswrapper[4922]: I1122 03:54:54.317584 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e359541-9b94-4bb8-8726-527157f87923-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 03:54:54 crc kubenswrapper[4922]: I1122 03:54:54.481618 4922 generic.go:334] "Generic (PLEG): container finished" podID="7e359541-9b94-4bb8-8726-527157f87923" containerID="ef15aadc184d331832267b4fc7241c002c59b4605629857518e0b5bda454e85d" exitCode=0 Nov 22 03:54:54 crc kubenswrapper[4922]: I1122 03:54:54.481688 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rpstp" event={"ID":"7e359541-9b94-4bb8-8726-527157f87923","Type":"ContainerDied","Data":"ef15aadc184d331832267b4fc7241c002c59b4605629857518e0b5bda454e85d"} Nov 22 03:54:54 crc kubenswrapper[4922]: I1122 03:54:54.481784 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rpstp" event={"ID":"7e359541-9b94-4bb8-8726-527157f87923","Type":"ContainerDied","Data":"3a546dec43a9b8e5e420cb47dc442a8c113bbc9d1c8263760d54477847a1ef17"} Nov 22 03:54:54 crc kubenswrapper[4922]: I1122 03:54:54.481681 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rpstp" Nov 22 03:54:54 crc kubenswrapper[4922]: I1122 03:54:54.481827 4922 scope.go:117] "RemoveContainer" containerID="ef15aadc184d331832267b4fc7241c002c59b4605629857518e0b5bda454e85d" Nov 22 03:54:54 crc kubenswrapper[4922]: I1122 03:54:54.512782 4922 scope.go:117] "RemoveContainer" containerID="3bf226a38abe91f86fc0e6cb59229085e79e8252c775b0d8e7b29bd3918cb0e7" Nov 22 03:54:54 crc kubenswrapper[4922]: I1122 03:54:54.537286 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rpstp"] Nov 22 03:54:54 crc kubenswrapper[4922]: I1122 03:54:54.550868 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rpstp"] Nov 22 03:54:54 crc kubenswrapper[4922]: I1122 03:54:54.555623 4922 scope.go:117] "RemoveContainer" containerID="22c37e7f2369cfd88a2077a9daf0e6f96b3695802bfdb8bfeaff64441a661be5" Nov 22 03:54:54 crc kubenswrapper[4922]: I1122 03:54:54.588500 4922 scope.go:117] "RemoveContainer" containerID="ef15aadc184d331832267b4fc7241c002c59b4605629857518e0b5bda454e85d" Nov 22 03:54:54 crc kubenswrapper[4922]: E1122 03:54:54.588953 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef15aadc184d331832267b4fc7241c002c59b4605629857518e0b5bda454e85d\": container with ID starting with ef15aadc184d331832267b4fc7241c002c59b4605629857518e0b5bda454e85d not found: ID does not exist" containerID="ef15aadc184d331832267b4fc7241c002c59b4605629857518e0b5bda454e85d" Nov 22 03:54:54 crc kubenswrapper[4922]: I1122 03:54:54.588997 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef15aadc184d331832267b4fc7241c002c59b4605629857518e0b5bda454e85d"} err="failed to get container status \"ef15aadc184d331832267b4fc7241c002c59b4605629857518e0b5bda454e85d\": rpc error: code = NotFound desc = could not find container \"ef15aadc184d331832267b4fc7241c002c59b4605629857518e0b5bda454e85d\": container with ID starting with ef15aadc184d331832267b4fc7241c002c59b4605629857518e0b5bda454e85d not found: ID does not exist" Nov 22 03:54:54 crc kubenswrapper[4922]: I1122 03:54:54.589023 4922 scope.go:117] "RemoveContainer" containerID="3bf226a38abe91f86fc0e6cb59229085e79e8252c775b0d8e7b29bd3918cb0e7" Nov 22 03:54:54 crc kubenswrapper[4922]: E1122 03:54:54.589443 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bf226a38abe91f86fc0e6cb59229085e79e8252c775b0d8e7b29bd3918cb0e7\": container with ID starting with 3bf226a38abe91f86fc0e6cb59229085e79e8252c775b0d8e7b29bd3918cb0e7 not found: ID does not exist" containerID="3bf226a38abe91f86fc0e6cb59229085e79e8252c775b0d8e7b29bd3918cb0e7" Nov 22 03:54:54 crc kubenswrapper[4922]: I1122 03:54:54.589477 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bf226a38abe91f86fc0e6cb59229085e79e8252c775b0d8e7b29bd3918cb0e7"} err="failed to get container status \"3bf226a38abe91f86fc0e6cb59229085e79e8252c775b0d8e7b29bd3918cb0e7\": rpc error: code = NotFound desc = could not find container \"3bf226a38abe91f86fc0e6cb59229085e79e8252c775b0d8e7b29bd3918cb0e7\": container with ID starting with 3bf226a38abe91f86fc0e6cb59229085e79e8252c775b0d8e7b29bd3918cb0e7 not found: ID does not exist" Nov 22 03:54:54 crc kubenswrapper[4922]: I1122 03:54:54.589502 4922 scope.go:117] "RemoveContainer" 
containerID="22c37e7f2369cfd88a2077a9daf0e6f96b3695802bfdb8bfeaff64441a661be5" Nov 22 03:54:54 crc kubenswrapper[4922]: E1122 03:54:54.589773 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22c37e7f2369cfd88a2077a9daf0e6f96b3695802bfdb8bfeaff64441a661be5\": container with ID starting with 22c37e7f2369cfd88a2077a9daf0e6f96b3695802bfdb8bfeaff64441a661be5 not found: ID does not exist" containerID="22c37e7f2369cfd88a2077a9daf0e6f96b3695802bfdb8bfeaff64441a661be5" Nov 22 03:54:54 crc kubenswrapper[4922]: I1122 03:54:54.589797 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22c37e7f2369cfd88a2077a9daf0e6f96b3695802bfdb8bfeaff64441a661be5"} err="failed to get container status \"22c37e7f2369cfd88a2077a9daf0e6f96b3695802bfdb8bfeaff64441a661be5\": rpc error: code = NotFound desc = could not find container \"22c37e7f2369cfd88a2077a9daf0e6f96b3695802bfdb8bfeaff64441a661be5\": container with ID starting with 22c37e7f2369cfd88a2077a9daf0e6f96b3695802bfdb8bfeaff64441a661be5 not found: ID does not exist" Nov 22 03:54:55 crc kubenswrapper[4922]: I1122 03:54:55.313064 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e359541-9b94-4bb8-8726-527157f87923" path="/var/lib/kubelet/pods/7e359541-9b94-4bb8-8726-527157f87923/volumes" Nov 22 03:55:02 crc kubenswrapper[4922]: I1122 03:55:02.300933 4922 scope.go:117] "RemoveContainer" containerID="e24886a4c30052c6f5e85a3aef80b4e9d1b9b35cdb2711ab03550c54058c1855" Nov 22 03:55:02 crc kubenswrapper[4922]: E1122 03:55:02.302253 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:55:13 crc kubenswrapper[4922]: I1122 03:55:13.301482 4922 scope.go:117] "RemoveContainer" containerID="e24886a4c30052c6f5e85a3aef80b4e9d1b9b35cdb2711ab03550c54058c1855" Nov 22 03:55:13 crc kubenswrapper[4922]: E1122 03:55:13.302484 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:55:24 crc kubenswrapper[4922]: I1122 03:55:24.950778 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6tsww"] Nov 22 03:55:24 crc kubenswrapper[4922]: E1122 03:55:24.951725 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e359541-9b94-4bb8-8726-527157f87923" containerName="registry-server" Nov 22 03:55:24 crc kubenswrapper[4922]: I1122 03:55:24.951739 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e359541-9b94-4bb8-8726-527157f87923" containerName="registry-server" Nov 22 03:55:24 crc kubenswrapper[4922]: E1122 03:55:24.951768 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e359541-9b94-4bb8-8726-527157f87923" containerName="extract-utilities" Nov 22 03:55:24 crc kubenswrapper[4922]: I1122 
03:55:24.951779 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e359541-9b94-4bb8-8726-527157f87923" containerName="extract-utilities" Nov 22 03:55:24 crc kubenswrapper[4922]: E1122 03:55:24.951799 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e359541-9b94-4bb8-8726-527157f87923" containerName="extract-content" Nov 22 03:55:24 crc kubenswrapper[4922]: I1122 03:55:24.951806 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e359541-9b94-4bb8-8726-527157f87923" containerName="extract-content" Nov 22 03:55:24 crc kubenswrapper[4922]: I1122 03:55:24.952092 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e359541-9b94-4bb8-8726-527157f87923" containerName="registry-server" Nov 22 03:55:24 crc kubenswrapper[4922]: I1122 03:55:24.953875 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6tsww" Nov 22 03:55:24 crc kubenswrapper[4922]: I1122 03:55:24.978479 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6tsww"] Nov 22 03:55:24 crc kubenswrapper[4922]: I1122 03:55:24.992442 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/871dd75e-e76d-468b-88e2-ce27f0faf1ec-utilities\") pod \"redhat-marketplace-6tsww\" (UID: \"871dd75e-e76d-468b-88e2-ce27f0faf1ec\") " pod="openshift-marketplace/redhat-marketplace-6tsww" Nov 22 03:55:24 crc kubenswrapper[4922]: I1122 03:55:24.992672 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/871dd75e-e76d-468b-88e2-ce27f0faf1ec-catalog-content\") pod \"redhat-marketplace-6tsww\" (UID: \"871dd75e-e76d-468b-88e2-ce27f0faf1ec\") " pod="openshift-marketplace/redhat-marketplace-6tsww" Nov 22 03:55:24 crc kubenswrapper[4922]: I1122 03:55:24.992714 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwbxh\" (UniqueName: \"kubernetes.io/projected/871dd75e-e76d-468b-88e2-ce27f0faf1ec-kube-api-access-wwbxh\") pod \"redhat-marketplace-6tsww\" (UID: \"871dd75e-e76d-468b-88e2-ce27f0faf1ec\") " pod="openshift-marketplace/redhat-marketplace-6tsww" Nov 22 03:55:25 crc kubenswrapper[4922]: I1122 03:55:25.094132 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/871dd75e-e76d-468b-88e2-ce27f0faf1ec-utilities\") pod \"redhat-marketplace-6tsww\" (UID: \"871dd75e-e76d-468b-88e2-ce27f0faf1ec\") " pod="openshift-marketplace/redhat-marketplace-6tsww" Nov 22 03:55:25 crc kubenswrapper[4922]: I1122 03:55:25.094365 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/871dd75e-e76d-468b-88e2-ce27f0faf1ec-catalog-content\") pod \"redhat-marketplace-6tsww\" (UID: \"871dd75e-e76d-468b-88e2-ce27f0faf1ec\") " pod="openshift-marketplace/redhat-marketplace-6tsww" Nov 22 03:55:25 crc kubenswrapper[4922]: I1122 03:55:25.094433 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwbxh\" (UniqueName: \"kubernetes.io/projected/871dd75e-e76d-468b-88e2-ce27f0faf1ec-kube-api-access-wwbxh\") pod \"redhat-marketplace-6tsww\" (UID: \"871dd75e-e76d-468b-88e2-ce27f0faf1ec\") " pod="openshift-marketplace/redhat-marketplace-6tsww" Nov 22 03:55:25 
crc kubenswrapper[4922]: I1122 03:55:25.094916 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/871dd75e-e76d-468b-88e2-ce27f0faf1ec-catalog-content\") pod \"redhat-marketplace-6tsww\" (UID: \"871dd75e-e76d-468b-88e2-ce27f0faf1ec\") " pod="openshift-marketplace/redhat-marketplace-6tsww" Nov 22 03:55:25 crc kubenswrapper[4922]: I1122 03:55:25.094915 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/871dd75e-e76d-468b-88e2-ce27f0faf1ec-utilities\") pod \"redhat-marketplace-6tsww\" (UID: \"871dd75e-e76d-468b-88e2-ce27f0faf1ec\") " pod="openshift-marketplace/redhat-marketplace-6tsww" Nov 22 03:55:25 crc kubenswrapper[4922]: I1122 03:55:25.118053 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwbxh\" (UniqueName: \"kubernetes.io/projected/871dd75e-e76d-468b-88e2-ce27f0faf1ec-kube-api-access-wwbxh\") pod \"redhat-marketplace-6tsww\" (UID: \"871dd75e-e76d-468b-88e2-ce27f0faf1ec\") " pod="openshift-marketplace/redhat-marketplace-6tsww" Nov 22 03:55:25 crc kubenswrapper[4922]: I1122 03:55:25.273519 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6tsww" Nov 22 03:55:25 crc kubenswrapper[4922]: I1122 03:55:25.309692 4922 scope.go:117] "RemoveContainer" containerID="e24886a4c30052c6f5e85a3aef80b4e9d1b9b35cdb2711ab03550c54058c1855" Nov 22 03:55:25 crc kubenswrapper[4922]: E1122 03:55:25.309996 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:55:25 crc kubenswrapper[4922]: I1122 03:55:25.722700 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6tsww"] Nov 22 03:55:25 crc kubenswrapper[4922]: W1122 03:55:25.732774 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod871dd75e_e76d_468b_88e2_ce27f0faf1ec.slice/crio-1e6b724466ce2318cd25dc62f294de94eaf512801ccbf3114a82490d1d7ff938 WatchSource:0}: Error finding container 1e6b724466ce2318cd25dc62f294de94eaf512801ccbf3114a82490d1d7ff938: Status 404 returned error can't find the container with id 1e6b724466ce2318cd25dc62f294de94eaf512801ccbf3114a82490d1d7ff938 Nov 22 03:55:25 crc kubenswrapper[4922]: I1122 03:55:25.789271 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6tsww" event={"ID":"871dd75e-e76d-468b-88e2-ce27f0faf1ec","Type":"ContainerStarted","Data":"1e6b724466ce2318cd25dc62f294de94eaf512801ccbf3114a82490d1d7ff938"} Nov 22 03:55:26 crc kubenswrapper[4922]: I1122 03:55:26.798873 4922 generic.go:334] "Generic (PLEG): container finished" podID="871dd75e-e76d-468b-88e2-ce27f0faf1ec" containerID="08b89894e5f5b9ef91723eaf3329fc33660184362a46137238f4f4d8a374f706" exitCode=0 Nov 22 03:55:26 crc kubenswrapper[4922]: I1122 03:55:26.798969 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6tsww" 
event={"ID":"871dd75e-e76d-468b-88e2-ce27f0faf1ec","Type":"ContainerDied","Data":"08b89894e5f5b9ef91723eaf3329fc33660184362a46137238f4f4d8a374f706"} Nov 22 03:55:28 crc kubenswrapper[4922]: I1122 03:55:28.820067 4922 generic.go:334] "Generic (PLEG): container finished" podID="871dd75e-e76d-468b-88e2-ce27f0faf1ec" containerID="f2f4525c1fcf35d944d2692c3e08bd461009659d534cc12c297f49e9494a4b31" exitCode=0 Nov 22 03:55:28 crc kubenswrapper[4922]: I1122 03:55:28.820211 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6tsww" event={"ID":"871dd75e-e76d-468b-88e2-ce27f0faf1ec","Type":"ContainerDied","Data":"f2f4525c1fcf35d944d2692c3e08bd461009659d534cc12c297f49e9494a4b31"} Nov 22 03:55:29 crc kubenswrapper[4922]: I1122 03:55:29.835255 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6tsww" event={"ID":"871dd75e-e76d-468b-88e2-ce27f0faf1ec","Type":"ContainerStarted","Data":"906af906b8bf1e99a052426c1e7025991ed70638846a90f14a0067af1e611228"} Nov 22 03:55:29 crc kubenswrapper[4922]: I1122 03:55:29.865413 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6tsww" podStartSLOduration=3.390799703 podStartE2EDuration="5.864824669s" podCreationTimestamp="2025-11-22 03:55:24 +0000 UTC" firstStartedPulling="2025-11-22 03:55:26.801106186 +0000 UTC m=+3762.839628078" lastFinishedPulling="2025-11-22 03:55:29.275131152 +0000 UTC m=+3765.313653044" observedRunningTime="2025-11-22 03:55:29.857167314 +0000 UTC m=+3765.895689206" watchObservedRunningTime="2025-11-22 03:55:29.864824669 +0000 UTC m=+3765.903346571" Nov 22 03:55:35 crc kubenswrapper[4922]: I1122 03:55:35.274381 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6tsww" Nov 22 03:55:35 crc kubenswrapper[4922]: I1122 03:55:35.275028 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6tsww" Nov 22 03:55:35 crc kubenswrapper[4922]: I1122 03:55:35.350542 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6tsww" Nov 22 03:55:35 crc kubenswrapper[4922]: I1122 03:55:35.995313 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6tsww" Nov 22 03:55:36 crc kubenswrapper[4922]: I1122 03:55:36.050877 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6tsww"] Nov 22 03:55:37 crc kubenswrapper[4922]: I1122 03:55:37.939809 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6tsww" podUID="871dd75e-e76d-468b-88e2-ce27f0faf1ec" containerName="registry-server" containerID="cri-o://906af906b8bf1e99a052426c1e7025991ed70638846a90f14a0067af1e611228" gracePeriod=2 Nov 22 03:55:38 crc kubenswrapper[4922]: I1122 03:55:38.557391 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6tsww" Nov 22 03:55:38 crc kubenswrapper[4922]: I1122 03:55:38.688501 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwbxh\" (UniqueName: \"kubernetes.io/projected/871dd75e-e76d-468b-88e2-ce27f0faf1ec-kube-api-access-wwbxh\") pod \"871dd75e-e76d-468b-88e2-ce27f0faf1ec\" (UID: \"871dd75e-e76d-468b-88e2-ce27f0faf1ec\") " Nov 22 03:55:38 crc kubenswrapper[4922]: I1122 03:55:38.688588 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/871dd75e-e76d-468b-88e2-ce27f0faf1ec-utilities\") pod \"871dd75e-e76d-468b-88e2-ce27f0faf1ec\" (UID: \"871dd75e-e76d-468b-88e2-ce27f0faf1ec\") " Nov 22 03:55:38 crc kubenswrapper[4922]: I1122 03:55:38.688661 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/871dd75e-e76d-468b-88e2-ce27f0faf1ec-catalog-content\") pod \"871dd75e-e76d-468b-88e2-ce27f0faf1ec\" (UID: \"871dd75e-e76d-468b-88e2-ce27f0faf1ec\") " Nov 22 03:55:38 crc kubenswrapper[4922]: I1122 03:55:38.690072 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/871dd75e-e76d-468b-88e2-ce27f0faf1ec-utilities" (OuterVolumeSpecName: "utilities") pod "871dd75e-e76d-468b-88e2-ce27f0faf1ec" (UID: "871dd75e-e76d-468b-88e2-ce27f0faf1ec"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:55:38 crc kubenswrapper[4922]: I1122 03:55:38.704062 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/871dd75e-e76d-468b-88e2-ce27f0faf1ec-kube-api-access-wwbxh" (OuterVolumeSpecName: "kube-api-access-wwbxh") pod "871dd75e-e76d-468b-88e2-ce27f0faf1ec" (UID: "871dd75e-e76d-468b-88e2-ce27f0faf1ec"). InnerVolumeSpecName "kube-api-access-wwbxh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:55:38 crc kubenswrapper[4922]: I1122 03:55:38.791404 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwbxh\" (UniqueName: \"kubernetes.io/projected/871dd75e-e76d-468b-88e2-ce27f0faf1ec-kube-api-access-wwbxh\") on node \"crc\" DevicePath \"\"" Nov 22 03:55:38 crc kubenswrapper[4922]: I1122 03:55:38.791438 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/871dd75e-e76d-468b-88e2-ce27f0faf1ec-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 03:55:38 crc kubenswrapper[4922]: I1122 03:55:38.840815 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/871dd75e-e76d-468b-88e2-ce27f0faf1ec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "871dd75e-e76d-468b-88e2-ce27f0faf1ec" (UID: "871dd75e-e76d-468b-88e2-ce27f0faf1ec"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:55:38 crc kubenswrapper[4922]: I1122 03:55:38.892824 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/871dd75e-e76d-468b-88e2-ce27f0faf1ec-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 03:55:38 crc kubenswrapper[4922]: I1122 03:55:38.950661 4922 generic.go:334] "Generic (PLEG): container finished" podID="871dd75e-e76d-468b-88e2-ce27f0faf1ec" containerID="906af906b8bf1e99a052426c1e7025991ed70638846a90f14a0067af1e611228" exitCode=0 Nov 22 03:55:38 crc kubenswrapper[4922]: I1122 03:55:38.950700 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6tsww" event={"ID":"871dd75e-e76d-468b-88e2-ce27f0faf1ec","Type":"ContainerDied","Data":"906af906b8bf1e99a052426c1e7025991ed70638846a90f14a0067af1e611228"} Nov 22 03:55:38 crc kubenswrapper[4922]: I1122 03:55:38.950728 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6tsww" event={"ID":"871dd75e-e76d-468b-88e2-ce27f0faf1ec","Type":"ContainerDied","Data":"1e6b724466ce2318cd25dc62f294de94eaf512801ccbf3114a82490d1d7ff938"} Nov 22 03:55:38 crc kubenswrapper[4922]: I1122 03:55:38.950747 4922 scope.go:117] "RemoveContainer" containerID="906af906b8bf1e99a052426c1e7025991ed70638846a90f14a0067af1e611228" Nov 22 03:55:38 crc kubenswrapper[4922]: I1122 03:55:38.950779 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6tsww" Nov 22 03:55:38 crc kubenswrapper[4922]: I1122 03:55:38.985893 4922 scope.go:117] "RemoveContainer" containerID="f2f4525c1fcf35d944d2692c3e08bd461009659d534cc12c297f49e9494a4b31" Nov 22 03:55:39 crc kubenswrapper[4922]: I1122 03:55:39.004585 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6tsww"] Nov 22 03:55:39 crc kubenswrapper[4922]: I1122 03:55:39.004644 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6tsww"] Nov 22 03:55:39 crc kubenswrapper[4922]: I1122 03:55:39.028149 4922 scope.go:117] "RemoveContainer" containerID="08b89894e5f5b9ef91723eaf3329fc33660184362a46137238f4f4d8a374f706" Nov 22 03:55:39 crc kubenswrapper[4922]: I1122 03:55:39.075238 4922 scope.go:117] "RemoveContainer" containerID="906af906b8bf1e99a052426c1e7025991ed70638846a90f14a0067af1e611228" Nov 22 03:55:39 crc kubenswrapper[4922]: E1122 03:55:39.075746 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"906af906b8bf1e99a052426c1e7025991ed70638846a90f14a0067af1e611228\": container with ID starting with 906af906b8bf1e99a052426c1e7025991ed70638846a90f14a0067af1e611228 not found: ID does not exist" containerID="906af906b8bf1e99a052426c1e7025991ed70638846a90f14a0067af1e611228" Nov 22 03:55:39 crc kubenswrapper[4922]: I1122 03:55:39.075776 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"906af906b8bf1e99a052426c1e7025991ed70638846a90f14a0067af1e611228"} err="failed to get container status \"906af906b8bf1e99a052426c1e7025991ed70638846a90f14a0067af1e611228\": rpc error: code = NotFound desc = could not find container \"906af906b8bf1e99a052426c1e7025991ed70638846a90f14a0067af1e611228\": container with ID starting with 906af906b8bf1e99a052426c1e7025991ed70638846a90f14a0067af1e611228 not found: ID does not exist" Nov 22 03:55:39 
crc kubenswrapper[4922]: I1122 03:55:39.075798 4922 scope.go:117] "RemoveContainer" containerID="f2f4525c1fcf35d944d2692c3e08bd461009659d534cc12c297f49e9494a4b31" Nov 22 03:55:39 crc kubenswrapper[4922]: E1122 03:55:39.076128 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2f4525c1fcf35d944d2692c3e08bd461009659d534cc12c297f49e9494a4b31\": container with ID starting with f2f4525c1fcf35d944d2692c3e08bd461009659d534cc12c297f49e9494a4b31 not found: ID does not exist" containerID="f2f4525c1fcf35d944d2692c3e08bd461009659d534cc12c297f49e9494a4b31" Nov 22 03:55:39 crc kubenswrapper[4922]: I1122 03:55:39.076171 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2f4525c1fcf35d944d2692c3e08bd461009659d534cc12c297f49e9494a4b31"} err="failed to get container status \"f2f4525c1fcf35d944d2692c3e08bd461009659d534cc12c297f49e9494a4b31\": rpc error: code = NotFound desc = could not find container \"f2f4525c1fcf35d944d2692c3e08bd461009659d534cc12c297f49e9494a4b31\": container with ID starting with f2f4525c1fcf35d944d2692c3e08bd461009659d534cc12c297f49e9494a4b31 not found: ID does not exist" Nov 22 03:55:39 crc kubenswrapper[4922]: I1122 03:55:39.076202 4922 scope.go:117] "RemoveContainer" containerID="08b89894e5f5b9ef91723eaf3329fc33660184362a46137238f4f4d8a374f706" Nov 22 03:55:39 crc kubenswrapper[4922]: E1122 03:55:39.077172 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08b89894e5f5b9ef91723eaf3329fc33660184362a46137238f4f4d8a374f706\": container with ID starting with 08b89894e5f5b9ef91723eaf3329fc33660184362a46137238f4f4d8a374f706 not found: ID does not exist" containerID="08b89894e5f5b9ef91723eaf3329fc33660184362a46137238f4f4d8a374f706" Nov 22 03:55:39 crc kubenswrapper[4922]: I1122 03:55:39.077196 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08b89894e5f5b9ef91723eaf3329fc33660184362a46137238f4f4d8a374f706"} err="failed to get container status \"08b89894e5f5b9ef91723eaf3329fc33660184362a46137238f4f4d8a374f706\": rpc error: code = NotFound desc = could not find container \"08b89894e5f5b9ef91723eaf3329fc33660184362a46137238f4f4d8a374f706\": container with ID starting with 08b89894e5f5b9ef91723eaf3329fc33660184362a46137238f4f4d8a374f706 not found: ID does not exist" Nov 22 03:55:39 crc kubenswrapper[4922]: I1122 03:55:39.315820 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="871dd75e-e76d-468b-88e2-ce27f0faf1ec" path="/var/lib/kubelet/pods/871dd75e-e76d-468b-88e2-ce27f0faf1ec/volumes" Nov 22 03:55:40 crc kubenswrapper[4922]: I1122 03:55:40.300897 4922 scope.go:117] "RemoveContainer" containerID="e24886a4c30052c6f5e85a3aef80b4e9d1b9b35cdb2711ab03550c54058c1855" Nov 22 03:55:40 crc kubenswrapper[4922]: E1122 03:55:40.301479 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:55:55 crc kubenswrapper[4922]: I1122 03:55:55.308607 4922 scope.go:117] "RemoveContainer" 
containerID="e24886a4c30052c6f5e85a3aef80b4e9d1b9b35cdb2711ab03550c54058c1855" Nov 22 03:55:55 crc kubenswrapper[4922]: E1122 03:55:55.309649 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:56:09 crc kubenswrapper[4922]: I1122 03:56:09.302408 4922 scope.go:117] "RemoveContainer" containerID="e24886a4c30052c6f5e85a3aef80b4e9d1b9b35cdb2711ab03550c54058c1855" Nov 22 03:56:09 crc kubenswrapper[4922]: E1122 03:56:09.303383 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:56:23 crc kubenswrapper[4922]: I1122 03:56:23.300828 4922 scope.go:117] "RemoveContainer" containerID="e24886a4c30052c6f5e85a3aef80b4e9d1b9b35cdb2711ab03550c54058c1855" Nov 22 03:56:23 crc kubenswrapper[4922]: E1122 03:56:23.302333 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:56:36 crc kubenswrapper[4922]: I1122 03:56:36.301375 4922 scope.go:117] "RemoveContainer" containerID="e24886a4c30052c6f5e85a3aef80b4e9d1b9b35cdb2711ab03550c54058c1855" Nov 22 03:56:36 crc kubenswrapper[4922]: E1122 03:56:36.302458 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:56:47 crc kubenswrapper[4922]: I1122 03:56:47.301440 4922 scope.go:117] "RemoveContainer" containerID="e24886a4c30052c6f5e85a3aef80b4e9d1b9b35cdb2711ab03550c54058c1855" Nov 22 03:56:47 crc kubenswrapper[4922]: E1122 03:56:47.302704 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:57:00 crc kubenswrapper[4922]: I1122 03:57:00.301167 4922 scope.go:117] "RemoveContainer" containerID="e24886a4c30052c6f5e85a3aef80b4e9d1b9b35cdb2711ab03550c54058c1855" Nov 22 03:57:00 crc kubenswrapper[4922]: E1122 03:57:00.302757 4922 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 03:57:14 crc kubenswrapper[4922]: I1122 03:57:14.300606 4922 scope.go:117] "RemoveContainer" containerID="e24886a4c30052c6f5e85a3aef80b4e9d1b9b35cdb2711ab03550c54058c1855" Nov 22 03:57:15 crc kubenswrapper[4922]: I1122 03:57:15.150646 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" event={"ID":"402683b1-a29f-4a79-a36c-daf6e8068d0d","Type":"ContainerStarted","Data":"4f0f1e3813219a10c0f92c69597630f930905fec7c9f7da18bc165f69f875281"} Nov 22 03:57:16 crc kubenswrapper[4922]: I1122 03:57:16.698032 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2x2rd"] Nov 22 03:57:16 crc kubenswrapper[4922]: E1122 03:57:16.699120 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="871dd75e-e76d-468b-88e2-ce27f0faf1ec" containerName="extract-utilities" Nov 22 03:57:16 crc kubenswrapper[4922]: I1122 03:57:16.699136 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="871dd75e-e76d-468b-88e2-ce27f0faf1ec" containerName="extract-utilities" Nov 22 03:57:16 crc kubenswrapper[4922]: E1122 03:57:16.699150 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="871dd75e-e76d-468b-88e2-ce27f0faf1ec" containerName="registry-server" Nov 22 03:57:16 crc kubenswrapper[4922]: I1122 03:57:16.699158 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="871dd75e-e76d-468b-88e2-ce27f0faf1ec" containerName="registry-server" Nov 22 03:57:16 crc kubenswrapper[4922]: E1122 03:57:16.699191 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="871dd75e-e76d-468b-88e2-ce27f0faf1ec" containerName="extract-content" Nov 22 03:57:16 crc kubenswrapper[4922]: I1122 03:57:16.699200 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="871dd75e-e76d-468b-88e2-ce27f0faf1ec" containerName="extract-content" Nov 22 03:57:16 crc kubenswrapper[4922]: I1122 03:57:16.699619 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="871dd75e-e76d-468b-88e2-ce27f0faf1ec" containerName="registry-server" Nov 22 03:57:16 crc kubenswrapper[4922]: I1122 03:57:16.701373 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2x2rd" Nov 22 03:57:16 crc kubenswrapper[4922]: I1122 03:57:16.713276 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2x2rd"] Nov 22 03:57:16 crc kubenswrapper[4922]: I1122 03:57:16.805117 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a1f43f0-f69f-4aa0-86c8-6a408f4edf53-catalog-content\") pod \"redhat-operators-2x2rd\" (UID: \"2a1f43f0-f69f-4aa0-86c8-6a408f4edf53\") " pod="openshift-marketplace/redhat-operators-2x2rd" Nov 22 03:57:16 crc kubenswrapper[4922]: I1122 03:57:16.805204 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a1f43f0-f69f-4aa0-86c8-6a408f4edf53-utilities\") pod \"redhat-operators-2x2rd\" (UID: \"2a1f43f0-f69f-4aa0-86c8-6a408f4edf53\") " pod="openshift-marketplace/redhat-operators-2x2rd" Nov 22 03:57:16 crc kubenswrapper[4922]: I1122 03:57:16.805379 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kh6gw\" (UniqueName: \"kubernetes.io/projected/2a1f43f0-f69f-4aa0-86c8-6a408f4edf53-kube-api-access-kh6gw\") pod \"redhat-operators-2x2rd\" (UID: \"2a1f43f0-f69f-4aa0-86c8-6a408f4edf53\") " pod="openshift-marketplace/redhat-operators-2x2rd" Nov 22 03:57:16 crc kubenswrapper[4922]: I1122 03:57:16.906590 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kh6gw\" (UniqueName: \"kubernetes.io/projected/2a1f43f0-f69f-4aa0-86c8-6a408f4edf53-kube-api-access-kh6gw\") pod \"redhat-operators-2x2rd\" (UID: \"2a1f43f0-f69f-4aa0-86c8-6a408f4edf53\") " pod="openshift-marketplace/redhat-operators-2x2rd" Nov 22 03:57:16 crc kubenswrapper[4922]: I1122 03:57:16.906706 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a1f43f0-f69f-4aa0-86c8-6a408f4edf53-catalog-content\") pod \"redhat-operators-2x2rd\" (UID: \"2a1f43f0-f69f-4aa0-86c8-6a408f4edf53\") " pod="openshift-marketplace/redhat-operators-2x2rd" Nov 22 03:57:16 crc kubenswrapper[4922]: I1122 03:57:16.906736 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a1f43f0-f69f-4aa0-86c8-6a408f4edf53-utilities\") pod \"redhat-operators-2x2rd\" (UID: \"2a1f43f0-f69f-4aa0-86c8-6a408f4edf53\") " pod="openshift-marketplace/redhat-operators-2x2rd" Nov 22 03:57:16 crc kubenswrapper[4922]: I1122 03:57:16.907285 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a1f43f0-f69f-4aa0-86c8-6a408f4edf53-utilities\") pod \"redhat-operators-2x2rd\" (UID: \"2a1f43f0-f69f-4aa0-86c8-6a408f4edf53\") " pod="openshift-marketplace/redhat-operators-2x2rd" Nov 22 03:57:16 crc kubenswrapper[4922]: I1122 03:57:16.907346 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a1f43f0-f69f-4aa0-86c8-6a408f4edf53-catalog-content\") pod \"redhat-operators-2x2rd\" (UID: \"2a1f43f0-f69f-4aa0-86c8-6a408f4edf53\") " pod="openshift-marketplace/redhat-operators-2x2rd" Nov 22 03:57:16 crc kubenswrapper[4922]: I1122 03:57:16.927566 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-kh6gw\" (UniqueName: \"kubernetes.io/projected/2a1f43f0-f69f-4aa0-86c8-6a408f4edf53-kube-api-access-kh6gw\") pod \"redhat-operators-2x2rd\" (UID: \"2a1f43f0-f69f-4aa0-86c8-6a408f4edf53\") " pod="openshift-marketplace/redhat-operators-2x2rd" Nov 22 03:57:17 crc kubenswrapper[4922]: I1122 03:57:17.034269 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2x2rd" Nov 22 03:57:17 crc kubenswrapper[4922]: I1122 03:57:17.512560 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2x2rd"] Nov 22 03:57:18 crc kubenswrapper[4922]: I1122 03:57:18.188141 4922 generic.go:334] "Generic (PLEG): container finished" podID="2a1f43f0-f69f-4aa0-86c8-6a408f4edf53" containerID="98df73ed53f4ad716ee313afd243745b54cadc49f693636a296cfc91133b738a" exitCode=0 Nov 22 03:57:18 crc kubenswrapper[4922]: I1122 03:57:18.188282 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2x2rd" event={"ID":"2a1f43f0-f69f-4aa0-86c8-6a408f4edf53","Type":"ContainerDied","Data":"98df73ed53f4ad716ee313afd243745b54cadc49f693636a296cfc91133b738a"} Nov 22 03:57:18 crc kubenswrapper[4922]: I1122 03:57:18.188945 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2x2rd" event={"ID":"2a1f43f0-f69f-4aa0-86c8-6a408f4edf53","Type":"ContainerStarted","Data":"25ced1bc856454ad224774841c188a11dbb7038adff3abce052c1c8dbc8f448d"} Nov 22 03:57:20 crc kubenswrapper[4922]: I1122 03:57:20.208783 4922 generic.go:334] "Generic (PLEG): container finished" podID="2a1f43f0-f69f-4aa0-86c8-6a408f4edf53" containerID="46fe30b335c1a9bac75aeae29bf9e16ad583632d0daf855efbef3583ce1b6b90" exitCode=0 Nov 22 03:57:20 crc kubenswrapper[4922]: I1122 03:57:20.208884 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2x2rd" event={"ID":"2a1f43f0-f69f-4aa0-86c8-6a408f4edf53","Type":"ContainerDied","Data":"46fe30b335c1a9bac75aeae29bf9e16ad583632d0daf855efbef3583ce1b6b90"} Nov 22 03:57:21 crc kubenswrapper[4922]: I1122 03:57:21.222668 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2x2rd" event={"ID":"2a1f43f0-f69f-4aa0-86c8-6a408f4edf53","Type":"ContainerStarted","Data":"2b2078b052b495232a10560eec86f631cf3fd31581eb45a72a112048311472f3"} Nov 22 03:57:21 crc kubenswrapper[4922]: I1122 03:57:21.255148 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2x2rd" podStartSLOduration=2.820356742 podStartE2EDuration="5.255127402s" podCreationTimestamp="2025-11-22 03:57:16 +0000 UTC" firstStartedPulling="2025-11-22 03:57:18.195293292 +0000 UTC m=+3874.233815214" lastFinishedPulling="2025-11-22 03:57:20.630063982 +0000 UTC m=+3876.668585874" observedRunningTime="2025-11-22 03:57:21.239484895 +0000 UTC m=+3877.278006817" watchObservedRunningTime="2025-11-22 03:57:21.255127402 +0000 UTC m=+3877.293649314" Nov 22 03:57:27 crc kubenswrapper[4922]: I1122 03:57:27.034942 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2x2rd" Nov 22 03:57:27 crc kubenswrapper[4922]: I1122 03:57:27.035583 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2x2rd" Nov 22 03:57:28 crc kubenswrapper[4922]: I1122 03:57:28.128143 4922 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-2x2rd" podUID="2a1f43f0-f69f-4aa0-86c8-6a408f4edf53" containerName="registry-server" probeResult="failure" output=< Nov 22 03:57:28 crc kubenswrapper[4922]: timeout: failed to connect service ":50051" within 1s Nov 22 03:57:28 crc kubenswrapper[4922]: > Nov 22 03:57:37 crc kubenswrapper[4922]: I1122 03:57:37.733670 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2x2rd" Nov 22 03:57:37 crc kubenswrapper[4922]: I1122 03:57:37.787099 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2x2rd" Nov 22 03:57:37 crc kubenswrapper[4922]: I1122 03:57:37.973042 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2x2rd"] Nov 22 03:57:39 crc kubenswrapper[4922]: I1122 03:57:39.428123 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2x2rd" podUID="2a1f43f0-f69f-4aa0-86c8-6a408f4edf53" containerName="registry-server" containerID="cri-o://2b2078b052b495232a10560eec86f631cf3fd31581eb45a72a112048311472f3" gracePeriod=2 Nov 22 03:57:40 crc kubenswrapper[4922]: I1122 03:57:40.066257 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2x2rd" Nov 22 03:57:40 crc kubenswrapper[4922]: I1122 03:57:40.173554 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a1f43f0-f69f-4aa0-86c8-6a408f4edf53-catalog-content\") pod \"2a1f43f0-f69f-4aa0-86c8-6a408f4edf53\" (UID: \"2a1f43f0-f69f-4aa0-86c8-6a408f4edf53\") " Nov 22 03:57:40 crc kubenswrapper[4922]: I1122 03:57:40.173765 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kh6gw\" (UniqueName: \"kubernetes.io/projected/2a1f43f0-f69f-4aa0-86c8-6a408f4edf53-kube-api-access-kh6gw\") pod \"2a1f43f0-f69f-4aa0-86c8-6a408f4edf53\" (UID: \"2a1f43f0-f69f-4aa0-86c8-6a408f4edf53\") " Nov 22 03:57:40 crc kubenswrapper[4922]: I1122 03:57:40.173794 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a1f43f0-f69f-4aa0-86c8-6a408f4edf53-utilities\") pod \"2a1f43f0-f69f-4aa0-86c8-6a408f4edf53\" (UID: \"2a1f43f0-f69f-4aa0-86c8-6a408f4edf53\") " Nov 22 03:57:40 crc kubenswrapper[4922]: I1122 03:57:40.174697 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a1f43f0-f69f-4aa0-86c8-6a408f4edf53-utilities" (OuterVolumeSpecName: "utilities") pod "2a1f43f0-f69f-4aa0-86c8-6a408f4edf53" (UID: "2a1f43f0-f69f-4aa0-86c8-6a408f4edf53"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:57:40 crc kubenswrapper[4922]: I1122 03:57:40.180501 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a1f43f0-f69f-4aa0-86c8-6a408f4edf53-kube-api-access-kh6gw" (OuterVolumeSpecName: "kube-api-access-kh6gw") pod "2a1f43f0-f69f-4aa0-86c8-6a408f4edf53" (UID: "2a1f43f0-f69f-4aa0-86c8-6a408f4edf53"). InnerVolumeSpecName "kube-api-access-kh6gw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 03:57:40 crc kubenswrapper[4922]: I1122 03:57:40.261619 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a1f43f0-f69f-4aa0-86c8-6a408f4edf53-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2a1f43f0-f69f-4aa0-86c8-6a408f4edf53" (UID: "2a1f43f0-f69f-4aa0-86c8-6a408f4edf53"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 03:57:40 crc kubenswrapper[4922]: I1122 03:57:40.276945 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a1f43f0-f69f-4aa0-86c8-6a408f4edf53-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 03:57:40 crc kubenswrapper[4922]: I1122 03:57:40.276993 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kh6gw\" (UniqueName: \"kubernetes.io/projected/2a1f43f0-f69f-4aa0-86c8-6a408f4edf53-kube-api-access-kh6gw\") on node \"crc\" DevicePath \"\"" Nov 22 03:57:40 crc kubenswrapper[4922]: I1122 03:57:40.277013 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a1f43f0-f69f-4aa0-86c8-6a408f4edf53-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 03:57:40 crc kubenswrapper[4922]: I1122 03:57:40.439081 4922 generic.go:334] "Generic (PLEG): container finished" podID="2a1f43f0-f69f-4aa0-86c8-6a408f4edf53" containerID="2b2078b052b495232a10560eec86f631cf3fd31581eb45a72a112048311472f3" exitCode=0 Nov 22 03:57:40 crc kubenswrapper[4922]: I1122 03:57:40.439137 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2x2rd" Nov 22 03:57:40 crc kubenswrapper[4922]: I1122 03:57:40.439134 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2x2rd" event={"ID":"2a1f43f0-f69f-4aa0-86c8-6a408f4edf53","Type":"ContainerDied","Data":"2b2078b052b495232a10560eec86f631cf3fd31581eb45a72a112048311472f3"} Nov 22 03:57:40 crc kubenswrapper[4922]: I1122 03:57:40.439322 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2x2rd" event={"ID":"2a1f43f0-f69f-4aa0-86c8-6a408f4edf53","Type":"ContainerDied","Data":"25ced1bc856454ad224774841c188a11dbb7038adff3abce052c1c8dbc8f448d"} Nov 22 03:57:40 crc kubenswrapper[4922]: I1122 03:57:40.439360 4922 scope.go:117] "RemoveContainer" containerID="2b2078b052b495232a10560eec86f631cf3fd31581eb45a72a112048311472f3" Nov 22 03:57:40 crc kubenswrapper[4922]: I1122 03:57:40.474614 4922 scope.go:117] "RemoveContainer" containerID="46fe30b335c1a9bac75aeae29bf9e16ad583632d0daf855efbef3583ce1b6b90" Nov 22 03:57:40 crc kubenswrapper[4922]: I1122 03:57:40.480066 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2x2rd"] Nov 22 03:57:40 crc kubenswrapper[4922]: I1122 03:57:40.491259 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2x2rd"] Nov 22 03:57:40 crc kubenswrapper[4922]: I1122 03:57:40.530717 4922 scope.go:117] "RemoveContainer" containerID="98df73ed53f4ad716ee313afd243745b54cadc49f693636a296cfc91133b738a" Nov 22 03:57:40 crc kubenswrapper[4922]: I1122 03:57:40.571221 4922 scope.go:117] "RemoveContainer" containerID="2b2078b052b495232a10560eec86f631cf3fd31581eb45a72a112048311472f3" Nov 22 03:57:40 crc kubenswrapper[4922]: E1122 03:57:40.571766 4922 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b2078b052b495232a10560eec86f631cf3fd31581eb45a72a112048311472f3\": container with ID starting with 2b2078b052b495232a10560eec86f631cf3fd31581eb45a72a112048311472f3 not found: ID does not exist" containerID="2b2078b052b495232a10560eec86f631cf3fd31581eb45a72a112048311472f3" Nov 22 03:57:40 crc kubenswrapper[4922]: I1122 03:57:40.571797 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b2078b052b495232a10560eec86f631cf3fd31581eb45a72a112048311472f3"} err="failed to get container status \"2b2078b052b495232a10560eec86f631cf3fd31581eb45a72a112048311472f3\": rpc error: code = NotFound desc = could not find container \"2b2078b052b495232a10560eec86f631cf3fd31581eb45a72a112048311472f3\": container with ID starting with 2b2078b052b495232a10560eec86f631cf3fd31581eb45a72a112048311472f3 not found: ID does not exist" Nov 22 03:57:40 crc kubenswrapper[4922]: I1122 03:57:40.571817 4922 scope.go:117] "RemoveContainer" containerID="46fe30b335c1a9bac75aeae29bf9e16ad583632d0daf855efbef3583ce1b6b90" Nov 22 03:57:40 crc kubenswrapper[4922]: E1122 03:57:40.572380 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46fe30b335c1a9bac75aeae29bf9e16ad583632d0daf855efbef3583ce1b6b90\": container with ID starting with 46fe30b335c1a9bac75aeae29bf9e16ad583632d0daf855efbef3583ce1b6b90 not found: ID does not exist" containerID="46fe30b335c1a9bac75aeae29bf9e16ad583632d0daf855efbef3583ce1b6b90" Nov 22 03:57:40 crc kubenswrapper[4922]: I1122 03:57:40.572401 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46fe30b335c1a9bac75aeae29bf9e16ad583632d0daf855efbef3583ce1b6b90"} err="failed to get container status \"46fe30b335c1a9bac75aeae29bf9e16ad583632d0daf855efbef3583ce1b6b90\": rpc error: code = NotFound desc = could not find container \"46fe30b335c1a9bac75aeae29bf9e16ad583632d0daf855efbef3583ce1b6b90\": container with ID starting with 46fe30b335c1a9bac75aeae29bf9e16ad583632d0daf855efbef3583ce1b6b90 not found: ID does not exist" Nov 22 03:57:40 crc kubenswrapper[4922]: I1122 03:57:40.572413 4922 scope.go:117] "RemoveContainer" containerID="98df73ed53f4ad716ee313afd243745b54cadc49f693636a296cfc91133b738a" Nov 22 03:57:40 crc kubenswrapper[4922]: E1122 03:57:40.573098 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98df73ed53f4ad716ee313afd243745b54cadc49f693636a296cfc91133b738a\": container with ID starting with 98df73ed53f4ad716ee313afd243745b54cadc49f693636a296cfc91133b738a not found: ID does not exist" containerID="98df73ed53f4ad716ee313afd243745b54cadc49f693636a296cfc91133b738a" Nov 22 03:57:40 crc kubenswrapper[4922]: I1122 03:57:40.573173 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98df73ed53f4ad716ee313afd243745b54cadc49f693636a296cfc91133b738a"} err="failed to get container status \"98df73ed53f4ad716ee313afd243745b54cadc49f693636a296cfc91133b738a\": rpc error: code = NotFound desc = could not find container \"98df73ed53f4ad716ee313afd243745b54cadc49f693636a296cfc91133b738a\": container with ID starting with 98df73ed53f4ad716ee313afd243745b54cadc49f693636a296cfc91133b738a not found: ID does not exist" Nov 22 03:57:41 crc kubenswrapper[4922]: I1122 03:57:41.319609 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="2a1f43f0-f69f-4aa0-86c8-6a408f4edf53" path="/var/lib/kubelet/pods/2a1f43f0-f69f-4aa0-86c8-6a408f4edf53/volumes" Nov 22 03:59:41 crc kubenswrapper[4922]: I1122 03:59:41.109629 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 03:59:41 crc kubenswrapper[4922]: I1122 03:59:41.110256 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:00:00 crc kubenswrapper[4922]: I1122 04:00:00.156391 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396400-gcxw7"] Nov 22 04:00:00 crc kubenswrapper[4922]: E1122 04:00:00.157609 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a1f43f0-f69f-4aa0-86c8-6a408f4edf53" containerName="extract-content" Nov 22 04:00:00 crc kubenswrapper[4922]: I1122 04:00:00.157629 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a1f43f0-f69f-4aa0-86c8-6a408f4edf53" containerName="extract-content" Nov 22 04:00:00 crc kubenswrapper[4922]: E1122 04:00:00.157685 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a1f43f0-f69f-4aa0-86c8-6a408f4edf53" containerName="extract-utilities" Nov 22 04:00:00 crc kubenswrapper[4922]: I1122 04:00:00.157698 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a1f43f0-f69f-4aa0-86c8-6a408f4edf53" containerName="extract-utilities" Nov 22 04:00:00 crc kubenswrapper[4922]: E1122 04:00:00.157718 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a1f43f0-f69f-4aa0-86c8-6a408f4edf53" containerName="registry-server" Nov 22 04:00:00 crc kubenswrapper[4922]: I1122 04:00:00.157731 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a1f43f0-f69f-4aa0-86c8-6a408f4edf53" containerName="registry-server" Nov 22 04:00:00 crc kubenswrapper[4922]: I1122 04:00:00.158105 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a1f43f0-f69f-4aa0-86c8-6a408f4edf53" containerName="registry-server" Nov 22 04:00:00 crc kubenswrapper[4922]: I1122 04:00:00.159088 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-gcxw7" Nov 22 04:00:00 crc kubenswrapper[4922]: I1122 04:00:00.161309 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 22 04:00:00 crc kubenswrapper[4922]: I1122 04:00:00.165065 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 22 04:00:00 crc kubenswrapper[4922]: I1122 04:00:00.181628 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396400-gcxw7"] Nov 22 04:00:00 crc kubenswrapper[4922]: I1122 04:00:00.207690 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40960587-9fe0-4910-8c2c-a5ae8b72cf9b-config-volume\") pod \"collect-profiles-29396400-gcxw7\" (UID: \"40960587-9fe0-4910-8c2c-a5ae8b72cf9b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-gcxw7" Nov 22 04:00:00 crc kubenswrapper[4922]: I1122 04:00:00.207765 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/40960587-9fe0-4910-8c2c-a5ae8b72cf9b-secret-volume\") pod \"collect-profiles-29396400-gcxw7\" (UID: \"40960587-9fe0-4910-8c2c-a5ae8b72cf9b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-gcxw7" Nov 22 04:00:00 crc kubenswrapper[4922]: I1122 04:00:00.208003 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvv4w\" (UniqueName: \"kubernetes.io/projected/40960587-9fe0-4910-8c2c-a5ae8b72cf9b-kube-api-access-pvv4w\") pod \"collect-profiles-29396400-gcxw7\" (UID: \"40960587-9fe0-4910-8c2c-a5ae8b72cf9b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-gcxw7" Nov 22 04:00:00 crc kubenswrapper[4922]: I1122 04:00:00.309670 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvv4w\" (UniqueName: \"kubernetes.io/projected/40960587-9fe0-4910-8c2c-a5ae8b72cf9b-kube-api-access-pvv4w\") pod \"collect-profiles-29396400-gcxw7\" (UID: \"40960587-9fe0-4910-8c2c-a5ae8b72cf9b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-gcxw7" Nov 22 04:00:00 crc kubenswrapper[4922]: I1122 04:00:00.309966 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40960587-9fe0-4910-8c2c-a5ae8b72cf9b-config-volume\") pod \"collect-profiles-29396400-gcxw7\" (UID: \"40960587-9fe0-4910-8c2c-a5ae8b72cf9b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-gcxw7" Nov 22 04:00:00 crc kubenswrapper[4922]: I1122 04:00:00.310001 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/40960587-9fe0-4910-8c2c-a5ae8b72cf9b-secret-volume\") pod \"collect-profiles-29396400-gcxw7\" (UID: \"40960587-9fe0-4910-8c2c-a5ae8b72cf9b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-gcxw7" Nov 22 04:00:00 crc kubenswrapper[4922]: I1122 04:00:00.311201 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40960587-9fe0-4910-8c2c-a5ae8b72cf9b-config-volume\") pod 
\"collect-profiles-29396400-gcxw7\" (UID: \"40960587-9fe0-4910-8c2c-a5ae8b72cf9b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-gcxw7" Nov 22 04:00:00 crc kubenswrapper[4922]: I1122 04:00:00.330678 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/40960587-9fe0-4910-8c2c-a5ae8b72cf9b-secret-volume\") pod \"collect-profiles-29396400-gcxw7\" (UID: \"40960587-9fe0-4910-8c2c-a5ae8b72cf9b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-gcxw7" Nov 22 04:00:00 crc kubenswrapper[4922]: I1122 04:00:00.339936 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvv4w\" (UniqueName: \"kubernetes.io/projected/40960587-9fe0-4910-8c2c-a5ae8b72cf9b-kube-api-access-pvv4w\") pod \"collect-profiles-29396400-gcxw7\" (UID: \"40960587-9fe0-4910-8c2c-a5ae8b72cf9b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-gcxw7" Nov 22 04:00:00 crc kubenswrapper[4922]: I1122 04:00:00.486275 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-gcxw7" Nov 22 04:00:00 crc kubenswrapper[4922]: I1122 04:00:00.981876 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396400-gcxw7"] Nov 22 04:00:01 crc kubenswrapper[4922]: I1122 04:00:01.197181 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-gcxw7" event={"ID":"40960587-9fe0-4910-8c2c-a5ae8b72cf9b","Type":"ContainerStarted","Data":"e9acb1737dc34dc6f3e846a81a0e47fcea3d3e8666d9b01961dc9819b8debaa1"} Nov 22 04:00:01 crc kubenswrapper[4922]: I1122 04:00:01.197247 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-gcxw7" event={"ID":"40960587-9fe0-4910-8c2c-a5ae8b72cf9b","Type":"ContainerStarted","Data":"a97890b9c2bb245ee3864c33aed4077a2519085be75c7a6151c041855b7f0994"} Nov 22 04:00:01 crc kubenswrapper[4922]: I1122 04:00:01.220739 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-gcxw7" podStartSLOduration=1.220721057 podStartE2EDuration="1.220721057s" podCreationTimestamp="2025-11-22 04:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:00:01.212248133 +0000 UTC m=+4037.250770055" watchObservedRunningTime="2025-11-22 04:00:01.220721057 +0000 UTC m=+4037.259242959" Nov 22 04:00:02 crc kubenswrapper[4922]: I1122 04:00:02.213444 4922 generic.go:334] "Generic (PLEG): container finished" podID="40960587-9fe0-4910-8c2c-a5ae8b72cf9b" containerID="e9acb1737dc34dc6f3e846a81a0e47fcea3d3e8666d9b01961dc9819b8debaa1" exitCode=0 Nov 22 04:00:02 crc kubenswrapper[4922]: I1122 04:00:02.213584 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-gcxw7" event={"ID":"40960587-9fe0-4910-8c2c-a5ae8b72cf9b","Type":"ContainerDied","Data":"e9acb1737dc34dc6f3e846a81a0e47fcea3d3e8666d9b01961dc9819b8debaa1"} Nov 22 04:00:03 crc kubenswrapper[4922]: I1122 04:00:03.596318 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-gcxw7" Nov 22 04:00:03 crc kubenswrapper[4922]: I1122 04:00:03.690983 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvv4w\" (UniqueName: \"kubernetes.io/projected/40960587-9fe0-4910-8c2c-a5ae8b72cf9b-kube-api-access-pvv4w\") pod \"40960587-9fe0-4910-8c2c-a5ae8b72cf9b\" (UID: \"40960587-9fe0-4910-8c2c-a5ae8b72cf9b\") " Nov 22 04:00:03 crc kubenswrapper[4922]: I1122 04:00:03.691079 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40960587-9fe0-4910-8c2c-a5ae8b72cf9b-config-volume\") pod \"40960587-9fe0-4910-8c2c-a5ae8b72cf9b\" (UID: \"40960587-9fe0-4910-8c2c-a5ae8b72cf9b\") " Nov 22 04:00:03 crc kubenswrapper[4922]: I1122 04:00:03.691248 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/40960587-9fe0-4910-8c2c-a5ae8b72cf9b-secret-volume\") pod \"40960587-9fe0-4910-8c2c-a5ae8b72cf9b\" (UID: \"40960587-9fe0-4910-8c2c-a5ae8b72cf9b\") " Nov 22 04:00:03 crc kubenswrapper[4922]: I1122 04:00:03.693799 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40960587-9fe0-4910-8c2c-a5ae8b72cf9b-config-volume" (OuterVolumeSpecName: "config-volume") pod "40960587-9fe0-4910-8c2c-a5ae8b72cf9b" (UID: "40960587-9fe0-4910-8c2c-a5ae8b72cf9b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:00:03 crc kubenswrapper[4922]: I1122 04:00:03.698934 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40960587-9fe0-4910-8c2c-a5ae8b72cf9b-kube-api-access-pvv4w" (OuterVolumeSpecName: "kube-api-access-pvv4w") pod "40960587-9fe0-4910-8c2c-a5ae8b72cf9b" (UID: "40960587-9fe0-4910-8c2c-a5ae8b72cf9b"). InnerVolumeSpecName "kube-api-access-pvv4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:00:03 crc kubenswrapper[4922]: I1122 04:00:03.699229 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40960587-9fe0-4910-8c2c-a5ae8b72cf9b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "40960587-9fe0-4910-8c2c-a5ae8b72cf9b" (UID: "40960587-9fe0-4910-8c2c-a5ae8b72cf9b"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:00:03 crc kubenswrapper[4922]: I1122 04:00:03.792620 4922 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/40960587-9fe0-4910-8c2c-a5ae8b72cf9b-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 22 04:00:03 crc kubenswrapper[4922]: I1122 04:00:03.792651 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvv4w\" (UniqueName: \"kubernetes.io/projected/40960587-9fe0-4910-8c2c-a5ae8b72cf9b-kube-api-access-pvv4w\") on node \"crc\" DevicePath \"\"" Nov 22 04:00:03 crc kubenswrapper[4922]: I1122 04:00:03.792661 4922 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40960587-9fe0-4910-8c2c-a5ae8b72cf9b-config-volume\") on node \"crc\" DevicePath \"\"" Nov 22 04:00:04 crc kubenswrapper[4922]: I1122 04:00:04.256318 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-gcxw7" event={"ID":"40960587-9fe0-4910-8c2c-a5ae8b72cf9b","Type":"ContainerDied","Data":"a97890b9c2bb245ee3864c33aed4077a2519085be75c7a6151c041855b7f0994"} Nov 22 04:00:04 crc kubenswrapper[4922]: I1122 04:00:04.256356 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a97890b9c2bb245ee3864c33aed4077a2519085be75c7a6151c041855b7f0994" Nov 22 04:00:04 crc kubenswrapper[4922]: I1122 04:00:04.256414 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-gcxw7" Nov 22 04:00:04 crc kubenswrapper[4922]: I1122 04:00:04.317196 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396355-956ws"] Nov 22 04:00:04 crc kubenswrapper[4922]: I1122 04:00:04.327360 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396355-956ws"] Nov 22 04:00:05 crc kubenswrapper[4922]: I1122 04:00:05.316500 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47eee278-a04a-4e50-98f2-98db1b7caa21" path="/var/lib/kubelet/pods/47eee278-a04a-4e50-98f2-98db1b7caa21/volumes" Nov 22 04:00:08 crc kubenswrapper[4922]: I1122 04:00:08.086725 4922 scope.go:117] "RemoveContainer" containerID="e3523d1601cfec04fa711b7fb5b039fd644edf4780a65d22606723f2d1f94df2" Nov 22 04:00:11 crc kubenswrapper[4922]: I1122 04:00:11.109480 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:00:11 crc kubenswrapper[4922]: I1122 04:00:11.109985 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:00:41 crc kubenswrapper[4922]: I1122 04:00:41.109309 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Nov 22 04:00:41 crc kubenswrapper[4922]: I1122 04:00:41.109978 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:00:41 crc kubenswrapper[4922]: I1122 04:00:41.110025 4922 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" Nov 22 04:00:41 crc kubenswrapper[4922]: I1122 04:00:41.110831 4922 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4f0f1e3813219a10c0f92c69597630f930905fec7c9f7da18bc165f69f875281"} pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 04:00:41 crc kubenswrapper[4922]: I1122 04:00:41.110952 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" containerID="cri-o://4f0f1e3813219a10c0f92c69597630f930905fec7c9f7da18bc165f69f875281" gracePeriod=600 Nov 22 04:00:41 crc kubenswrapper[4922]: I1122 04:00:41.636287 4922 generic.go:334] "Generic (PLEG): container finished" podID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerID="4f0f1e3813219a10c0f92c69597630f930905fec7c9f7da18bc165f69f875281" exitCode=0 Nov 22 04:00:41 crc kubenswrapper[4922]: I1122 04:00:41.636396 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" event={"ID":"402683b1-a29f-4a79-a36c-daf6e8068d0d","Type":"ContainerDied","Data":"4f0f1e3813219a10c0f92c69597630f930905fec7c9f7da18bc165f69f875281"} Nov 22 04:00:41 crc kubenswrapper[4922]: I1122 04:00:41.637315 4922 scope.go:117] "RemoveContainer" containerID="e24886a4c30052c6f5e85a3aef80b4e9d1b9b35cdb2711ab03550c54058c1855" Nov 22 04:00:42 crc kubenswrapper[4922]: I1122 04:00:42.655257 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" event={"ID":"402683b1-a29f-4a79-a36c-daf6e8068d0d","Type":"ContainerStarted","Data":"4163351f257b242e300f88a0b2d981705082c49d726595120484aad5252d4ad2"} Nov 22 04:01:00 crc kubenswrapper[4922]: I1122 04:01:00.176585 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29396401-4fvwk"] Nov 22 04:01:00 crc kubenswrapper[4922]: E1122 04:01:00.178092 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40960587-9fe0-4910-8c2c-a5ae8b72cf9b" containerName="collect-profiles" Nov 22 04:01:00 crc kubenswrapper[4922]: I1122 04:01:00.178119 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="40960587-9fe0-4910-8c2c-a5ae8b72cf9b" containerName="collect-profiles" Nov 22 04:01:00 crc kubenswrapper[4922]: I1122 04:01:00.178395 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="40960587-9fe0-4910-8c2c-a5ae8b72cf9b" containerName="collect-profiles" Nov 22 04:01:00 crc kubenswrapper[4922]: I1122 04:01:00.179304 4922 util.go:30] "No sandbox for pod can be found. 
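The machine-config-daemon records above show a complete liveness cycle: the probe against http://127.0.0.1:8798/health fails with connection refused at 03:59:41, 04:00:11, and 04:00:41 (a 30-second cadence), and on the third consecutive failure the kubelet kills the container (gracePeriod=600) and starts a replacement (4163351f...). That pattern is consistent with periodSeconds=30 and failureThreshold=3, though the actual probe spec is not in this log. A sketch of detecting such streaks from parsed failure timestamps, with the timestamps hard-coded from the log and the threshold an assumption:

    from datetime import datetime, timedelta

    failures = [datetime(2025, 11, 22, 3, 59, 41),
                datetime(2025, 11, 22, 4, 0, 11),
                datetime(2025, 11, 22, 4, 0, 41)]

    period, threshold = timedelta(seconds=30), 3   # assumed probe settings
    streak = 1
    for prev, cur in zip(failures, failures[1:]):
        streak = streak + 1 if cur - prev <= period * 1.5 else 1
    print("restart expected:", streak >= threshold)   # -> True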
Need to start a new one" pod="openstack/keystone-cron-29396401-4fvwk" Nov 22 04:01:00 crc kubenswrapper[4922]: I1122 04:01:00.189728 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29396401-4fvwk"] Nov 22 04:01:00 crc kubenswrapper[4922]: I1122 04:01:00.340764 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dd803d93-aec0-495c-888c-69bd472ee7b6-fernet-keys\") pod \"keystone-cron-29396401-4fvwk\" (UID: \"dd803d93-aec0-495c-888c-69bd472ee7b6\") " pod="openstack/keystone-cron-29396401-4fvwk" Nov 22 04:01:00 crc kubenswrapper[4922]: I1122 04:01:00.341040 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd803d93-aec0-495c-888c-69bd472ee7b6-config-data\") pod \"keystone-cron-29396401-4fvwk\" (UID: \"dd803d93-aec0-495c-888c-69bd472ee7b6\") " pod="openstack/keystone-cron-29396401-4fvwk" Nov 22 04:01:00 crc kubenswrapper[4922]: I1122 04:01:00.341450 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2vtq\" (UniqueName: \"kubernetes.io/projected/dd803d93-aec0-495c-888c-69bd472ee7b6-kube-api-access-d2vtq\") pod \"keystone-cron-29396401-4fvwk\" (UID: \"dd803d93-aec0-495c-888c-69bd472ee7b6\") " pod="openstack/keystone-cron-29396401-4fvwk" Nov 22 04:01:00 crc kubenswrapper[4922]: I1122 04:01:00.341512 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd803d93-aec0-495c-888c-69bd472ee7b6-combined-ca-bundle\") pod \"keystone-cron-29396401-4fvwk\" (UID: \"dd803d93-aec0-495c-888c-69bd472ee7b6\") " pod="openstack/keystone-cron-29396401-4fvwk" Nov 22 04:01:00 crc kubenswrapper[4922]: I1122 04:01:00.444323 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dd803d93-aec0-495c-888c-69bd472ee7b6-fernet-keys\") pod \"keystone-cron-29396401-4fvwk\" (UID: \"dd803d93-aec0-495c-888c-69bd472ee7b6\") " pod="openstack/keystone-cron-29396401-4fvwk" Nov 22 04:01:00 crc kubenswrapper[4922]: I1122 04:01:00.444417 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd803d93-aec0-495c-888c-69bd472ee7b6-config-data\") pod \"keystone-cron-29396401-4fvwk\" (UID: \"dd803d93-aec0-495c-888c-69bd472ee7b6\") " pod="openstack/keystone-cron-29396401-4fvwk" Nov 22 04:01:00 crc kubenswrapper[4922]: I1122 04:01:00.445741 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2vtq\" (UniqueName: \"kubernetes.io/projected/dd803d93-aec0-495c-888c-69bd472ee7b6-kube-api-access-d2vtq\") pod \"keystone-cron-29396401-4fvwk\" (UID: \"dd803d93-aec0-495c-888c-69bd472ee7b6\") " pod="openstack/keystone-cron-29396401-4fvwk" Nov 22 04:01:00 crc kubenswrapper[4922]: I1122 04:01:00.446044 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd803d93-aec0-495c-888c-69bd472ee7b6-combined-ca-bundle\") pod \"keystone-cron-29396401-4fvwk\" (UID: \"dd803d93-aec0-495c-888c-69bd472ee7b6\") " pod="openstack/keystone-cron-29396401-4fvwk" Nov 22 04:01:00 crc kubenswrapper[4922]: I1122 04:01:00.452179 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dd803d93-aec0-495c-888c-69bd472ee7b6-fernet-keys\") pod \"keystone-cron-29396401-4fvwk\" (UID: \"dd803d93-aec0-495c-888c-69bd472ee7b6\") " pod="openstack/keystone-cron-29396401-4fvwk" Nov 22 04:01:00 crc kubenswrapper[4922]: I1122 04:01:00.452693 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd803d93-aec0-495c-888c-69bd472ee7b6-combined-ca-bundle\") pod \"keystone-cron-29396401-4fvwk\" (UID: \"dd803d93-aec0-495c-888c-69bd472ee7b6\") " pod="openstack/keystone-cron-29396401-4fvwk" Nov 22 04:01:00 crc kubenswrapper[4922]: I1122 04:01:00.462360 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2vtq\" (UniqueName: \"kubernetes.io/projected/dd803d93-aec0-495c-888c-69bd472ee7b6-kube-api-access-d2vtq\") pod \"keystone-cron-29396401-4fvwk\" (UID: \"dd803d93-aec0-495c-888c-69bd472ee7b6\") " pod="openstack/keystone-cron-29396401-4fvwk" Nov 22 04:01:00 crc kubenswrapper[4922]: I1122 04:01:00.476205 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd803d93-aec0-495c-888c-69bd472ee7b6-config-data\") pod \"keystone-cron-29396401-4fvwk\" (UID: \"dd803d93-aec0-495c-888c-69bd472ee7b6\") " pod="openstack/keystone-cron-29396401-4fvwk" Nov 22 04:01:00 crc kubenswrapper[4922]: I1122 04:01:00.513279 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29396401-4fvwk" Nov 22 04:01:00 crc kubenswrapper[4922]: I1122 04:01:00.954761 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29396401-4fvwk"] Nov 22 04:01:01 crc kubenswrapper[4922]: I1122 04:01:01.881530 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29396401-4fvwk" event={"ID":"dd803d93-aec0-495c-888c-69bd472ee7b6","Type":"ContainerStarted","Data":"bb82f4a396a5c94f3d5b4d7f9bf95e7e4a9b0619cb427a7ae6386069e19c674f"} Nov 22 04:01:01 crc kubenswrapper[4922]: I1122 04:01:01.882274 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29396401-4fvwk" event={"ID":"dd803d93-aec0-495c-888c-69bd472ee7b6","Type":"ContainerStarted","Data":"1f4a8d69fd04f971d6322d62757a4058ea24f3bbd681ab8a76ce97f920920c0c"} Nov 22 04:01:01 crc kubenswrapper[4922]: I1122 04:01:01.903442 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29396401-4fvwk" podStartSLOduration=1.903405486 podStartE2EDuration="1.903405486s" podCreationTimestamp="2025-11-22 04:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:01:01.897584736 +0000 UTC m=+4097.936106688" watchObservedRunningTime="2025-11-22 04:01:01.903405486 +0000 UTC m=+4097.941927398" Nov 22 04:01:04 crc kubenswrapper[4922]: I1122 04:01:04.912634 4922 generic.go:334] "Generic (PLEG): container finished" podID="dd803d93-aec0-495c-888c-69bd472ee7b6" containerID="bb82f4a396a5c94f3d5b4d7f9bf95e7e4a9b0619cb427a7ae6386069e19c674f" exitCode=0 Nov 22 04:01:04 crc kubenswrapper[4922]: I1122 04:01:04.912762 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29396401-4fvwk" event={"ID":"dd803d93-aec0-495c-888c-69bd472ee7b6","Type":"ContainerDied","Data":"bb82f4a396a5c94f3d5b4d7f9bf95e7e4a9b0619cb427a7ae6386069e19c674f"} Nov 22 04:01:06 crc kubenswrapper[4922]: 
I1122 04:01:06.464297 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29396401-4fvwk" Nov 22 04:01:06 crc kubenswrapper[4922]: I1122 04:01:06.583765 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd803d93-aec0-495c-888c-69bd472ee7b6-config-data\") pod \"dd803d93-aec0-495c-888c-69bd472ee7b6\" (UID: \"dd803d93-aec0-495c-888c-69bd472ee7b6\") " Nov 22 04:01:06 crc kubenswrapper[4922]: I1122 04:01:06.583857 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dd803d93-aec0-495c-888c-69bd472ee7b6-fernet-keys\") pod \"dd803d93-aec0-495c-888c-69bd472ee7b6\" (UID: \"dd803d93-aec0-495c-888c-69bd472ee7b6\") " Nov 22 04:01:06 crc kubenswrapper[4922]: I1122 04:01:06.583924 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2vtq\" (UniqueName: \"kubernetes.io/projected/dd803d93-aec0-495c-888c-69bd472ee7b6-kube-api-access-d2vtq\") pod \"dd803d93-aec0-495c-888c-69bd472ee7b6\" (UID: \"dd803d93-aec0-495c-888c-69bd472ee7b6\") " Nov 22 04:01:06 crc kubenswrapper[4922]: I1122 04:01:06.584068 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd803d93-aec0-495c-888c-69bd472ee7b6-combined-ca-bundle\") pod \"dd803d93-aec0-495c-888c-69bd472ee7b6\" (UID: \"dd803d93-aec0-495c-888c-69bd472ee7b6\") " Nov 22 04:01:06 crc kubenswrapper[4922]: I1122 04:01:06.589681 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd803d93-aec0-495c-888c-69bd472ee7b6-kube-api-access-d2vtq" (OuterVolumeSpecName: "kube-api-access-d2vtq") pod "dd803d93-aec0-495c-888c-69bd472ee7b6" (UID: "dd803d93-aec0-495c-888c-69bd472ee7b6"). InnerVolumeSpecName "kube-api-access-d2vtq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:01:06 crc kubenswrapper[4922]: I1122 04:01:06.601199 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd803d93-aec0-495c-888c-69bd472ee7b6-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "dd803d93-aec0-495c-888c-69bd472ee7b6" (UID: "dd803d93-aec0-495c-888c-69bd472ee7b6"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:01:06 crc kubenswrapper[4922]: I1122 04:01:06.616377 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd803d93-aec0-495c-888c-69bd472ee7b6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd803d93-aec0-495c-888c-69bd472ee7b6" (UID: "dd803d93-aec0-495c-888c-69bd472ee7b6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:01:06 crc kubenswrapper[4922]: I1122 04:01:06.639057 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd803d93-aec0-495c-888c-69bd472ee7b6-config-data" (OuterVolumeSpecName: "config-data") pod "dd803d93-aec0-495c-888c-69bd472ee7b6" (UID: "dd803d93-aec0-495c-888c-69bd472ee7b6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:01:06 crc kubenswrapper[4922]: I1122 04:01:06.686435 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd803d93-aec0-495c-888c-69bd472ee7b6-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:01:06 crc kubenswrapper[4922]: I1122 04:01:06.686867 4922 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dd803d93-aec0-495c-888c-69bd472ee7b6-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 22 04:01:06 crc kubenswrapper[4922]: I1122 04:01:06.686881 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2vtq\" (UniqueName: \"kubernetes.io/projected/dd803d93-aec0-495c-888c-69bd472ee7b6-kube-api-access-d2vtq\") on node \"crc\" DevicePath \"\"" Nov 22 04:01:06 crc kubenswrapper[4922]: I1122 04:01:06.686891 4922 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd803d93-aec0-495c-888c-69bd472ee7b6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:01:06 crc kubenswrapper[4922]: I1122 04:01:06.965516 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29396401-4fvwk" event={"ID":"dd803d93-aec0-495c-888c-69bd472ee7b6","Type":"ContainerDied","Data":"1f4a8d69fd04f971d6322d62757a4058ea24f3bbd681ab8a76ce97f920920c0c"} Nov 22 04:01:06 crc kubenswrapper[4922]: I1122 04:01:06.965574 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f4a8d69fd04f971d6322d62757a4058ea24f3bbd681ab8a76ce97f920920c0c" Nov 22 04:01:06 crc kubenswrapper[4922]: I1122 04:01:06.965671 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29396401-4fvwk" Nov 22 04:02:41 crc kubenswrapper[4922]: I1122 04:02:41.109993 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:02:41 crc kubenswrapper[4922]: I1122 04:02:41.110893 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:03:11 crc kubenswrapper[4922]: I1122 04:03:11.109171 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:03:11 crc kubenswrapper[4922]: I1122 04:03:11.109877 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:03:34 crc kubenswrapper[4922]: I1122 04:03:34.528573 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-v52zk"] Nov 22 04:03:34 crc kubenswrapper[4922]: 
E1122 04:03:34.529784 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd803d93-aec0-495c-888c-69bd472ee7b6" containerName="keystone-cron" Nov 22 04:03:34 crc kubenswrapper[4922]: I1122 04:03:34.529805 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd803d93-aec0-495c-888c-69bd472ee7b6" containerName="keystone-cron" Nov 22 04:03:34 crc kubenswrapper[4922]: I1122 04:03:34.530202 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd803d93-aec0-495c-888c-69bd472ee7b6" containerName="keystone-cron" Nov 22 04:03:34 crc kubenswrapper[4922]: I1122 04:03:34.532481 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v52zk" Nov 22 04:03:34 crc kubenswrapper[4922]: I1122 04:03:34.568281 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v52zk"] Nov 22 04:03:34 crc kubenswrapper[4922]: I1122 04:03:34.652717 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e99fbf0c-6139-4a66-8f59-1b6edb1d095f-catalog-content\") pod \"community-operators-v52zk\" (UID: \"e99fbf0c-6139-4a66-8f59-1b6edb1d095f\") " pod="openshift-marketplace/community-operators-v52zk" Nov 22 04:03:34 crc kubenswrapper[4922]: I1122 04:03:34.652977 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e99fbf0c-6139-4a66-8f59-1b6edb1d095f-utilities\") pod \"community-operators-v52zk\" (UID: \"e99fbf0c-6139-4a66-8f59-1b6edb1d095f\") " pod="openshift-marketplace/community-operators-v52zk" Nov 22 04:03:34 crc kubenswrapper[4922]: I1122 04:03:34.653010 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfpx2\" (UniqueName: \"kubernetes.io/projected/e99fbf0c-6139-4a66-8f59-1b6edb1d095f-kube-api-access-dfpx2\") pod \"community-operators-v52zk\" (UID: \"e99fbf0c-6139-4a66-8f59-1b6edb1d095f\") " pod="openshift-marketplace/community-operators-v52zk" Nov 22 04:03:34 crc kubenswrapper[4922]: I1122 04:03:34.754967 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e99fbf0c-6139-4a66-8f59-1b6edb1d095f-utilities\") pod \"community-operators-v52zk\" (UID: \"e99fbf0c-6139-4a66-8f59-1b6edb1d095f\") " pod="openshift-marketplace/community-operators-v52zk" Nov 22 04:03:34 crc kubenswrapper[4922]: I1122 04:03:34.755036 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfpx2\" (UniqueName: \"kubernetes.io/projected/e99fbf0c-6139-4a66-8f59-1b6edb1d095f-kube-api-access-dfpx2\") pod \"community-operators-v52zk\" (UID: \"e99fbf0c-6139-4a66-8f59-1b6edb1d095f\") " pod="openshift-marketplace/community-operators-v52zk" Nov 22 04:03:34 crc kubenswrapper[4922]: I1122 04:03:34.755098 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e99fbf0c-6139-4a66-8f59-1b6edb1d095f-catalog-content\") pod \"community-operators-v52zk\" (UID: \"e99fbf0c-6139-4a66-8f59-1b6edb1d095f\") " pod="openshift-marketplace/community-operators-v52zk" Nov 22 04:03:34 crc kubenswrapper[4922]: I1122 04:03:34.755796 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/e99fbf0c-6139-4a66-8f59-1b6edb1d095f-catalog-content\") pod \"community-operators-v52zk\" (UID: \"e99fbf0c-6139-4a66-8f59-1b6edb1d095f\") " pod="openshift-marketplace/community-operators-v52zk" Nov 22 04:03:34 crc kubenswrapper[4922]: I1122 04:03:34.756113 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e99fbf0c-6139-4a66-8f59-1b6edb1d095f-utilities\") pod \"community-operators-v52zk\" (UID: \"e99fbf0c-6139-4a66-8f59-1b6edb1d095f\") " pod="openshift-marketplace/community-operators-v52zk" Nov 22 04:03:34 crc kubenswrapper[4922]: I1122 04:03:34.930046 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfpx2\" (UniqueName: \"kubernetes.io/projected/e99fbf0c-6139-4a66-8f59-1b6edb1d095f-kube-api-access-dfpx2\") pod \"community-operators-v52zk\" (UID: \"e99fbf0c-6139-4a66-8f59-1b6edb1d095f\") " pod="openshift-marketplace/community-operators-v52zk" Nov 22 04:03:35 crc kubenswrapper[4922]: I1122 04:03:35.162553 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v52zk" Nov 22 04:03:35 crc kubenswrapper[4922]: I1122 04:03:35.677085 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v52zk"] Nov 22 04:03:35 crc kubenswrapper[4922]: W1122 04:03:35.727371 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode99fbf0c_6139_4a66_8f59_1b6edb1d095f.slice/crio-8a0583a4a77c5d1a9afb2619ec55e6ad207b7f230b70767c59557d2a74a6be18 WatchSource:0}: Error finding container 8a0583a4a77c5d1a9afb2619ec55e6ad207b7f230b70767c59557d2a74a6be18: Status 404 returned error can't find the container with id 8a0583a4a77c5d1a9afb2619ec55e6ad207b7f230b70767c59557d2a74a6be18 Nov 22 04:03:36 crc kubenswrapper[4922]: I1122 04:03:36.638343 4922 generic.go:334] "Generic (PLEG): container finished" podID="e99fbf0c-6139-4a66-8f59-1b6edb1d095f" containerID="375208a8053f951ded64c56621a22cfee82442295ff6d224a3825ff7168e8d34" exitCode=0 Nov 22 04:03:36 crc kubenswrapper[4922]: I1122 04:03:36.638579 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v52zk" event={"ID":"e99fbf0c-6139-4a66-8f59-1b6edb1d095f","Type":"ContainerDied","Data":"375208a8053f951ded64c56621a22cfee82442295ff6d224a3825ff7168e8d34"} Nov 22 04:03:36 crc kubenswrapper[4922]: I1122 04:03:36.638902 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v52zk" event={"ID":"e99fbf0c-6139-4a66-8f59-1b6edb1d095f","Type":"ContainerStarted","Data":"8a0583a4a77c5d1a9afb2619ec55e6ad207b7f230b70767c59557d2a74a6be18"} Nov 22 04:03:36 crc kubenswrapper[4922]: I1122 04:03:36.641964 4922 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 04:03:38 crc kubenswrapper[4922]: I1122 04:03:38.663267 4922 generic.go:334] "Generic (PLEG): container finished" podID="e99fbf0c-6139-4a66-8f59-1b6edb1d095f" containerID="015ef451d129dbf529c19eb02b37eec5395a5c7e80429ce0b28ee7df8fda554d" exitCode=0 Nov 22 04:03:38 crc kubenswrapper[4922]: I1122 04:03:38.663336 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v52zk" event={"ID":"e99fbf0c-6139-4a66-8f59-1b6edb1d095f","Type":"ContainerDied","Data":"015ef451d129dbf529c19eb02b37eec5395a5c7e80429ce0b28ee7df8fda554d"} Nov 22 
04:03:40 crc kubenswrapper[4922]: I1122 04:03:40.684258 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v52zk" event={"ID":"e99fbf0c-6139-4a66-8f59-1b6edb1d095f","Type":"ContainerStarted","Data":"2fdcf7902526dc402243c69706d8db8e28ef39637148e03585a1b7b714150773"} Nov 22 04:03:40 crc kubenswrapper[4922]: I1122 04:03:40.719671 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-v52zk" podStartSLOduration=3.845707487 podStartE2EDuration="6.719641668s" podCreationTimestamp="2025-11-22 04:03:34 +0000 UTC" firstStartedPulling="2025-11-22 04:03:36.641683879 +0000 UTC m=+4252.680205791" lastFinishedPulling="2025-11-22 04:03:39.51561808 +0000 UTC m=+4255.554139972" observedRunningTime="2025-11-22 04:03:40.70269189 +0000 UTC m=+4256.741213782" watchObservedRunningTime="2025-11-22 04:03:40.719641668 +0000 UTC m=+4256.758163600" Nov 22 04:03:41 crc kubenswrapper[4922]: I1122 04:03:41.110177 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:03:41 crc kubenswrapper[4922]: I1122 04:03:41.110553 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:03:41 crc kubenswrapper[4922]: I1122 04:03:41.110595 4922 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" Nov 22 04:03:41 crc kubenswrapper[4922]: I1122 04:03:41.111272 4922 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4163351f257b242e300f88a0b2d981705082c49d726595120484aad5252d4ad2"} pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 04:03:41 crc kubenswrapper[4922]: I1122 04:03:41.111325 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" containerID="cri-o://4163351f257b242e300f88a0b2d981705082c49d726595120484aad5252d4ad2" gracePeriod=600 Nov 22 04:03:41 crc kubenswrapper[4922]: E1122 04:03:41.259214 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 04:03:41 crc kubenswrapper[4922]: I1122 04:03:41.698254 4922 generic.go:334] "Generic (PLEG): container finished" podID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerID="4163351f257b242e300f88a0b2d981705082c49d726595120484aad5252d4ad2" exitCode=0 Nov 22 04:03:41 crc kubenswrapper[4922]: I1122 04:03:41.698342 4922 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" event={"ID":"402683b1-a29f-4a79-a36c-daf6e8068d0d","Type":"ContainerDied","Data":"4163351f257b242e300f88a0b2d981705082c49d726595120484aad5252d4ad2"} Nov 22 04:03:41 crc kubenswrapper[4922]: I1122 04:03:41.698627 4922 scope.go:117] "RemoveContainer" containerID="4f0f1e3813219a10c0f92c69597630f930905fec7c9f7da18bc165f69f875281" Nov 22 04:03:41 crc kubenswrapper[4922]: I1122 04:03:41.699620 4922 scope.go:117] "RemoveContainer" containerID="4163351f257b242e300f88a0b2d981705082c49d726595120484aad5252d4ad2" Nov 22 04:03:41 crc kubenswrapper[4922]: E1122 04:03:41.700184 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 04:03:45 crc kubenswrapper[4922]: I1122 04:03:45.163022 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-v52zk" Nov 22 04:03:45 crc kubenswrapper[4922]: I1122 04:03:45.163934 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-v52zk" Nov 22 04:03:45 crc kubenswrapper[4922]: I1122 04:03:45.231083 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-v52zk" Nov 22 04:03:45 crc kubenswrapper[4922]: I1122 04:03:45.788405 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-v52zk" Nov 22 04:03:45 crc kubenswrapper[4922]: I1122 04:03:45.852160 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-v52zk"] Nov 22 04:03:47 crc kubenswrapper[4922]: I1122 04:03:47.759059 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-v52zk" podUID="e99fbf0c-6139-4a66-8f59-1b6edb1d095f" containerName="registry-server" containerID="cri-o://2fdcf7902526dc402243c69706d8db8e28ef39637148e03585a1b7b714150773" gracePeriod=2 Nov 22 04:03:48 crc kubenswrapper[4922]: I1122 04:03:48.483661 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-v52zk" Nov 22 04:03:48 crc kubenswrapper[4922]: I1122 04:03:48.557065 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e99fbf0c-6139-4a66-8f59-1b6edb1d095f-catalog-content\") pod \"e99fbf0c-6139-4a66-8f59-1b6edb1d095f\" (UID: \"e99fbf0c-6139-4a66-8f59-1b6edb1d095f\") " Nov 22 04:03:48 crc kubenswrapper[4922]: I1122 04:03:48.557249 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e99fbf0c-6139-4a66-8f59-1b6edb1d095f-utilities\") pod \"e99fbf0c-6139-4a66-8f59-1b6edb1d095f\" (UID: \"e99fbf0c-6139-4a66-8f59-1b6edb1d095f\") " Nov 22 04:03:48 crc kubenswrapper[4922]: I1122 04:03:48.557650 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfpx2\" (UniqueName: \"kubernetes.io/projected/e99fbf0c-6139-4a66-8f59-1b6edb1d095f-kube-api-access-dfpx2\") pod \"e99fbf0c-6139-4a66-8f59-1b6edb1d095f\" (UID: \"e99fbf0c-6139-4a66-8f59-1b6edb1d095f\") " Nov 22 04:03:48 crc kubenswrapper[4922]: I1122 04:03:48.568123 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e99fbf0c-6139-4a66-8f59-1b6edb1d095f-kube-api-access-dfpx2" (OuterVolumeSpecName: "kube-api-access-dfpx2") pod "e99fbf0c-6139-4a66-8f59-1b6edb1d095f" (UID: "e99fbf0c-6139-4a66-8f59-1b6edb1d095f"). InnerVolumeSpecName "kube-api-access-dfpx2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:03:48 crc kubenswrapper[4922]: I1122 04:03:48.568979 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e99fbf0c-6139-4a66-8f59-1b6edb1d095f-utilities" (OuterVolumeSpecName: "utilities") pod "e99fbf0c-6139-4a66-8f59-1b6edb1d095f" (UID: "e99fbf0c-6139-4a66-8f59-1b6edb1d095f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:03:48 crc kubenswrapper[4922]: I1122 04:03:48.625282 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e99fbf0c-6139-4a66-8f59-1b6edb1d095f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e99fbf0c-6139-4a66-8f59-1b6edb1d095f" (UID: "e99fbf0c-6139-4a66-8f59-1b6edb1d095f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:03:48 crc kubenswrapper[4922]: I1122 04:03:48.659955 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e99fbf0c-6139-4a66-8f59-1b6edb1d095f-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:03:48 crc kubenswrapper[4922]: I1122 04:03:48.659999 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfpx2\" (UniqueName: \"kubernetes.io/projected/e99fbf0c-6139-4a66-8f59-1b6edb1d095f-kube-api-access-dfpx2\") on node \"crc\" DevicePath \"\"" Nov 22 04:03:48 crc kubenswrapper[4922]: I1122 04:03:48.660014 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e99fbf0c-6139-4a66-8f59-1b6edb1d095f-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:03:48 crc kubenswrapper[4922]: I1122 04:03:48.769956 4922 generic.go:334] "Generic (PLEG): container finished" podID="e99fbf0c-6139-4a66-8f59-1b6edb1d095f" containerID="2fdcf7902526dc402243c69706d8db8e28ef39637148e03585a1b7b714150773" exitCode=0 Nov 22 04:03:48 crc kubenswrapper[4922]: I1122 04:03:48.770173 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v52zk" event={"ID":"e99fbf0c-6139-4a66-8f59-1b6edb1d095f","Type":"ContainerDied","Data":"2fdcf7902526dc402243c69706d8db8e28ef39637148e03585a1b7b714150773"} Nov 22 04:03:48 crc kubenswrapper[4922]: I1122 04:03:48.770553 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v52zk" Nov 22 04:03:48 crc kubenswrapper[4922]: I1122 04:03:48.770572 4922 scope.go:117] "RemoveContainer" containerID="2fdcf7902526dc402243c69706d8db8e28ef39637148e03585a1b7b714150773" Nov 22 04:03:48 crc kubenswrapper[4922]: I1122 04:03:48.770556 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v52zk" event={"ID":"e99fbf0c-6139-4a66-8f59-1b6edb1d095f","Type":"ContainerDied","Data":"8a0583a4a77c5d1a9afb2619ec55e6ad207b7f230b70767c59557d2a74a6be18"} Nov 22 04:03:48 crc kubenswrapper[4922]: I1122 04:03:48.813921 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-v52zk"] Nov 22 04:03:48 crc kubenswrapper[4922]: I1122 04:03:48.822865 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-v52zk"] Nov 22 04:03:48 crc kubenswrapper[4922]: I1122 04:03:48.830344 4922 scope.go:117] "RemoveContainer" containerID="015ef451d129dbf529c19eb02b37eec5395a5c7e80429ce0b28ee7df8fda554d" Nov 22 04:03:48 crc kubenswrapper[4922]: I1122 04:03:48.854662 4922 scope.go:117] "RemoveContainer" containerID="375208a8053f951ded64c56621a22cfee82442295ff6d224a3825ff7168e8d34" Nov 22 04:03:48 crc kubenswrapper[4922]: I1122 04:03:48.899340 4922 scope.go:117] "RemoveContainer" containerID="2fdcf7902526dc402243c69706d8db8e28ef39637148e03585a1b7b714150773" Nov 22 04:03:48 crc kubenswrapper[4922]: E1122 04:03:48.899801 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fdcf7902526dc402243c69706d8db8e28ef39637148e03585a1b7b714150773\": container with ID starting with 2fdcf7902526dc402243c69706d8db8e28ef39637148e03585a1b7b714150773 not found: ID does not exist" containerID="2fdcf7902526dc402243c69706d8db8e28ef39637148e03585a1b7b714150773" Nov 22 04:03:48 crc kubenswrapper[4922]: I1122 04:03:48.899871 
4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fdcf7902526dc402243c69706d8db8e28ef39637148e03585a1b7b714150773"} err="failed to get container status \"2fdcf7902526dc402243c69706d8db8e28ef39637148e03585a1b7b714150773\": rpc error: code = NotFound desc = could not find container \"2fdcf7902526dc402243c69706d8db8e28ef39637148e03585a1b7b714150773\": container with ID starting with 2fdcf7902526dc402243c69706d8db8e28ef39637148e03585a1b7b714150773 not found: ID does not exist" Nov 22 04:03:48 crc kubenswrapper[4922]: I1122 04:03:48.899903 4922 scope.go:117] "RemoveContainer" containerID="015ef451d129dbf529c19eb02b37eec5395a5c7e80429ce0b28ee7df8fda554d" Nov 22 04:03:48 crc kubenswrapper[4922]: E1122 04:03:48.900389 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"015ef451d129dbf529c19eb02b37eec5395a5c7e80429ce0b28ee7df8fda554d\": container with ID starting with 015ef451d129dbf529c19eb02b37eec5395a5c7e80429ce0b28ee7df8fda554d not found: ID does not exist" containerID="015ef451d129dbf529c19eb02b37eec5395a5c7e80429ce0b28ee7df8fda554d" Nov 22 04:03:48 crc kubenswrapper[4922]: I1122 04:03:48.900429 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"015ef451d129dbf529c19eb02b37eec5395a5c7e80429ce0b28ee7df8fda554d"} err="failed to get container status \"015ef451d129dbf529c19eb02b37eec5395a5c7e80429ce0b28ee7df8fda554d\": rpc error: code = NotFound desc = could not find container \"015ef451d129dbf529c19eb02b37eec5395a5c7e80429ce0b28ee7df8fda554d\": container with ID starting with 015ef451d129dbf529c19eb02b37eec5395a5c7e80429ce0b28ee7df8fda554d not found: ID does not exist" Nov 22 04:03:48 crc kubenswrapper[4922]: I1122 04:03:48.900447 4922 scope.go:117] "RemoveContainer" containerID="375208a8053f951ded64c56621a22cfee82442295ff6d224a3825ff7168e8d34" Nov 22 04:03:48 crc kubenswrapper[4922]: E1122 04:03:48.900687 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"375208a8053f951ded64c56621a22cfee82442295ff6d224a3825ff7168e8d34\": container with ID starting with 375208a8053f951ded64c56621a22cfee82442295ff6d224a3825ff7168e8d34 not found: ID does not exist" containerID="375208a8053f951ded64c56621a22cfee82442295ff6d224a3825ff7168e8d34" Nov 22 04:03:48 crc kubenswrapper[4922]: I1122 04:03:48.900716 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"375208a8053f951ded64c56621a22cfee82442295ff6d224a3825ff7168e8d34"} err="failed to get container status \"375208a8053f951ded64c56621a22cfee82442295ff6d224a3825ff7168e8d34\": rpc error: code = NotFound desc = could not find container \"375208a8053f951ded64c56621a22cfee82442295ff6d224a3825ff7168e8d34\": container with ID starting with 375208a8053f951ded64c56621a22cfee82442295ff6d224a3825ff7168e8d34 not found: ID does not exist" Nov 22 04:03:49 crc kubenswrapper[4922]: I1122 04:03:49.318538 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e99fbf0c-6139-4a66-8f59-1b6edb1d095f" path="/var/lib/kubelet/pods/e99fbf0c-6139-4a66-8f59-1b6edb1d095f/volumes" Nov 22 04:03:53 crc kubenswrapper[4922]: I1122 04:03:53.300630 4922 scope.go:117] "RemoveContainer" containerID="4163351f257b242e300f88a0b2d981705082c49d726595120484aad5252d4ad2" Nov 22 04:03:53 crc kubenswrapper[4922]: E1122 04:03:53.302075 4922 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 04:04:06 crc kubenswrapper[4922]: I1122 04:04:06.300566 4922 scope.go:117] "RemoveContainer" containerID="4163351f257b242e300f88a0b2d981705082c49d726595120484aad5252d4ad2" Nov 22 04:04:06 crc kubenswrapper[4922]: E1122 04:04:06.301606 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 04:04:20 crc kubenswrapper[4922]: I1122 04:04:20.301145 4922 scope.go:117] "RemoveContainer" containerID="4163351f257b242e300f88a0b2d981705082c49d726595120484aad5252d4ad2" Nov 22 04:04:20 crc kubenswrapper[4922]: E1122 04:04:20.303684 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 04:04:33 crc kubenswrapper[4922]: I1122 04:04:33.301134 4922 scope.go:117] "RemoveContainer" containerID="4163351f257b242e300f88a0b2d981705082c49d726595120484aad5252d4ad2" Nov 22 04:04:33 crc kubenswrapper[4922]: E1122 04:04:33.302216 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 04:04:45 crc kubenswrapper[4922]: I1122 04:04:45.308719 4922 scope.go:117] "RemoveContainer" containerID="4163351f257b242e300f88a0b2d981705082c49d726595120484aad5252d4ad2" Nov 22 04:04:45 crc kubenswrapper[4922]: E1122 04:04:45.309758 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 04:04:57 crc kubenswrapper[4922]: I1122 04:04:57.300766 4922 scope.go:117] "RemoveContainer" containerID="4163351f257b242e300f88a0b2d981705082c49d726595120484aad5252d4ad2" Nov 22 04:04:57 crc kubenswrapper[4922]: E1122 04:04:57.302363 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 04:05:08 crc kubenswrapper[4922]: I1122 04:05:08.300239 4922 scope.go:117] "RemoveContainer" containerID="4163351f257b242e300f88a0b2d981705082c49d726595120484aad5252d4ad2" Nov 22 04:05:08 crc kubenswrapper[4922]: E1122 04:05:08.301269 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 04:05:11 crc kubenswrapper[4922]: I1122 04:05:11.750518 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-k6b25"] Nov 22 04:05:11 crc kubenswrapper[4922]: E1122 04:05:11.752346 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e99fbf0c-6139-4a66-8f59-1b6edb1d095f" containerName="registry-server" Nov 22 04:05:11 crc kubenswrapper[4922]: I1122 04:05:11.752367 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="e99fbf0c-6139-4a66-8f59-1b6edb1d095f" containerName="registry-server" Nov 22 04:05:11 crc kubenswrapper[4922]: E1122 04:05:11.752395 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e99fbf0c-6139-4a66-8f59-1b6edb1d095f" containerName="extract-utilities" Nov 22 04:05:11 crc kubenswrapper[4922]: I1122 04:05:11.752404 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="e99fbf0c-6139-4a66-8f59-1b6edb1d095f" containerName="extract-utilities" Nov 22 04:05:11 crc kubenswrapper[4922]: E1122 04:05:11.752425 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e99fbf0c-6139-4a66-8f59-1b6edb1d095f" containerName="extract-content" Nov 22 04:05:11 crc kubenswrapper[4922]: I1122 04:05:11.752436 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="e99fbf0c-6139-4a66-8f59-1b6edb1d095f" containerName="extract-content" Nov 22 04:05:11 crc kubenswrapper[4922]: I1122 04:05:11.752706 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="e99fbf0c-6139-4a66-8f59-1b6edb1d095f" containerName="registry-server" Nov 22 04:05:11 crc kubenswrapper[4922]: I1122 04:05:11.756693 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k6b25" Nov 22 04:05:11 crc kubenswrapper[4922]: I1122 04:05:11.787509 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k6b25"] Nov 22 04:05:11 crc kubenswrapper[4922]: I1122 04:05:11.848448 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5rqh\" (UniqueName: \"kubernetes.io/projected/5a00fb19-f517-4a06-9e9d-0f8389c5f686-kube-api-access-p5rqh\") pod \"certified-operators-k6b25\" (UID: \"5a00fb19-f517-4a06-9e9d-0f8389c5f686\") " pod="openshift-marketplace/certified-operators-k6b25" Nov 22 04:05:11 crc kubenswrapper[4922]: I1122 04:05:11.848911 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a00fb19-f517-4a06-9e9d-0f8389c5f686-catalog-content\") pod \"certified-operators-k6b25\" (UID: \"5a00fb19-f517-4a06-9e9d-0f8389c5f686\") " pod="openshift-marketplace/certified-operators-k6b25" Nov 22 04:05:11 crc kubenswrapper[4922]: I1122 04:05:11.848974 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a00fb19-f517-4a06-9e9d-0f8389c5f686-utilities\") pod \"certified-operators-k6b25\" (UID: \"5a00fb19-f517-4a06-9e9d-0f8389c5f686\") " pod="openshift-marketplace/certified-operators-k6b25" Nov 22 04:05:11 crc kubenswrapper[4922]: I1122 04:05:11.951540 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a00fb19-f517-4a06-9e9d-0f8389c5f686-catalog-content\") pod \"certified-operators-k6b25\" (UID: \"5a00fb19-f517-4a06-9e9d-0f8389c5f686\") " pod="openshift-marketplace/certified-operators-k6b25" Nov 22 04:05:11 crc kubenswrapper[4922]: I1122 04:05:11.951644 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a00fb19-f517-4a06-9e9d-0f8389c5f686-utilities\") pod \"certified-operators-k6b25\" (UID: \"5a00fb19-f517-4a06-9e9d-0f8389c5f686\") " pod="openshift-marketplace/certified-operators-k6b25" Nov 22 04:05:11 crc kubenswrapper[4922]: I1122 04:05:11.951699 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5rqh\" (UniqueName: \"kubernetes.io/projected/5a00fb19-f517-4a06-9e9d-0f8389c5f686-kube-api-access-p5rqh\") pod \"certified-operators-k6b25\" (UID: \"5a00fb19-f517-4a06-9e9d-0f8389c5f686\") " pod="openshift-marketplace/certified-operators-k6b25" Nov 22 04:05:11 crc kubenswrapper[4922]: I1122 04:05:11.952348 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a00fb19-f517-4a06-9e9d-0f8389c5f686-catalog-content\") pod \"certified-operators-k6b25\" (UID: \"5a00fb19-f517-4a06-9e9d-0f8389c5f686\") " pod="openshift-marketplace/certified-operators-k6b25" Nov 22 04:05:11 crc kubenswrapper[4922]: I1122 04:05:11.952405 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a00fb19-f517-4a06-9e9d-0f8389c5f686-utilities\") pod \"certified-operators-k6b25\" (UID: \"5a00fb19-f517-4a06-9e9d-0f8389c5f686\") " pod="openshift-marketplace/certified-operators-k6b25" Nov 22 04:05:11 crc kubenswrapper[4922]: I1122 04:05:11.988357 4922 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-p5rqh\" (UniqueName: \"kubernetes.io/projected/5a00fb19-f517-4a06-9e9d-0f8389c5f686-kube-api-access-p5rqh\") pod \"certified-operators-k6b25\" (UID: \"5a00fb19-f517-4a06-9e9d-0f8389c5f686\") " pod="openshift-marketplace/certified-operators-k6b25" Nov 22 04:05:12 crc kubenswrapper[4922]: I1122 04:05:12.099037 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k6b25" Nov 22 04:05:12 crc kubenswrapper[4922]: I1122 04:05:12.586298 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k6b25"] Nov 22 04:05:12 crc kubenswrapper[4922]: W1122 04:05:12.589754 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a00fb19_f517_4a06_9e9d_0f8389c5f686.slice/crio-ec72c92c238219c8a67fb3b634a98c99aa64140274aa1e1a680740c045431b01 WatchSource:0}: Error finding container ec72c92c238219c8a67fb3b634a98c99aa64140274aa1e1a680740c045431b01: Status 404 returned error can't find the container with id ec72c92c238219c8a67fb3b634a98c99aa64140274aa1e1a680740c045431b01 Nov 22 04:05:12 crc kubenswrapper[4922]: I1122 04:05:12.633011 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k6b25" event={"ID":"5a00fb19-f517-4a06-9e9d-0f8389c5f686","Type":"ContainerStarted","Data":"ec72c92c238219c8a67fb3b634a98c99aa64140274aa1e1a680740c045431b01"} Nov 22 04:05:14 crc kubenswrapper[4922]: I1122 04:05:14.650612 4922 generic.go:334] "Generic (PLEG): container finished" podID="5a00fb19-f517-4a06-9e9d-0f8389c5f686" containerID="c07524192c323d131f93d5de9cb721c30b0b48788ed2be3749491948cdc74b9f" exitCode=0 Nov 22 04:05:14 crc kubenswrapper[4922]: I1122 04:05:14.650764 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k6b25" event={"ID":"5a00fb19-f517-4a06-9e9d-0f8389c5f686","Type":"ContainerDied","Data":"c07524192c323d131f93d5de9cb721c30b0b48788ed2be3749491948cdc74b9f"} Nov 22 04:05:16 crc kubenswrapper[4922]: I1122 04:05:16.672249 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k6b25" event={"ID":"5a00fb19-f517-4a06-9e9d-0f8389c5f686","Type":"ContainerStarted","Data":"25715881f716b872034569bd8072d3359987bdab4c986e844651c5a0891a7670"} Nov 22 04:05:19 crc kubenswrapper[4922]: I1122 04:05:19.700961 4922 generic.go:334] "Generic (PLEG): container finished" podID="5a00fb19-f517-4a06-9e9d-0f8389c5f686" containerID="25715881f716b872034569bd8072d3359987bdab4c986e844651c5a0891a7670" exitCode=0 Nov 22 04:05:19 crc kubenswrapper[4922]: I1122 04:05:19.701267 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k6b25" event={"ID":"5a00fb19-f517-4a06-9e9d-0f8389c5f686","Type":"ContainerDied","Data":"25715881f716b872034569bd8072d3359987bdab4c986e844651c5a0891a7670"} Nov 22 04:05:21 crc kubenswrapper[4922]: I1122 04:05:21.300742 4922 scope.go:117] "RemoveContainer" containerID="4163351f257b242e300f88a0b2d981705082c49d726595120484aad5252d4ad2" Nov 22 04:05:21 crc kubenswrapper[4922]: E1122 04:05:21.301459 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 04:05:22 crc kubenswrapper[4922]: I1122 04:05:22.729274 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k6b25" event={"ID":"5a00fb19-f517-4a06-9e9d-0f8389c5f686","Type":"ContainerStarted","Data":"9a4031fe915e12eb1028aed94cbd55f4b32e466ee4966d889630d01d3ca4386d"} Nov 22 04:05:22 crc kubenswrapper[4922]: I1122 04:05:22.754537 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-k6b25" podStartSLOduration=4.998602316 podStartE2EDuration="11.754515909s" podCreationTimestamp="2025-11-22 04:05:11 +0000 UTC" firstStartedPulling="2025-11-22 04:05:14.653325654 +0000 UTC m=+4350.691847556" lastFinishedPulling="2025-11-22 04:05:21.409239257 +0000 UTC m=+4357.447761149" observedRunningTime="2025-11-22 04:05:22.749895092 +0000 UTC m=+4358.788416984" watchObservedRunningTime="2025-11-22 04:05:22.754515909 +0000 UTC m=+4358.793037801" Nov 22 04:05:28 crc kubenswrapper[4922]: I1122 04:05:28.878137 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rvtvz"] Nov 22 04:05:28 crc kubenswrapper[4922]: I1122 04:05:28.881579 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rvtvz" Nov 22 04:05:28 crc kubenswrapper[4922]: I1122 04:05:28.888027 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rvtvz"] Nov 22 04:05:28 crc kubenswrapper[4922]: I1122 04:05:28.929140 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8dv6\" (UniqueName: \"kubernetes.io/projected/56218a5c-c5b9-487b-9aed-fb763bb4dc15-kube-api-access-w8dv6\") pod \"redhat-marketplace-rvtvz\" (UID: \"56218a5c-c5b9-487b-9aed-fb763bb4dc15\") " pod="openshift-marketplace/redhat-marketplace-rvtvz" Nov 22 04:05:28 crc kubenswrapper[4922]: I1122 04:05:28.929581 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56218a5c-c5b9-487b-9aed-fb763bb4dc15-catalog-content\") pod \"redhat-marketplace-rvtvz\" (UID: \"56218a5c-c5b9-487b-9aed-fb763bb4dc15\") " pod="openshift-marketplace/redhat-marketplace-rvtvz" Nov 22 04:05:28 crc kubenswrapper[4922]: I1122 04:05:28.929732 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56218a5c-c5b9-487b-9aed-fb763bb4dc15-utilities\") pod \"redhat-marketplace-rvtvz\" (UID: \"56218a5c-c5b9-487b-9aed-fb763bb4dc15\") " pod="openshift-marketplace/redhat-marketplace-rvtvz" Nov 22 04:05:29 crc kubenswrapper[4922]: I1122 04:05:29.032096 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56218a5c-c5b9-487b-9aed-fb763bb4dc15-utilities\") pod \"redhat-marketplace-rvtvz\" (UID: \"56218a5c-c5b9-487b-9aed-fb763bb4dc15\") " pod="openshift-marketplace/redhat-marketplace-rvtvz" Nov 22 04:05:29 crc kubenswrapper[4922]: I1122 04:05:29.032186 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8dv6\" (UniqueName: 
\"kubernetes.io/projected/56218a5c-c5b9-487b-9aed-fb763bb4dc15-kube-api-access-w8dv6\") pod \"redhat-marketplace-rvtvz\" (UID: \"56218a5c-c5b9-487b-9aed-fb763bb4dc15\") " pod="openshift-marketplace/redhat-marketplace-rvtvz" Nov 22 04:05:29 crc kubenswrapper[4922]: I1122 04:05:29.032243 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56218a5c-c5b9-487b-9aed-fb763bb4dc15-catalog-content\") pod \"redhat-marketplace-rvtvz\" (UID: \"56218a5c-c5b9-487b-9aed-fb763bb4dc15\") " pod="openshift-marketplace/redhat-marketplace-rvtvz" Nov 22 04:05:29 crc kubenswrapper[4922]: I1122 04:05:29.032593 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56218a5c-c5b9-487b-9aed-fb763bb4dc15-utilities\") pod \"redhat-marketplace-rvtvz\" (UID: \"56218a5c-c5b9-487b-9aed-fb763bb4dc15\") " pod="openshift-marketplace/redhat-marketplace-rvtvz" Nov 22 04:05:29 crc kubenswrapper[4922]: I1122 04:05:29.032627 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56218a5c-c5b9-487b-9aed-fb763bb4dc15-catalog-content\") pod \"redhat-marketplace-rvtvz\" (UID: \"56218a5c-c5b9-487b-9aed-fb763bb4dc15\") " pod="openshift-marketplace/redhat-marketplace-rvtvz" Nov 22 04:05:29 crc kubenswrapper[4922]: I1122 04:05:29.056790 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8dv6\" (UniqueName: \"kubernetes.io/projected/56218a5c-c5b9-487b-9aed-fb763bb4dc15-kube-api-access-w8dv6\") pod \"redhat-marketplace-rvtvz\" (UID: \"56218a5c-c5b9-487b-9aed-fb763bb4dc15\") " pod="openshift-marketplace/redhat-marketplace-rvtvz" Nov 22 04:05:29 crc kubenswrapper[4922]: I1122 04:05:29.215688 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rvtvz" Nov 22 04:05:29 crc kubenswrapper[4922]: I1122 04:05:29.722194 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rvtvz"] Nov 22 04:05:29 crc kubenswrapper[4922]: W1122 04:05:29.729454 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56218a5c_c5b9_487b_9aed_fb763bb4dc15.slice/crio-83beb2415301a5a069b4897dcf6ee9c369b2d92faf8215cd0fc4b87f3542de3a WatchSource:0}: Error finding container 83beb2415301a5a069b4897dcf6ee9c369b2d92faf8215cd0fc4b87f3542de3a: Status 404 returned error can't find the container with id 83beb2415301a5a069b4897dcf6ee9c369b2d92faf8215cd0fc4b87f3542de3a Nov 22 04:05:29 crc kubenswrapper[4922]: I1122 04:05:29.799911 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rvtvz" event={"ID":"56218a5c-c5b9-487b-9aed-fb763bb4dc15","Type":"ContainerStarted","Data":"83beb2415301a5a069b4897dcf6ee9c369b2d92faf8215cd0fc4b87f3542de3a"} Nov 22 04:05:30 crc kubenswrapper[4922]: I1122 04:05:30.812473 4922 generic.go:334] "Generic (PLEG): container finished" podID="56218a5c-c5b9-487b-9aed-fb763bb4dc15" containerID="3926b4db3586e52186ccfc19c3cc8d1c85b389a4d152bc39d3bac89fdbaf147c" exitCode=0 Nov 22 04:05:30 crc kubenswrapper[4922]: I1122 04:05:30.812572 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rvtvz" event={"ID":"56218a5c-c5b9-487b-9aed-fb763bb4dc15","Type":"ContainerDied","Data":"3926b4db3586e52186ccfc19c3cc8d1c85b389a4d152bc39d3bac89fdbaf147c"} Nov 22 04:05:32 crc kubenswrapper[4922]: I1122 04:05:32.100056 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-k6b25" Nov 22 04:05:32 crc kubenswrapper[4922]: I1122 04:05:32.100115 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-k6b25" Nov 22 04:05:32 crc kubenswrapper[4922]: I1122 04:05:32.161822 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-k6b25" Nov 22 04:05:32 crc kubenswrapper[4922]: I1122 04:05:32.855927 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rvtvz" event={"ID":"56218a5c-c5b9-487b-9aed-fb763bb4dc15","Type":"ContainerStarted","Data":"3da2a37bf762aba16e312c5307f6976d26fc691dbac75bfa3ea36ae2bd988dd7"} Nov 22 04:05:32 crc kubenswrapper[4922]: I1122 04:05:32.914999 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-k6b25" Nov 22 04:05:33 crc kubenswrapper[4922]: I1122 04:05:33.456356 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k6b25"] Nov 22 04:05:34 crc kubenswrapper[4922]: I1122 04:05:34.300493 4922 scope.go:117] "RemoveContainer" containerID="4163351f257b242e300f88a0b2d981705082c49d726595120484aad5252d4ad2" Nov 22 04:05:34 crc kubenswrapper[4922]: E1122 04:05:34.301107 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 04:05:34 crc kubenswrapper[4922]: I1122 04:05:34.892434 4922 generic.go:334] "Generic (PLEG): container finished" podID="56218a5c-c5b9-487b-9aed-fb763bb4dc15" containerID="3da2a37bf762aba16e312c5307f6976d26fc691dbac75bfa3ea36ae2bd988dd7" exitCode=0 Nov 22 04:05:34 crc kubenswrapper[4922]: I1122 04:05:34.892510 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rvtvz" event={"ID":"56218a5c-c5b9-487b-9aed-fb763bb4dc15","Type":"ContainerDied","Data":"3da2a37bf762aba16e312c5307f6976d26fc691dbac75bfa3ea36ae2bd988dd7"} Nov 22 04:05:34 crc kubenswrapper[4922]: I1122 04:05:34.893044 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-k6b25" podUID="5a00fb19-f517-4a06-9e9d-0f8389c5f686" containerName="registry-server" containerID="cri-o://9a4031fe915e12eb1028aed94cbd55f4b32e466ee4966d889630d01d3ca4386d" gracePeriod=2 Nov 22 04:05:35 crc kubenswrapper[4922]: I1122 04:05:35.872056 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k6b25" Nov 22 04:05:35 crc kubenswrapper[4922]: I1122 04:05:35.913203 4922 generic.go:334] "Generic (PLEG): container finished" podID="5a00fb19-f517-4a06-9e9d-0f8389c5f686" containerID="9a4031fe915e12eb1028aed94cbd55f4b32e466ee4966d889630d01d3ca4386d" exitCode=0 Nov 22 04:05:35 crc kubenswrapper[4922]: I1122 04:05:35.913240 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k6b25" event={"ID":"5a00fb19-f517-4a06-9e9d-0f8389c5f686","Type":"ContainerDied","Data":"9a4031fe915e12eb1028aed94cbd55f4b32e466ee4966d889630d01d3ca4386d"} Nov 22 04:05:35 crc kubenswrapper[4922]: I1122 04:05:35.913265 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k6b25" event={"ID":"5a00fb19-f517-4a06-9e9d-0f8389c5f686","Type":"ContainerDied","Data":"ec72c92c238219c8a67fb3b634a98c99aa64140274aa1e1a680740c045431b01"} Nov 22 04:05:35 crc kubenswrapper[4922]: I1122 04:05:35.913280 4922 scope.go:117] "RemoveContainer" containerID="9a4031fe915e12eb1028aed94cbd55f4b32e466ee4966d889630d01d3ca4386d" Nov 22 04:05:35 crc kubenswrapper[4922]: I1122 04:05:35.913386 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k6b25" Nov 22 04:05:35 crc kubenswrapper[4922]: I1122 04:05:35.941727 4922 scope.go:117] "RemoveContainer" containerID="25715881f716b872034569bd8072d3359987bdab4c986e844651c5a0891a7670" Nov 22 04:05:35 crc kubenswrapper[4922]: I1122 04:05:35.974813 4922 scope.go:117] "RemoveContainer" containerID="c07524192c323d131f93d5de9cb721c30b0b48788ed2be3749491948cdc74b9f" Nov 22 04:05:35 crc kubenswrapper[4922]: I1122 04:05:35.997896 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5rqh\" (UniqueName: \"kubernetes.io/projected/5a00fb19-f517-4a06-9e9d-0f8389c5f686-kube-api-access-p5rqh\") pod \"5a00fb19-f517-4a06-9e9d-0f8389c5f686\" (UID: \"5a00fb19-f517-4a06-9e9d-0f8389c5f686\") " Nov 22 04:05:35 crc kubenswrapper[4922]: I1122 04:05:35.998462 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a00fb19-f517-4a06-9e9d-0f8389c5f686-catalog-content\") pod \"5a00fb19-f517-4a06-9e9d-0f8389c5f686\" (UID: \"5a00fb19-f517-4a06-9e9d-0f8389c5f686\") " Nov 22 04:05:35 crc kubenswrapper[4922]: I1122 04:05:35.998799 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a00fb19-f517-4a06-9e9d-0f8389c5f686-utilities\") pod \"5a00fb19-f517-4a06-9e9d-0f8389c5f686\" (UID: \"5a00fb19-f517-4a06-9e9d-0f8389c5f686\") " Nov 22 04:05:36 crc kubenswrapper[4922]: I1122 04:05:36.001671 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a00fb19-f517-4a06-9e9d-0f8389c5f686-utilities" (OuterVolumeSpecName: "utilities") pod "5a00fb19-f517-4a06-9e9d-0f8389c5f686" (UID: "5a00fb19-f517-4a06-9e9d-0f8389c5f686"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:05:36 crc kubenswrapper[4922]: I1122 04:05:36.006934 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a00fb19-f517-4a06-9e9d-0f8389c5f686-kube-api-access-p5rqh" (OuterVolumeSpecName: "kube-api-access-p5rqh") pod "5a00fb19-f517-4a06-9e9d-0f8389c5f686" (UID: "5a00fb19-f517-4a06-9e9d-0f8389c5f686"). InnerVolumeSpecName "kube-api-access-p5rqh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:05:36 crc kubenswrapper[4922]: I1122 04:05:36.031087 4922 scope.go:117] "RemoveContainer" containerID="9a4031fe915e12eb1028aed94cbd55f4b32e466ee4966d889630d01d3ca4386d" Nov 22 04:05:36 crc kubenswrapper[4922]: E1122 04:05:36.034213 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a4031fe915e12eb1028aed94cbd55f4b32e466ee4966d889630d01d3ca4386d\": container with ID starting with 9a4031fe915e12eb1028aed94cbd55f4b32e466ee4966d889630d01d3ca4386d not found: ID does not exist" containerID="9a4031fe915e12eb1028aed94cbd55f4b32e466ee4966d889630d01d3ca4386d" Nov 22 04:05:36 crc kubenswrapper[4922]: I1122 04:05:36.034242 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a4031fe915e12eb1028aed94cbd55f4b32e466ee4966d889630d01d3ca4386d"} err="failed to get container status \"9a4031fe915e12eb1028aed94cbd55f4b32e466ee4966d889630d01d3ca4386d\": rpc error: code = NotFound desc = could not find container \"9a4031fe915e12eb1028aed94cbd55f4b32e466ee4966d889630d01d3ca4386d\": container with ID starting with 9a4031fe915e12eb1028aed94cbd55f4b32e466ee4966d889630d01d3ca4386d not found: ID does not exist" Nov 22 04:05:36 crc kubenswrapper[4922]: I1122 04:05:36.034270 4922 scope.go:117] "RemoveContainer" containerID="25715881f716b872034569bd8072d3359987bdab4c986e844651c5a0891a7670" Nov 22 04:05:36 crc kubenswrapper[4922]: E1122 04:05:36.034517 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25715881f716b872034569bd8072d3359987bdab4c986e844651c5a0891a7670\": container with ID starting with 25715881f716b872034569bd8072d3359987bdab4c986e844651c5a0891a7670 not found: ID does not exist" containerID="25715881f716b872034569bd8072d3359987bdab4c986e844651c5a0891a7670" Nov 22 04:05:36 crc kubenswrapper[4922]: I1122 04:05:36.034542 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25715881f716b872034569bd8072d3359987bdab4c986e844651c5a0891a7670"} err="failed to get container status \"25715881f716b872034569bd8072d3359987bdab4c986e844651c5a0891a7670\": rpc error: code = NotFound desc = could not find container \"25715881f716b872034569bd8072d3359987bdab4c986e844651c5a0891a7670\": container with ID starting with 25715881f716b872034569bd8072d3359987bdab4c986e844651c5a0891a7670 not found: ID does not exist" Nov 22 04:05:36 crc kubenswrapper[4922]: I1122 04:05:36.034558 4922 scope.go:117] "RemoveContainer" containerID="c07524192c323d131f93d5de9cb721c30b0b48788ed2be3749491948cdc74b9f" Nov 22 04:05:36 crc kubenswrapper[4922]: E1122 04:05:36.035227 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c07524192c323d131f93d5de9cb721c30b0b48788ed2be3749491948cdc74b9f\": container with ID starting with c07524192c323d131f93d5de9cb721c30b0b48788ed2be3749491948cdc74b9f not found: ID does not exist" containerID="c07524192c323d131f93d5de9cb721c30b0b48788ed2be3749491948cdc74b9f" Nov 22 04:05:36 crc kubenswrapper[4922]: I1122 04:05:36.035245 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c07524192c323d131f93d5de9cb721c30b0b48788ed2be3749491948cdc74b9f"} err="failed to get container status \"c07524192c323d131f93d5de9cb721c30b0b48788ed2be3749491948cdc74b9f\": rpc error: code = NotFound desc = could not 
find container \"c07524192c323d131f93d5de9cb721c30b0b48788ed2be3749491948cdc74b9f\": container with ID starting with c07524192c323d131f93d5de9cb721c30b0b48788ed2be3749491948cdc74b9f not found: ID does not exist" Nov 22 04:05:36 crc kubenswrapper[4922]: I1122 04:05:36.060513 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a00fb19-f517-4a06-9e9d-0f8389c5f686-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5a00fb19-f517-4a06-9e9d-0f8389c5f686" (UID: "5a00fb19-f517-4a06-9e9d-0f8389c5f686"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:05:36 crc kubenswrapper[4922]: I1122 04:05:36.102105 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5rqh\" (UniqueName: \"kubernetes.io/projected/5a00fb19-f517-4a06-9e9d-0f8389c5f686-kube-api-access-p5rqh\") on node \"crc\" DevicePath \"\"" Nov 22 04:05:36 crc kubenswrapper[4922]: I1122 04:05:36.102146 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a00fb19-f517-4a06-9e9d-0f8389c5f686-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:05:36 crc kubenswrapper[4922]: I1122 04:05:36.102158 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a00fb19-f517-4a06-9e9d-0f8389c5f686-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:05:36 crc kubenswrapper[4922]: I1122 04:05:36.261802 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k6b25"] Nov 22 04:05:36 crc kubenswrapper[4922]: I1122 04:05:36.271645 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-k6b25"] Nov 22 04:05:36 crc kubenswrapper[4922]: I1122 04:05:36.928555 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rvtvz" event={"ID":"56218a5c-c5b9-487b-9aed-fb763bb4dc15","Type":"ContainerStarted","Data":"47895988e6a592c62f52d02869212dd9f82c33761eb6f1e1ff0518197c8d646c"} Nov 22 04:05:36 crc kubenswrapper[4922]: I1122 04:05:36.964420 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rvtvz" podStartSLOduration=4.129879106 podStartE2EDuration="8.964390374s" podCreationTimestamp="2025-11-22 04:05:28 +0000 UTC" firstStartedPulling="2025-11-22 04:05:30.81548149 +0000 UTC m=+4366.854003382" lastFinishedPulling="2025-11-22 04:05:35.649992748 +0000 UTC m=+4371.688514650" observedRunningTime="2025-11-22 04:05:36.952264292 +0000 UTC m=+4372.990786184" watchObservedRunningTime="2025-11-22 04:05:36.964390374 +0000 UTC m=+4373.002912276" Nov 22 04:05:37 crc kubenswrapper[4922]: I1122 04:05:37.319482 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a00fb19-f517-4a06-9e9d-0f8389c5f686" path="/var/lib/kubelet/pods/5a00fb19-f517-4a06-9e9d-0f8389c5f686/volumes" Nov 22 04:05:39 crc kubenswrapper[4922]: I1122 04:05:39.215998 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rvtvz" Nov 22 04:05:39 crc kubenswrapper[4922]: I1122 04:05:39.216301 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rvtvz" Nov 22 04:05:39 crc kubenswrapper[4922]: I1122 04:05:39.875720 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-rvtvz" Nov 22 04:05:47 crc kubenswrapper[4922]: I1122 04:05:47.303689 4922 scope.go:117] "RemoveContainer" containerID="4163351f257b242e300f88a0b2d981705082c49d726595120484aad5252d4ad2" Nov 22 04:05:47 crc kubenswrapper[4922]: E1122 04:05:47.304735 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 04:05:49 crc kubenswrapper[4922]: I1122 04:05:49.298998 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rvtvz" Nov 22 04:05:49 crc kubenswrapper[4922]: I1122 04:05:49.368163 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rvtvz"] Nov 22 04:05:50 crc kubenswrapper[4922]: I1122 04:05:50.075212 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rvtvz" podUID="56218a5c-c5b9-487b-9aed-fb763bb4dc15" containerName="registry-server" containerID="cri-o://47895988e6a592c62f52d02869212dd9f82c33761eb6f1e1ff0518197c8d646c" gracePeriod=2 Nov 22 04:05:51 crc kubenswrapper[4922]: I1122 04:05:51.091314 4922 generic.go:334] "Generic (PLEG): container finished" podID="56218a5c-c5b9-487b-9aed-fb763bb4dc15" containerID="47895988e6a592c62f52d02869212dd9f82c33761eb6f1e1ff0518197c8d646c" exitCode=0 Nov 22 04:05:51 crc kubenswrapper[4922]: I1122 04:05:51.091409 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rvtvz" event={"ID":"56218a5c-c5b9-487b-9aed-fb763bb4dc15","Type":"ContainerDied","Data":"47895988e6a592c62f52d02869212dd9f82c33761eb6f1e1ff0518197c8d646c"} Nov 22 04:05:51 crc kubenswrapper[4922]: I1122 04:05:51.874396 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rvtvz" Nov 22 04:05:51 crc kubenswrapper[4922]: I1122 04:05:51.952831 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8dv6\" (UniqueName: \"kubernetes.io/projected/56218a5c-c5b9-487b-9aed-fb763bb4dc15-kube-api-access-w8dv6\") pod \"56218a5c-c5b9-487b-9aed-fb763bb4dc15\" (UID: \"56218a5c-c5b9-487b-9aed-fb763bb4dc15\") " Nov 22 04:05:51 crc kubenswrapper[4922]: I1122 04:05:51.953050 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56218a5c-c5b9-487b-9aed-fb763bb4dc15-catalog-content\") pod \"56218a5c-c5b9-487b-9aed-fb763bb4dc15\" (UID: \"56218a5c-c5b9-487b-9aed-fb763bb4dc15\") " Nov 22 04:05:51 crc kubenswrapper[4922]: I1122 04:05:51.953202 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56218a5c-c5b9-487b-9aed-fb763bb4dc15-utilities\") pod \"56218a5c-c5b9-487b-9aed-fb763bb4dc15\" (UID: \"56218a5c-c5b9-487b-9aed-fb763bb4dc15\") " Nov 22 04:05:51 crc kubenswrapper[4922]: I1122 04:05:51.954603 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56218a5c-c5b9-487b-9aed-fb763bb4dc15-utilities" (OuterVolumeSpecName: "utilities") pod "56218a5c-c5b9-487b-9aed-fb763bb4dc15" (UID: "56218a5c-c5b9-487b-9aed-fb763bb4dc15"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:05:51 crc kubenswrapper[4922]: I1122 04:05:51.960818 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56218a5c-c5b9-487b-9aed-fb763bb4dc15-kube-api-access-w8dv6" (OuterVolumeSpecName: "kube-api-access-w8dv6") pod "56218a5c-c5b9-487b-9aed-fb763bb4dc15" (UID: "56218a5c-c5b9-487b-9aed-fb763bb4dc15"). InnerVolumeSpecName "kube-api-access-w8dv6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:05:51 crc kubenswrapper[4922]: I1122 04:05:51.990570 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56218a5c-c5b9-487b-9aed-fb763bb4dc15-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "56218a5c-c5b9-487b-9aed-fb763bb4dc15" (UID: "56218a5c-c5b9-487b-9aed-fb763bb4dc15"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:05:52 crc kubenswrapper[4922]: I1122 04:05:52.056193 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8dv6\" (UniqueName: \"kubernetes.io/projected/56218a5c-c5b9-487b-9aed-fb763bb4dc15-kube-api-access-w8dv6\") on node \"crc\" DevicePath \"\"" Nov 22 04:05:52 crc kubenswrapper[4922]: I1122 04:05:52.056265 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56218a5c-c5b9-487b-9aed-fb763bb4dc15-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:05:52 crc kubenswrapper[4922]: I1122 04:05:52.056280 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56218a5c-c5b9-487b-9aed-fb763bb4dc15-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:05:52 crc kubenswrapper[4922]: I1122 04:05:52.103881 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rvtvz" event={"ID":"56218a5c-c5b9-487b-9aed-fb763bb4dc15","Type":"ContainerDied","Data":"83beb2415301a5a069b4897dcf6ee9c369b2d92faf8215cd0fc4b87f3542de3a"} Nov 22 04:05:52 crc kubenswrapper[4922]: I1122 04:05:52.103938 4922 scope.go:117] "RemoveContainer" containerID="47895988e6a592c62f52d02869212dd9f82c33761eb6f1e1ff0518197c8d646c" Nov 22 04:05:52 crc kubenswrapper[4922]: I1122 04:05:52.103958 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rvtvz" Nov 22 04:05:52 crc kubenswrapper[4922]: I1122 04:05:52.134770 4922 scope.go:117] "RemoveContainer" containerID="3da2a37bf762aba16e312c5307f6976d26fc691dbac75bfa3ea36ae2bd988dd7" Nov 22 04:05:52 crc kubenswrapper[4922]: I1122 04:05:52.153473 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rvtvz"] Nov 22 04:05:52 crc kubenswrapper[4922]: I1122 04:05:52.170296 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rvtvz"] Nov 22 04:05:52 crc kubenswrapper[4922]: I1122 04:05:52.233167 4922 scope.go:117] "RemoveContainer" containerID="3926b4db3586e52186ccfc19c3cc8d1c85b389a4d152bc39d3bac89fdbaf147c" Nov 22 04:05:53 crc kubenswrapper[4922]: I1122 04:05:53.314360 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56218a5c-c5b9-487b-9aed-fb763bb4dc15" path="/var/lib/kubelet/pods/56218a5c-c5b9-487b-9aed-fb763bb4dc15/volumes" Nov 22 04:06:01 crc kubenswrapper[4922]: I1122 04:06:01.302706 4922 scope.go:117] "RemoveContainer" containerID="4163351f257b242e300f88a0b2d981705082c49d726595120484aad5252d4ad2" Nov 22 04:06:01 crc kubenswrapper[4922]: E1122 04:06:01.304197 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 04:06:16 crc kubenswrapper[4922]: I1122 04:06:16.300375 4922 scope.go:117] "RemoveContainer" containerID="4163351f257b242e300f88a0b2d981705082c49d726595120484aad5252d4ad2" Nov 22 04:06:16 crc kubenswrapper[4922]: E1122 04:06:16.301261 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 04:06:30 crc kubenswrapper[4922]: I1122 04:06:30.301051 4922 scope.go:117] "RemoveContainer" containerID="4163351f257b242e300f88a0b2d981705082c49d726595120484aad5252d4ad2" Nov 22 04:06:30 crc kubenswrapper[4922]: E1122 04:06:30.302276 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 04:06:43 crc kubenswrapper[4922]: I1122 04:06:43.300453 4922 scope.go:117] "RemoveContainer" containerID="4163351f257b242e300f88a0b2d981705082c49d726595120484aad5252d4ad2" Nov 22 04:06:43 crc kubenswrapper[4922]: E1122 04:06:43.301122 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 04:06:56 crc kubenswrapper[4922]: I1122 04:06:56.301097 4922 scope.go:117] "RemoveContainer" containerID="4163351f257b242e300f88a0b2d981705082c49d726595120484aad5252d4ad2" Nov 22 04:06:56 crc kubenswrapper[4922]: E1122 04:06:56.302251 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 04:07:10 crc kubenswrapper[4922]: I1122 04:07:10.301568 4922 scope.go:117] "RemoveContainer" containerID="4163351f257b242e300f88a0b2d981705082c49d726595120484aad5252d4ad2" Nov 22 04:07:10 crc kubenswrapper[4922]: E1122 04:07:10.302916 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 04:07:25 crc kubenswrapper[4922]: I1122 04:07:25.337280 4922 scope.go:117] "RemoveContainer" containerID="4163351f257b242e300f88a0b2d981705082c49d726595120484aad5252d4ad2" Nov 22 04:07:25 crc kubenswrapper[4922]: E1122 04:07:25.340402 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 04:07:41 crc kubenswrapper[4922]: I1122 04:07:41.301631 4922 scope.go:117] "RemoveContainer" containerID="4163351f257b242e300f88a0b2d981705082c49d726595120484aad5252d4ad2" Nov 22 04:07:41 crc kubenswrapper[4922]: E1122 04:07:41.303023 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 04:07:53 crc kubenswrapper[4922]: I1122 04:07:53.300630 4922 scope.go:117] "RemoveContainer" containerID="4163351f257b242e300f88a0b2d981705082c49d726595120484aad5252d4ad2" Nov 22 04:07:53 crc kubenswrapper[4922]: E1122 04:07:53.301926 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 04:08:06 crc kubenswrapper[4922]: I1122 04:08:06.301321 4922 scope.go:117] "RemoveContainer" containerID="4163351f257b242e300f88a0b2d981705082c49d726595120484aad5252d4ad2" Nov 22 04:08:06 crc kubenswrapper[4922]: E1122 04:08:06.302529 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 04:08:19 crc kubenswrapper[4922]: I1122 04:08:19.301932 4922 scope.go:117] "RemoveContainer" containerID="4163351f257b242e300f88a0b2d981705082c49d726595120484aad5252d4ad2" Nov 22 04:08:19 crc kubenswrapper[4922]: E1122 04:08:19.303038 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 04:08:32 crc kubenswrapper[4922]: I1122 04:08:32.301477 4922 scope.go:117] "RemoveContainer" containerID="4163351f257b242e300f88a0b2d981705082c49d726595120484aad5252d4ad2" Nov 22 04:08:32 crc kubenswrapper[4922]: E1122 04:08:32.302458 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 04:08:43 crc kubenswrapper[4922]: I1122 04:08:43.300741 4922 
scope.go:117] "RemoveContainer" containerID="4163351f257b242e300f88a0b2d981705082c49d726595120484aad5252d4ad2" Nov 22 04:08:43 crc kubenswrapper[4922]: I1122 04:08:43.893562 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" event={"ID":"402683b1-a29f-4a79-a36c-daf6e8068d0d","Type":"ContainerStarted","Data":"ecf9aaa0256d3bc955d91a21b7f89502668827fabefcf5955efb01bf6dd1a0ec"} Nov 22 04:10:29 crc kubenswrapper[4922]: I1122 04:10:29.496373 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xvnj5"] Nov 22 04:10:29 crc kubenswrapper[4922]: E1122 04:10:29.499882 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a00fb19-f517-4a06-9e9d-0f8389c5f686" containerName="registry-server" Nov 22 04:10:29 crc kubenswrapper[4922]: I1122 04:10:29.499907 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a00fb19-f517-4a06-9e9d-0f8389c5f686" containerName="registry-server" Nov 22 04:10:29 crc kubenswrapper[4922]: E1122 04:10:29.499947 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56218a5c-c5b9-487b-9aed-fb763bb4dc15" containerName="extract-utilities" Nov 22 04:10:29 crc kubenswrapper[4922]: I1122 04:10:29.499960 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="56218a5c-c5b9-487b-9aed-fb763bb4dc15" containerName="extract-utilities" Nov 22 04:10:29 crc kubenswrapper[4922]: E1122 04:10:29.499986 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56218a5c-c5b9-487b-9aed-fb763bb4dc15" containerName="extract-content" Nov 22 04:10:29 crc kubenswrapper[4922]: I1122 04:10:29.499999 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="56218a5c-c5b9-487b-9aed-fb763bb4dc15" containerName="extract-content" Nov 22 04:10:29 crc kubenswrapper[4922]: E1122 04:10:29.500015 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56218a5c-c5b9-487b-9aed-fb763bb4dc15" containerName="registry-server" Nov 22 04:10:29 crc kubenswrapper[4922]: I1122 04:10:29.500030 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="56218a5c-c5b9-487b-9aed-fb763bb4dc15" containerName="registry-server" Nov 22 04:10:29 crc kubenswrapper[4922]: E1122 04:10:29.500048 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a00fb19-f517-4a06-9e9d-0f8389c5f686" containerName="extract-content" Nov 22 04:10:29 crc kubenswrapper[4922]: I1122 04:10:29.500062 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a00fb19-f517-4a06-9e9d-0f8389c5f686" containerName="extract-content" Nov 22 04:10:29 crc kubenswrapper[4922]: E1122 04:10:29.500092 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a00fb19-f517-4a06-9e9d-0f8389c5f686" containerName="extract-utilities" Nov 22 04:10:29 crc kubenswrapper[4922]: I1122 04:10:29.500103 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a00fb19-f517-4a06-9e9d-0f8389c5f686" containerName="extract-utilities" Nov 22 04:10:29 crc kubenswrapper[4922]: I1122 04:10:29.500462 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="56218a5c-c5b9-487b-9aed-fb763bb4dc15" containerName="registry-server" Nov 22 04:10:29 crc kubenswrapper[4922]: I1122 04:10:29.500500 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a00fb19-f517-4a06-9e9d-0f8389c5f686" containerName="registry-server" Nov 22 04:10:29 crc kubenswrapper[4922]: I1122 04:10:29.503047 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xvnj5" Nov 22 04:10:29 crc kubenswrapper[4922]: I1122 04:10:29.537601 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xvnj5"] Nov 22 04:10:29 crc kubenswrapper[4922]: I1122 04:10:29.669106 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svrsl\" (UniqueName: \"kubernetes.io/projected/34361dc1-d28e-4f60-a6ac-a4f60d9b176b-kube-api-access-svrsl\") pod \"redhat-operators-xvnj5\" (UID: \"34361dc1-d28e-4f60-a6ac-a4f60d9b176b\") " pod="openshift-marketplace/redhat-operators-xvnj5" Nov 22 04:10:29 crc kubenswrapper[4922]: I1122 04:10:29.669415 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34361dc1-d28e-4f60-a6ac-a4f60d9b176b-utilities\") pod \"redhat-operators-xvnj5\" (UID: \"34361dc1-d28e-4f60-a6ac-a4f60d9b176b\") " pod="openshift-marketplace/redhat-operators-xvnj5" Nov 22 04:10:29 crc kubenswrapper[4922]: I1122 04:10:29.669521 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34361dc1-d28e-4f60-a6ac-a4f60d9b176b-catalog-content\") pod \"redhat-operators-xvnj5\" (UID: \"34361dc1-d28e-4f60-a6ac-a4f60d9b176b\") " pod="openshift-marketplace/redhat-operators-xvnj5" Nov 22 04:10:29 crc kubenswrapper[4922]: I1122 04:10:29.771723 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34361dc1-d28e-4f60-a6ac-a4f60d9b176b-utilities\") pod \"redhat-operators-xvnj5\" (UID: \"34361dc1-d28e-4f60-a6ac-a4f60d9b176b\") " pod="openshift-marketplace/redhat-operators-xvnj5" Nov 22 04:10:29 crc kubenswrapper[4922]: I1122 04:10:29.771892 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34361dc1-d28e-4f60-a6ac-a4f60d9b176b-catalog-content\") pod \"redhat-operators-xvnj5\" (UID: \"34361dc1-d28e-4f60-a6ac-a4f60d9b176b\") " pod="openshift-marketplace/redhat-operators-xvnj5" Nov 22 04:10:29 crc kubenswrapper[4922]: I1122 04:10:29.772004 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svrsl\" (UniqueName: \"kubernetes.io/projected/34361dc1-d28e-4f60-a6ac-a4f60d9b176b-kube-api-access-svrsl\") pod \"redhat-operators-xvnj5\" (UID: \"34361dc1-d28e-4f60-a6ac-a4f60d9b176b\") " pod="openshift-marketplace/redhat-operators-xvnj5" Nov 22 04:10:29 crc kubenswrapper[4922]: I1122 04:10:29.772953 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34361dc1-d28e-4f60-a6ac-a4f60d9b176b-utilities\") pod \"redhat-operators-xvnj5\" (UID: \"34361dc1-d28e-4f60-a6ac-a4f60d9b176b\") " pod="openshift-marketplace/redhat-operators-xvnj5" Nov 22 04:10:29 crc kubenswrapper[4922]: I1122 04:10:29.773246 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34361dc1-d28e-4f60-a6ac-a4f60d9b176b-catalog-content\") pod \"redhat-operators-xvnj5\" (UID: \"34361dc1-d28e-4f60-a6ac-a4f60d9b176b\") " pod="openshift-marketplace/redhat-operators-xvnj5" Nov 22 04:10:29 crc kubenswrapper[4922]: I1122 04:10:29.802464 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-svrsl\" (UniqueName: \"kubernetes.io/projected/34361dc1-d28e-4f60-a6ac-a4f60d9b176b-kube-api-access-svrsl\") pod \"redhat-operators-xvnj5\" (UID: \"34361dc1-d28e-4f60-a6ac-a4f60d9b176b\") " pod="openshift-marketplace/redhat-operators-xvnj5" Nov 22 04:10:29 crc kubenswrapper[4922]: I1122 04:10:29.824670 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xvnj5" Nov 22 04:10:30 crc kubenswrapper[4922]: I1122 04:10:30.362669 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xvnj5"] Nov 22 04:10:30 crc kubenswrapper[4922]: I1122 04:10:30.949707 4922 generic.go:334] "Generic (PLEG): container finished" podID="34361dc1-d28e-4f60-a6ac-a4f60d9b176b" containerID="8edc9308bc892d07fe240707d0ab3a075e0929e11700542171d3937ca208383c" exitCode=0 Nov 22 04:10:30 crc kubenswrapper[4922]: I1122 04:10:30.949774 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xvnj5" event={"ID":"34361dc1-d28e-4f60-a6ac-a4f60d9b176b","Type":"ContainerDied","Data":"8edc9308bc892d07fe240707d0ab3a075e0929e11700542171d3937ca208383c"} Nov 22 04:10:30 crc kubenswrapper[4922]: I1122 04:10:30.950040 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xvnj5" event={"ID":"34361dc1-d28e-4f60-a6ac-a4f60d9b176b","Type":"ContainerStarted","Data":"0432d527dba6e262d2f1a106f9a871b00b4dca4373f4eefdf2cf209d1d2b42a6"} Nov 22 04:10:30 crc kubenswrapper[4922]: I1122 04:10:30.952013 4922 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 04:10:31 crc kubenswrapper[4922]: I1122 04:10:31.964291 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xvnj5" event={"ID":"34361dc1-d28e-4f60-a6ac-a4f60d9b176b","Type":"ContainerStarted","Data":"1ad512e526dcf6c8d4381e5ec30c10538127e9e049f14db1370dae85fef3d67c"} Nov 22 04:10:36 crc kubenswrapper[4922]: I1122 04:10:36.011026 4922 generic.go:334] "Generic (PLEG): container finished" podID="34361dc1-d28e-4f60-a6ac-a4f60d9b176b" containerID="1ad512e526dcf6c8d4381e5ec30c10538127e9e049f14db1370dae85fef3d67c" exitCode=0 Nov 22 04:10:36 crc kubenswrapper[4922]: I1122 04:10:36.011811 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xvnj5" event={"ID":"34361dc1-d28e-4f60-a6ac-a4f60d9b176b","Type":"ContainerDied","Data":"1ad512e526dcf6c8d4381e5ec30c10538127e9e049f14db1370dae85fef3d67c"} Nov 22 04:10:37 crc kubenswrapper[4922]: I1122 04:10:37.020267 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xvnj5" event={"ID":"34361dc1-d28e-4f60-a6ac-a4f60d9b176b","Type":"ContainerStarted","Data":"6116bb1a38989849013e650b7875f2ed4253531c62e6bc03c290877398606d3b"} Nov 22 04:10:37 crc kubenswrapper[4922]: I1122 04:10:37.040463 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xvnj5" podStartSLOduration=2.57127022 podStartE2EDuration="8.040447017s" podCreationTimestamp="2025-11-22 04:10:29 +0000 UTC" firstStartedPulling="2025-11-22 04:10:30.9517783 +0000 UTC m=+4666.990300192" lastFinishedPulling="2025-11-22 04:10:36.420955057 +0000 UTC m=+4672.459476989" observedRunningTime="2025-11-22 04:10:37.035460721 +0000 UTC m=+4673.073982623" watchObservedRunningTime="2025-11-22 04:10:37.040447017 +0000 UTC m=+4673.078968909" Nov 22 04:10:39 crc 
kubenswrapper[4922]: I1122 04:10:39.824993 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xvnj5" Nov 22 04:10:39 crc kubenswrapper[4922]: I1122 04:10:39.825521 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xvnj5" Nov 22 04:10:40 crc kubenswrapper[4922]: I1122 04:10:40.869315 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xvnj5" podUID="34361dc1-d28e-4f60-a6ac-a4f60d9b176b" containerName="registry-server" probeResult="failure" output=< Nov 22 04:10:40 crc kubenswrapper[4922]: timeout: failed to connect service ":50051" within 1s Nov 22 04:10:40 crc kubenswrapper[4922]: > Nov 22 04:10:49 crc kubenswrapper[4922]: I1122 04:10:49.872959 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xvnj5" Nov 22 04:10:49 crc kubenswrapper[4922]: I1122 04:10:49.923015 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xvnj5" Nov 22 04:10:50 crc kubenswrapper[4922]: I1122 04:10:50.121901 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xvnj5"] Nov 22 04:10:51 crc kubenswrapper[4922]: I1122 04:10:51.153455 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xvnj5" podUID="34361dc1-d28e-4f60-a6ac-a4f60d9b176b" containerName="registry-server" containerID="cri-o://6116bb1a38989849013e650b7875f2ed4253531c62e6bc03c290877398606d3b" gracePeriod=2 Nov 22 04:10:51 crc kubenswrapper[4922]: I1122 04:10:51.792881 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xvnj5" Nov 22 04:10:51 crc kubenswrapper[4922]: I1122 04:10:51.852070 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svrsl\" (UniqueName: \"kubernetes.io/projected/34361dc1-d28e-4f60-a6ac-a4f60d9b176b-kube-api-access-svrsl\") pod \"34361dc1-d28e-4f60-a6ac-a4f60d9b176b\" (UID: \"34361dc1-d28e-4f60-a6ac-a4f60d9b176b\") " Nov 22 04:10:51 crc kubenswrapper[4922]: I1122 04:10:51.852184 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34361dc1-d28e-4f60-a6ac-a4f60d9b176b-catalog-content\") pod \"34361dc1-d28e-4f60-a6ac-a4f60d9b176b\" (UID: \"34361dc1-d28e-4f60-a6ac-a4f60d9b176b\") " Nov 22 04:10:51 crc kubenswrapper[4922]: I1122 04:10:51.852256 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34361dc1-d28e-4f60-a6ac-a4f60d9b176b-utilities\") pod \"34361dc1-d28e-4f60-a6ac-a4f60d9b176b\" (UID: \"34361dc1-d28e-4f60-a6ac-a4f60d9b176b\") " Nov 22 04:10:51 crc kubenswrapper[4922]: I1122 04:10:51.853433 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34361dc1-d28e-4f60-a6ac-a4f60d9b176b-utilities" (OuterVolumeSpecName: "utilities") pod "34361dc1-d28e-4f60-a6ac-a4f60d9b176b" (UID: "34361dc1-d28e-4f60-a6ac-a4f60d9b176b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:10:51 crc kubenswrapper[4922]: I1122 04:10:51.859304 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34361dc1-d28e-4f60-a6ac-a4f60d9b176b-kube-api-access-svrsl" (OuterVolumeSpecName: "kube-api-access-svrsl") pod "34361dc1-d28e-4f60-a6ac-a4f60d9b176b" (UID: "34361dc1-d28e-4f60-a6ac-a4f60d9b176b"). InnerVolumeSpecName "kube-api-access-svrsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:10:51 crc kubenswrapper[4922]: I1122 04:10:51.954456 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34361dc1-d28e-4f60-a6ac-a4f60d9b176b-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:10:51 crc kubenswrapper[4922]: I1122 04:10:51.954506 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svrsl\" (UniqueName: \"kubernetes.io/projected/34361dc1-d28e-4f60-a6ac-a4f60d9b176b-kube-api-access-svrsl\") on node \"crc\" DevicePath \"\"" Nov 22 04:10:52 crc kubenswrapper[4922]: I1122 04:10:52.002005 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34361dc1-d28e-4f60-a6ac-a4f60d9b176b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "34361dc1-d28e-4f60-a6ac-a4f60d9b176b" (UID: "34361dc1-d28e-4f60-a6ac-a4f60d9b176b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:10:52 crc kubenswrapper[4922]: I1122 04:10:52.056262 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34361dc1-d28e-4f60-a6ac-a4f60d9b176b-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:10:52 crc kubenswrapper[4922]: I1122 04:10:52.187412 4922 generic.go:334] "Generic (PLEG): container finished" podID="34361dc1-d28e-4f60-a6ac-a4f60d9b176b" containerID="6116bb1a38989849013e650b7875f2ed4253531c62e6bc03c290877398606d3b" exitCode=0 Nov 22 04:10:52 crc kubenswrapper[4922]: I1122 04:10:52.187476 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xvnj5" event={"ID":"34361dc1-d28e-4f60-a6ac-a4f60d9b176b","Type":"ContainerDied","Data":"6116bb1a38989849013e650b7875f2ed4253531c62e6bc03c290877398606d3b"} Nov 22 04:10:52 crc kubenswrapper[4922]: I1122 04:10:52.187540 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xvnj5" event={"ID":"34361dc1-d28e-4f60-a6ac-a4f60d9b176b","Type":"ContainerDied","Data":"0432d527dba6e262d2f1a106f9a871b00b4dca4373f4eefdf2cf209d1d2b42a6"} Nov 22 04:10:52 crc kubenswrapper[4922]: I1122 04:10:52.187562 4922 scope.go:117] "RemoveContainer" containerID="6116bb1a38989849013e650b7875f2ed4253531c62e6bc03c290877398606d3b" Nov 22 04:10:52 crc kubenswrapper[4922]: I1122 04:10:52.187577 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xvnj5" Nov 22 04:10:52 crc kubenswrapper[4922]: I1122 04:10:52.237298 4922 scope.go:117] "RemoveContainer" containerID="1ad512e526dcf6c8d4381e5ec30c10538127e9e049f14db1370dae85fef3d67c" Nov 22 04:10:52 crc kubenswrapper[4922]: I1122 04:10:52.245985 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xvnj5"] Nov 22 04:10:52 crc kubenswrapper[4922]: I1122 04:10:52.258061 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xvnj5"] Nov 22 04:10:52 crc kubenswrapper[4922]: I1122 04:10:52.284378 4922 scope.go:117] "RemoveContainer" containerID="8edc9308bc892d07fe240707d0ab3a075e0929e11700542171d3937ca208383c" Nov 22 04:10:52 crc kubenswrapper[4922]: I1122 04:10:52.317677 4922 scope.go:117] "RemoveContainer" containerID="6116bb1a38989849013e650b7875f2ed4253531c62e6bc03c290877398606d3b" Nov 22 04:10:52 crc kubenswrapper[4922]: E1122 04:10:52.318508 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6116bb1a38989849013e650b7875f2ed4253531c62e6bc03c290877398606d3b\": container with ID starting with 6116bb1a38989849013e650b7875f2ed4253531c62e6bc03c290877398606d3b not found: ID does not exist" containerID="6116bb1a38989849013e650b7875f2ed4253531c62e6bc03c290877398606d3b" Nov 22 04:10:52 crc kubenswrapper[4922]: I1122 04:10:52.318580 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6116bb1a38989849013e650b7875f2ed4253531c62e6bc03c290877398606d3b"} err="failed to get container status \"6116bb1a38989849013e650b7875f2ed4253531c62e6bc03c290877398606d3b\": rpc error: code = NotFound desc = could not find container \"6116bb1a38989849013e650b7875f2ed4253531c62e6bc03c290877398606d3b\": container with ID starting with 6116bb1a38989849013e650b7875f2ed4253531c62e6bc03c290877398606d3b not found: ID does not exist" Nov 22 04:10:52 crc kubenswrapper[4922]: I1122 04:10:52.318625 4922 scope.go:117] "RemoveContainer" containerID="1ad512e526dcf6c8d4381e5ec30c10538127e9e049f14db1370dae85fef3d67c" Nov 22 04:10:52 crc kubenswrapper[4922]: E1122 04:10:52.319256 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ad512e526dcf6c8d4381e5ec30c10538127e9e049f14db1370dae85fef3d67c\": container with ID starting with 1ad512e526dcf6c8d4381e5ec30c10538127e9e049f14db1370dae85fef3d67c not found: ID does not exist" containerID="1ad512e526dcf6c8d4381e5ec30c10538127e9e049f14db1370dae85fef3d67c" Nov 22 04:10:52 crc kubenswrapper[4922]: I1122 04:10:52.319311 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ad512e526dcf6c8d4381e5ec30c10538127e9e049f14db1370dae85fef3d67c"} err="failed to get container status \"1ad512e526dcf6c8d4381e5ec30c10538127e9e049f14db1370dae85fef3d67c\": rpc error: code = NotFound desc = could not find container \"1ad512e526dcf6c8d4381e5ec30c10538127e9e049f14db1370dae85fef3d67c\": container with ID starting with 1ad512e526dcf6c8d4381e5ec30c10538127e9e049f14db1370dae85fef3d67c not found: ID does not exist" Nov 22 04:10:52 crc kubenswrapper[4922]: I1122 04:10:52.319342 4922 scope.go:117] "RemoveContainer" containerID="8edc9308bc892d07fe240707d0ab3a075e0929e11700542171d3937ca208383c" Nov 22 04:10:52 crc kubenswrapper[4922]: E1122 04:10:52.319791 4922 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"8edc9308bc892d07fe240707d0ab3a075e0929e11700542171d3937ca208383c\": container with ID starting with 8edc9308bc892d07fe240707d0ab3a075e0929e11700542171d3937ca208383c not found: ID does not exist" containerID="8edc9308bc892d07fe240707d0ab3a075e0929e11700542171d3937ca208383c" Nov 22 04:10:52 crc kubenswrapper[4922]: I1122 04:10:52.319838 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8edc9308bc892d07fe240707d0ab3a075e0929e11700542171d3937ca208383c"} err="failed to get container status \"8edc9308bc892d07fe240707d0ab3a075e0929e11700542171d3937ca208383c\": rpc error: code = NotFound desc = could not find container \"8edc9308bc892d07fe240707d0ab3a075e0929e11700542171d3937ca208383c\": container with ID starting with 8edc9308bc892d07fe240707d0ab3a075e0929e11700542171d3937ca208383c not found: ID does not exist" Nov 22 04:10:53 crc kubenswrapper[4922]: I1122 04:10:53.312276 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34361dc1-d28e-4f60-a6ac-a4f60d9b176b" path="/var/lib/kubelet/pods/34361dc1-d28e-4f60-a6ac-a4f60d9b176b/volumes" Nov 22 04:11:11 crc kubenswrapper[4922]: I1122 04:11:11.109739 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:11:11 crc kubenswrapper[4922]: I1122 04:11:11.110434 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:11:41 crc kubenswrapper[4922]: I1122 04:11:41.110001 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:11:41 crc kubenswrapper[4922]: I1122 04:11:41.110786 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:12:11 crc kubenswrapper[4922]: I1122 04:12:11.109811 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:12:11 crc kubenswrapper[4922]: I1122 04:12:11.110623 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:12:11 crc kubenswrapper[4922]: I1122 04:12:11.110704 4922 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" Nov 22 04:12:11 crc kubenswrapper[4922]: I1122 04:12:11.111987 4922 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ecf9aaa0256d3bc955d91a21b7f89502668827fabefcf5955efb01bf6dd1a0ec"} pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 04:12:11 crc kubenswrapper[4922]: I1122 04:12:11.112099 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" containerID="cri-o://ecf9aaa0256d3bc955d91a21b7f89502668827fabefcf5955efb01bf6dd1a0ec" gracePeriod=600 Nov 22 04:12:12 crc kubenswrapper[4922]: I1122 04:12:12.032934 4922 generic.go:334] "Generic (PLEG): container finished" podID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerID="ecf9aaa0256d3bc955d91a21b7f89502668827fabefcf5955efb01bf6dd1a0ec" exitCode=0 Nov 22 04:12:12 crc kubenswrapper[4922]: I1122 04:12:12.033134 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" event={"ID":"402683b1-a29f-4a79-a36c-daf6e8068d0d","Type":"ContainerDied","Data":"ecf9aaa0256d3bc955d91a21b7f89502668827fabefcf5955efb01bf6dd1a0ec"} Nov 22 04:12:12 crc kubenswrapper[4922]: I1122 04:12:12.033548 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" event={"ID":"402683b1-a29f-4a79-a36c-daf6e8068d0d","Type":"ContainerStarted","Data":"e1ed9e2d5f82f333d68ec19db77109001150ddfdec3bd44077036c7c173bdde5"} Nov 22 04:12:12 crc kubenswrapper[4922]: I1122 04:12:12.033573 4922 scope.go:117] "RemoveContainer" containerID="4163351f257b242e300f88a0b2d981705082c49d726595120484aad5252d4ad2" Nov 22 04:14:11 crc kubenswrapper[4922]: I1122 04:14:11.110024 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:14:11 crc kubenswrapper[4922]: I1122 04:14:11.110668 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:14:27 crc kubenswrapper[4922]: I1122 04:14:27.510397 4922 generic.go:334] "Generic (PLEG): container finished" podID="65f03af2-87a1-4f4f-b09c-00fe2a3d4943" containerID="61527639731a9b825273f57e13396f3ce5387671b2a8e33ae65427ba02f4840d" exitCode=1 Nov 22 04:14:27 crc kubenswrapper[4922]: I1122 04:14:27.510551 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"65f03af2-87a1-4f4f-b09c-00fe2a3d4943","Type":"ContainerDied","Data":"61527639731a9b825273f57e13396f3ce5387671b2a8e33ae65427ba02f4840d"} Nov 22 04:14:28 crc kubenswrapper[4922]: I1122 04:14:28.919081 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Nov 22 04:14:29 crc kubenswrapper[4922]: I1122 04:14:29.044212 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/65f03af2-87a1-4f4f-b09c-00fe2a3d4943-config-data\") pod \"65f03af2-87a1-4f4f-b09c-00fe2a3d4943\" (UID: \"65f03af2-87a1-4f4f-b09c-00fe2a3d4943\") " Nov 22 04:14:29 crc kubenswrapper[4922]: I1122 04:14:29.045260 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/65f03af2-87a1-4f4f-b09c-00fe2a3d4943-openstack-config-secret\") pod \"65f03af2-87a1-4f4f-b09c-00fe2a3d4943\" (UID: \"65f03af2-87a1-4f4f-b09c-00fe2a3d4943\") " Nov 22 04:14:29 crc kubenswrapper[4922]: I1122 04:14:29.046322 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"65f03af2-87a1-4f4f-b09c-00fe2a3d4943\" (UID: \"65f03af2-87a1-4f4f-b09c-00fe2a3d4943\") " Nov 22 04:14:29 crc kubenswrapper[4922]: I1122 04:14:29.046457 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/65f03af2-87a1-4f4f-b09c-00fe2a3d4943-ssh-key\") pod \"65f03af2-87a1-4f4f-b09c-00fe2a3d4943\" (UID: \"65f03af2-87a1-4f4f-b09c-00fe2a3d4943\") " Nov 22 04:14:29 crc kubenswrapper[4922]: I1122 04:14:29.046631 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/65f03af2-87a1-4f4f-b09c-00fe2a3d4943-openstack-config\") pod \"65f03af2-87a1-4f4f-b09c-00fe2a3d4943\" (UID: \"65f03af2-87a1-4f4f-b09c-00fe2a3d4943\") " Nov 22 04:14:29 crc kubenswrapper[4922]: I1122 04:14:29.046743 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/65f03af2-87a1-4f4f-b09c-00fe2a3d4943-test-operator-ephemeral-workdir\") pod \"65f03af2-87a1-4f4f-b09c-00fe2a3d4943\" (UID: \"65f03af2-87a1-4f4f-b09c-00fe2a3d4943\") " Nov 22 04:14:29 crc kubenswrapper[4922]: I1122 04:14:29.045830 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65f03af2-87a1-4f4f-b09c-00fe2a3d4943-config-data" (OuterVolumeSpecName: "config-data") pod "65f03af2-87a1-4f4f-b09c-00fe2a3d4943" (UID: "65f03af2-87a1-4f4f-b09c-00fe2a3d4943"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:14:29 crc kubenswrapper[4922]: I1122 04:14:29.047200 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/65f03af2-87a1-4f4f-b09c-00fe2a3d4943-ca-certs\") pod \"65f03af2-87a1-4f4f-b09c-00fe2a3d4943\" (UID: \"65f03af2-87a1-4f4f-b09c-00fe2a3d4943\") " Nov 22 04:14:29 crc kubenswrapper[4922]: I1122 04:14:29.047363 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/65f03af2-87a1-4f4f-b09c-00fe2a3d4943-test-operator-ephemeral-temporary\") pod \"65f03af2-87a1-4f4f-b09c-00fe2a3d4943\" (UID: \"65f03af2-87a1-4f4f-b09c-00fe2a3d4943\") " Nov 22 04:14:29 crc kubenswrapper[4922]: I1122 04:14:29.047475 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9c7lc\" (UniqueName: \"kubernetes.io/projected/65f03af2-87a1-4f4f-b09c-00fe2a3d4943-kube-api-access-9c7lc\") pod \"65f03af2-87a1-4f4f-b09c-00fe2a3d4943\" (UID: \"65f03af2-87a1-4f4f-b09c-00fe2a3d4943\") " Nov 22 04:14:29 crc kubenswrapper[4922]: I1122 04:14:29.048774 4922 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/65f03af2-87a1-4f4f-b09c-00fe2a3d4943-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:14:29 crc kubenswrapper[4922]: I1122 04:14:29.048046 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65f03af2-87a1-4f4f-b09c-00fe2a3d4943-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "65f03af2-87a1-4f4f-b09c-00fe2a3d4943" (UID: "65f03af2-87a1-4f4f-b09c-00fe2a3d4943"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:14:29 crc kubenswrapper[4922]: I1122 04:14:29.055943 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65f03af2-87a1-4f4f-b09c-00fe2a3d4943-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "65f03af2-87a1-4f4f-b09c-00fe2a3d4943" (UID: "65f03af2-87a1-4f4f-b09c-00fe2a3d4943"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:14:29 crc kubenswrapper[4922]: I1122 04:14:29.064953 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65f03af2-87a1-4f4f-b09c-00fe2a3d4943-kube-api-access-9c7lc" (OuterVolumeSpecName: "kube-api-access-9c7lc") pod "65f03af2-87a1-4f4f-b09c-00fe2a3d4943" (UID: "65f03af2-87a1-4f4f-b09c-00fe2a3d4943"). InnerVolumeSpecName "kube-api-access-9c7lc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:14:29 crc kubenswrapper[4922]: I1122 04:14:29.065031 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "test-operator-logs") pod "65f03af2-87a1-4f4f-b09c-00fe2a3d4943" (UID: "65f03af2-87a1-4f4f-b09c-00fe2a3d4943"). InnerVolumeSpecName "local-storage03-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 22 04:14:29 crc kubenswrapper[4922]: I1122 04:14:29.079654 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65f03af2-87a1-4f4f-b09c-00fe2a3d4943-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "65f03af2-87a1-4f4f-b09c-00fe2a3d4943" (UID: "65f03af2-87a1-4f4f-b09c-00fe2a3d4943"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:14:29 crc kubenswrapper[4922]: I1122 04:14:29.085514 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65f03af2-87a1-4f4f-b09c-00fe2a3d4943-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "65f03af2-87a1-4f4f-b09c-00fe2a3d4943" (UID: "65f03af2-87a1-4f4f-b09c-00fe2a3d4943"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:14:29 crc kubenswrapper[4922]: I1122 04:14:29.087986 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65f03af2-87a1-4f4f-b09c-00fe2a3d4943-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "65f03af2-87a1-4f4f-b09c-00fe2a3d4943" (UID: "65f03af2-87a1-4f4f-b09c-00fe2a3d4943"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:14:29 crc kubenswrapper[4922]: I1122 04:14:29.104453 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65f03af2-87a1-4f4f-b09c-00fe2a3d4943-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "65f03af2-87a1-4f4f-b09c-00fe2a3d4943" (UID: "65f03af2-87a1-4f4f-b09c-00fe2a3d4943"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:14:29 crc kubenswrapper[4922]: I1122 04:14:29.150499 4922 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/65f03af2-87a1-4f4f-b09c-00fe2a3d4943-ca-certs\") on node \"crc\" DevicePath \"\"" Nov 22 04:14:29 crc kubenswrapper[4922]: I1122 04:14:29.150542 4922 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/65f03af2-87a1-4f4f-b09c-00fe2a3d4943-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Nov 22 04:14:29 crc kubenswrapper[4922]: I1122 04:14:29.150557 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9c7lc\" (UniqueName: \"kubernetes.io/projected/65f03af2-87a1-4f4f-b09c-00fe2a3d4943-kube-api-access-9c7lc\") on node \"crc\" DevicePath \"\"" Nov 22 04:14:29 crc kubenswrapper[4922]: I1122 04:14:29.150573 4922 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/65f03af2-87a1-4f4f-b09c-00fe2a3d4943-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Nov 22 04:14:29 crc kubenswrapper[4922]: I1122 04:14:29.150616 4922 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Nov 22 04:14:29 crc kubenswrapper[4922]: I1122 04:14:29.150627 4922 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/65f03af2-87a1-4f4f-b09c-00fe2a3d4943-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 04:14:29 crc kubenswrapper[4922]: I1122 04:14:29.150640 4922 reconciler_common.go:293] "Volume detached for volume 
\"openstack-config\" (UniqueName: \"kubernetes.io/configmap/65f03af2-87a1-4f4f-b09c-00fe2a3d4943-openstack-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:14:29 crc kubenswrapper[4922]: I1122 04:14:29.150651 4922 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/65f03af2-87a1-4f4f-b09c-00fe2a3d4943-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Nov 22 04:14:29 crc kubenswrapper[4922]: I1122 04:14:29.175166 4922 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Nov 22 04:14:29 crc kubenswrapper[4922]: I1122 04:14:29.252561 4922 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Nov 22 04:14:29 crc kubenswrapper[4922]: I1122 04:14:29.538876 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"65f03af2-87a1-4f4f-b09c-00fe2a3d4943","Type":"ContainerDied","Data":"0735a707b83760c5e430b1e6905d424c0256ab866269415101899b0204eeca1b"} Nov 22 04:14:29 crc kubenswrapper[4922]: I1122 04:14:29.538926 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0735a707b83760c5e430b1e6905d424c0256ab866269415101899b0204eeca1b" Nov 22 04:14:29 crc kubenswrapper[4922]: I1122 04:14:29.538993 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Nov 22 04:14:30 crc kubenswrapper[4922]: I1122 04:14:30.699624 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4sjb9"] Nov 22 04:14:30 crc kubenswrapper[4922]: E1122 04:14:30.700031 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65f03af2-87a1-4f4f-b09c-00fe2a3d4943" containerName="tempest-tests-tempest-tests-runner" Nov 22 04:14:30 crc kubenswrapper[4922]: I1122 04:14:30.700044 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="65f03af2-87a1-4f4f-b09c-00fe2a3d4943" containerName="tempest-tests-tempest-tests-runner" Nov 22 04:14:30 crc kubenswrapper[4922]: E1122 04:14:30.700061 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34361dc1-d28e-4f60-a6ac-a4f60d9b176b" containerName="extract-utilities" Nov 22 04:14:30 crc kubenswrapper[4922]: I1122 04:14:30.700068 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="34361dc1-d28e-4f60-a6ac-a4f60d9b176b" containerName="extract-utilities" Nov 22 04:14:30 crc kubenswrapper[4922]: E1122 04:14:30.700088 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34361dc1-d28e-4f60-a6ac-a4f60d9b176b" containerName="extract-content" Nov 22 04:14:30 crc kubenswrapper[4922]: I1122 04:14:30.700095 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="34361dc1-d28e-4f60-a6ac-a4f60d9b176b" containerName="extract-content" Nov 22 04:14:30 crc kubenswrapper[4922]: E1122 04:14:30.700113 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34361dc1-d28e-4f60-a6ac-a4f60d9b176b" containerName="registry-server" Nov 22 04:14:30 crc kubenswrapper[4922]: I1122 04:14:30.700119 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="34361dc1-d28e-4f60-a6ac-a4f60d9b176b" containerName="registry-server" Nov 22 04:14:30 crc kubenswrapper[4922]: I1122 04:14:30.700284 4922 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="34361dc1-d28e-4f60-a6ac-a4f60d9b176b" containerName="registry-server" Nov 22 04:14:30 crc kubenswrapper[4922]: I1122 04:14:30.700298 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="65f03af2-87a1-4f4f-b09c-00fe2a3d4943" containerName="tempest-tests-tempest-tests-runner" Nov 22 04:14:30 crc kubenswrapper[4922]: I1122 04:14:30.701793 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4sjb9" Nov 22 04:14:30 crc kubenswrapper[4922]: I1122 04:14:30.710319 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4sjb9"] Nov 22 04:14:30 crc kubenswrapper[4922]: I1122 04:14:30.780940 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e2370d6-32ca-4d12-a8f4-657e5ca7040f-catalog-content\") pod \"community-operators-4sjb9\" (UID: \"4e2370d6-32ca-4d12-a8f4-657e5ca7040f\") " pod="openshift-marketplace/community-operators-4sjb9" Nov 22 04:14:30 crc kubenswrapper[4922]: I1122 04:14:30.780981 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-db9ws\" (UniqueName: \"kubernetes.io/projected/4e2370d6-32ca-4d12-a8f4-657e5ca7040f-kube-api-access-db9ws\") pod \"community-operators-4sjb9\" (UID: \"4e2370d6-32ca-4d12-a8f4-657e5ca7040f\") " pod="openshift-marketplace/community-operators-4sjb9" Nov 22 04:14:30 crc kubenswrapper[4922]: I1122 04:14:30.781059 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e2370d6-32ca-4d12-a8f4-657e5ca7040f-utilities\") pod \"community-operators-4sjb9\" (UID: \"4e2370d6-32ca-4d12-a8f4-657e5ca7040f\") " pod="openshift-marketplace/community-operators-4sjb9" Nov 22 04:14:30 crc kubenswrapper[4922]: I1122 04:14:30.883511 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e2370d6-32ca-4d12-a8f4-657e5ca7040f-catalog-content\") pod \"community-operators-4sjb9\" (UID: \"4e2370d6-32ca-4d12-a8f4-657e5ca7040f\") " pod="openshift-marketplace/community-operators-4sjb9" Nov 22 04:14:30 crc kubenswrapper[4922]: I1122 04:14:30.883561 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-db9ws\" (UniqueName: \"kubernetes.io/projected/4e2370d6-32ca-4d12-a8f4-657e5ca7040f-kube-api-access-db9ws\") pod \"community-operators-4sjb9\" (UID: \"4e2370d6-32ca-4d12-a8f4-657e5ca7040f\") " pod="openshift-marketplace/community-operators-4sjb9" Nov 22 04:14:30 crc kubenswrapper[4922]: I1122 04:14:30.883632 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e2370d6-32ca-4d12-a8f4-657e5ca7040f-utilities\") pod \"community-operators-4sjb9\" (UID: \"4e2370d6-32ca-4d12-a8f4-657e5ca7040f\") " pod="openshift-marketplace/community-operators-4sjb9" Nov 22 04:14:30 crc kubenswrapper[4922]: I1122 04:14:30.884171 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e2370d6-32ca-4d12-a8f4-657e5ca7040f-catalog-content\") pod \"community-operators-4sjb9\" (UID: \"4e2370d6-32ca-4d12-a8f4-657e5ca7040f\") " pod="openshift-marketplace/community-operators-4sjb9" Nov 22 04:14:30 crc kubenswrapper[4922]: I1122 
04:14:30.884258 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e2370d6-32ca-4d12-a8f4-657e5ca7040f-utilities\") pod \"community-operators-4sjb9\" (UID: \"4e2370d6-32ca-4d12-a8f4-657e5ca7040f\") " pod="openshift-marketplace/community-operators-4sjb9" Nov 22 04:14:30 crc kubenswrapper[4922]: I1122 04:14:30.902480 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-db9ws\" (UniqueName: \"kubernetes.io/projected/4e2370d6-32ca-4d12-a8f4-657e5ca7040f-kube-api-access-db9ws\") pod \"community-operators-4sjb9\" (UID: \"4e2370d6-32ca-4d12-a8f4-657e5ca7040f\") " pod="openshift-marketplace/community-operators-4sjb9" Nov 22 04:14:31 crc kubenswrapper[4922]: I1122 04:14:31.087966 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4sjb9" Nov 22 04:14:31 crc kubenswrapper[4922]: I1122 04:14:31.620931 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4sjb9"] Nov 22 04:14:32 crc kubenswrapper[4922]: I1122 04:14:32.568430 4922 generic.go:334] "Generic (PLEG): container finished" podID="4e2370d6-32ca-4d12-a8f4-657e5ca7040f" containerID="ae14983978eccc9e6193a57b100bf31d364725c0e1e05341675849ffe5aff486" exitCode=0 Nov 22 04:14:32 crc kubenswrapper[4922]: I1122 04:14:32.568496 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4sjb9" event={"ID":"4e2370d6-32ca-4d12-a8f4-657e5ca7040f","Type":"ContainerDied","Data":"ae14983978eccc9e6193a57b100bf31d364725c0e1e05341675849ffe5aff486"} Nov 22 04:14:32 crc kubenswrapper[4922]: I1122 04:14:32.568762 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4sjb9" event={"ID":"4e2370d6-32ca-4d12-a8f4-657e5ca7040f","Type":"ContainerStarted","Data":"46fccc121262798f3abf9e1bb232b31e27aeef296dc545a7760adf4951e3a24d"} Nov 22 04:14:34 crc kubenswrapper[4922]: I1122 04:14:34.590834 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4sjb9" event={"ID":"4e2370d6-32ca-4d12-a8f4-657e5ca7040f","Type":"ContainerStarted","Data":"8877465fd39fd17e81c4b19e6ac75334b1e49496ac9a3894f29175bf7f53553e"} Nov 22 04:14:37 crc kubenswrapper[4922]: I1122 04:14:37.626051 4922 generic.go:334] "Generic (PLEG): container finished" podID="4e2370d6-32ca-4d12-a8f4-657e5ca7040f" containerID="8877465fd39fd17e81c4b19e6ac75334b1e49496ac9a3894f29175bf7f53553e" exitCode=0 Nov 22 04:14:37 crc kubenswrapper[4922]: I1122 04:14:37.626113 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4sjb9" event={"ID":"4e2370d6-32ca-4d12-a8f4-657e5ca7040f","Type":"ContainerDied","Data":"8877465fd39fd17e81c4b19e6ac75334b1e49496ac9a3894f29175bf7f53553e"} Nov 22 04:14:38 crc kubenswrapper[4922]: I1122 04:14:38.640002 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4sjb9" event={"ID":"4e2370d6-32ca-4d12-a8f4-657e5ca7040f","Type":"ContainerStarted","Data":"5bb90dd2328cdcc3489f76ad409132d09a640aa13d8303c41f707abe8d5abfa3"} Nov 22 04:14:38 crc kubenswrapper[4922]: I1122 04:14:38.667219 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4sjb9" podStartSLOduration=3.010662786 podStartE2EDuration="8.667195857s" podCreationTimestamp="2025-11-22 04:14:30 +0000 UTC" 
firstStartedPulling="2025-11-22 04:14:32.570282473 +0000 UTC m=+4908.608804365" lastFinishedPulling="2025-11-22 04:14:38.226815554 +0000 UTC m=+4914.265337436" observedRunningTime="2025-11-22 04:14:38.664569284 +0000 UTC m=+4914.703091176" watchObservedRunningTime="2025-11-22 04:14:38.667195857 +0000 UTC m=+4914.705717769" Nov 22 04:14:40 crc kubenswrapper[4922]: I1122 04:14:40.842674 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Nov 22 04:14:40 crc kubenswrapper[4922]: I1122 04:14:40.848088 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 22 04:14:40 crc kubenswrapper[4922]: I1122 04:14:40.851426 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-wdc5v" Nov 22 04:14:40 crc kubenswrapper[4922]: I1122 04:14:40.879463 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Nov 22 04:14:40 crc kubenswrapper[4922]: I1122 04:14:40.894652 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"5459cbc2-5aa8-462f-aa76-8c6c55369173\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 22 04:14:40 crc kubenswrapper[4922]: I1122 04:14:40.894704 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crnh6\" (UniqueName: \"kubernetes.io/projected/5459cbc2-5aa8-462f-aa76-8c6c55369173-kube-api-access-crnh6\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"5459cbc2-5aa8-462f-aa76-8c6c55369173\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 22 04:14:40 crc kubenswrapper[4922]: I1122 04:14:40.997158 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"5459cbc2-5aa8-462f-aa76-8c6c55369173\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 22 04:14:40 crc kubenswrapper[4922]: I1122 04:14:40.997227 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crnh6\" (UniqueName: \"kubernetes.io/projected/5459cbc2-5aa8-462f-aa76-8c6c55369173-kube-api-access-crnh6\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"5459cbc2-5aa8-462f-aa76-8c6c55369173\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 22 04:14:40 crc kubenswrapper[4922]: I1122 04:14:40.997789 4922 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"5459cbc2-5aa8-462f-aa76-8c6c55369173\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 22 04:14:41 crc kubenswrapper[4922]: I1122 04:14:41.029655 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crnh6\" (UniqueName: \"kubernetes.io/projected/5459cbc2-5aa8-462f-aa76-8c6c55369173-kube-api-access-crnh6\") 
pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"5459cbc2-5aa8-462f-aa76-8c6c55369173\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 22 04:14:41 crc kubenswrapper[4922]: I1122 04:14:41.052750 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"5459cbc2-5aa8-462f-aa76-8c6c55369173\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 22 04:14:41 crc kubenswrapper[4922]: I1122 04:14:41.088587 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4sjb9" Nov 22 04:14:41 crc kubenswrapper[4922]: I1122 04:14:41.088694 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4sjb9" Nov 22 04:14:41 crc kubenswrapper[4922]: I1122 04:14:41.109919 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:14:41 crc kubenswrapper[4922]: I1122 04:14:41.110014 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:14:41 crc kubenswrapper[4922]: I1122 04:14:41.142938 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4sjb9" Nov 22 04:14:41 crc kubenswrapper[4922]: I1122 04:14:41.173012 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 22 04:14:41 crc kubenswrapper[4922]: I1122 04:14:41.735446 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Nov 22 04:14:42 crc kubenswrapper[4922]: I1122 04:14:42.681009 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"5459cbc2-5aa8-462f-aa76-8c6c55369173","Type":"ContainerStarted","Data":"fb812e80ed28672714acd3dc3385493e95e3b2e4a4725f616df4716928f15ae1"} Nov 22 04:14:44 crc kubenswrapper[4922]: I1122 04:14:44.702263 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"5459cbc2-5aa8-462f-aa76-8c6c55369173","Type":"ContainerStarted","Data":"3d6f44be6ebd2c1711682f5dca2f61da1e06f8d240e151562e7c40df49bbc96d"} Nov 22 04:14:44 crc kubenswrapper[4922]: I1122 04:14:44.742629 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=3.565681802 podStartE2EDuration="4.742601245s" podCreationTimestamp="2025-11-22 04:14:40 +0000 UTC" firstStartedPulling="2025-11-22 04:14:42.433577437 +0000 UTC m=+4918.472099379" lastFinishedPulling="2025-11-22 04:14:43.61049694 +0000 UTC m=+4919.649018822" observedRunningTime="2025-11-22 04:14:44.721136059 +0000 UTC m=+4920.759658021" watchObservedRunningTime="2025-11-22 04:14:44.742601245 +0000 UTC m=+4920.781123187" Nov 22 04:14:51 crc kubenswrapper[4922]: I1122 04:14:51.154138 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4sjb9" Nov 22 04:14:51 crc kubenswrapper[4922]: I1122 04:14:51.225562 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4sjb9"] Nov 22 04:14:51 crc kubenswrapper[4922]: I1122 04:14:51.775468 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4sjb9" podUID="4e2370d6-32ca-4d12-a8f4-657e5ca7040f" containerName="registry-server" containerID="cri-o://5bb90dd2328cdcc3489f76ad409132d09a640aa13d8303c41f707abe8d5abfa3" gracePeriod=2 Nov 22 04:14:52 crc kubenswrapper[4922]: I1122 04:14:52.565922 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4sjb9" Nov 22 04:14:52 crc kubenswrapper[4922]: I1122 04:14:52.682277 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e2370d6-32ca-4d12-a8f4-657e5ca7040f-catalog-content\") pod \"4e2370d6-32ca-4d12-a8f4-657e5ca7040f\" (UID: \"4e2370d6-32ca-4d12-a8f4-657e5ca7040f\") " Nov 22 04:14:52 crc kubenswrapper[4922]: I1122 04:14:52.682505 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-db9ws\" (UniqueName: \"kubernetes.io/projected/4e2370d6-32ca-4d12-a8f4-657e5ca7040f-kube-api-access-db9ws\") pod \"4e2370d6-32ca-4d12-a8f4-657e5ca7040f\" (UID: \"4e2370d6-32ca-4d12-a8f4-657e5ca7040f\") " Nov 22 04:14:52 crc kubenswrapper[4922]: I1122 04:14:52.682560 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e2370d6-32ca-4d12-a8f4-657e5ca7040f-utilities\") pod \"4e2370d6-32ca-4d12-a8f4-657e5ca7040f\" (UID: \"4e2370d6-32ca-4d12-a8f4-657e5ca7040f\") " Nov 22 04:14:52 crc kubenswrapper[4922]: I1122 04:14:52.683364 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e2370d6-32ca-4d12-a8f4-657e5ca7040f-utilities" (OuterVolumeSpecName: "utilities") pod "4e2370d6-32ca-4d12-a8f4-657e5ca7040f" (UID: "4e2370d6-32ca-4d12-a8f4-657e5ca7040f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:14:52 crc kubenswrapper[4922]: I1122 04:14:52.691156 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e2370d6-32ca-4d12-a8f4-657e5ca7040f-kube-api-access-db9ws" (OuterVolumeSpecName: "kube-api-access-db9ws") pod "4e2370d6-32ca-4d12-a8f4-657e5ca7040f" (UID: "4e2370d6-32ca-4d12-a8f4-657e5ca7040f"). InnerVolumeSpecName "kube-api-access-db9ws". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:14:52 crc kubenswrapper[4922]: I1122 04:14:52.733233 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e2370d6-32ca-4d12-a8f4-657e5ca7040f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4e2370d6-32ca-4d12-a8f4-657e5ca7040f" (UID: "4e2370d6-32ca-4d12-a8f4-657e5ca7040f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:14:52 crc kubenswrapper[4922]: I1122 04:14:52.785318 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-db9ws\" (UniqueName: \"kubernetes.io/projected/4e2370d6-32ca-4d12-a8f4-657e5ca7040f-kube-api-access-db9ws\") on node \"crc\" DevicePath \"\"" Nov 22 04:14:52 crc kubenswrapper[4922]: I1122 04:14:52.785357 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e2370d6-32ca-4d12-a8f4-657e5ca7040f-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:14:52 crc kubenswrapper[4922]: I1122 04:14:52.785371 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e2370d6-32ca-4d12-a8f4-657e5ca7040f-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:14:52 crc kubenswrapper[4922]: I1122 04:14:52.786728 4922 generic.go:334] "Generic (PLEG): container finished" podID="4e2370d6-32ca-4d12-a8f4-657e5ca7040f" containerID="5bb90dd2328cdcc3489f76ad409132d09a640aa13d8303c41f707abe8d5abfa3" exitCode=0 Nov 22 04:14:52 crc kubenswrapper[4922]: I1122 04:14:52.786768 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4sjb9" event={"ID":"4e2370d6-32ca-4d12-a8f4-657e5ca7040f","Type":"ContainerDied","Data":"5bb90dd2328cdcc3489f76ad409132d09a640aa13d8303c41f707abe8d5abfa3"} Nov 22 04:14:52 crc kubenswrapper[4922]: I1122 04:14:52.786798 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4sjb9" event={"ID":"4e2370d6-32ca-4d12-a8f4-657e5ca7040f","Type":"ContainerDied","Data":"46fccc121262798f3abf9e1bb232b31e27aeef296dc545a7760adf4951e3a24d"} Nov 22 04:14:52 crc kubenswrapper[4922]: I1122 04:14:52.786820 4922 scope.go:117] "RemoveContainer" containerID="5bb90dd2328cdcc3489f76ad409132d09a640aa13d8303c41f707abe8d5abfa3" Nov 22 04:14:52 crc kubenswrapper[4922]: I1122 04:14:52.786984 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4sjb9" Nov 22 04:14:52 crc kubenswrapper[4922]: I1122 04:14:52.818151 4922 scope.go:117] "RemoveContainer" containerID="8877465fd39fd17e81c4b19e6ac75334b1e49496ac9a3894f29175bf7f53553e" Nov 22 04:14:52 crc kubenswrapper[4922]: I1122 04:14:52.835612 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4sjb9"] Nov 22 04:14:52 crc kubenswrapper[4922]: I1122 04:14:52.844785 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4sjb9"] Nov 22 04:14:52 crc kubenswrapper[4922]: I1122 04:14:52.854586 4922 scope.go:117] "RemoveContainer" containerID="ae14983978eccc9e6193a57b100bf31d364725c0e1e05341675849ffe5aff486" Nov 22 04:14:52 crc kubenswrapper[4922]: I1122 04:14:52.895624 4922 scope.go:117] "RemoveContainer" containerID="5bb90dd2328cdcc3489f76ad409132d09a640aa13d8303c41f707abe8d5abfa3" Nov 22 04:14:52 crc kubenswrapper[4922]: E1122 04:14:52.896771 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bb90dd2328cdcc3489f76ad409132d09a640aa13d8303c41f707abe8d5abfa3\": container with ID starting with 5bb90dd2328cdcc3489f76ad409132d09a640aa13d8303c41f707abe8d5abfa3 not found: ID does not exist" containerID="5bb90dd2328cdcc3489f76ad409132d09a640aa13d8303c41f707abe8d5abfa3" Nov 22 04:14:52 crc kubenswrapper[4922]: I1122 04:14:52.896921 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bb90dd2328cdcc3489f76ad409132d09a640aa13d8303c41f707abe8d5abfa3"} err="failed to get container status \"5bb90dd2328cdcc3489f76ad409132d09a640aa13d8303c41f707abe8d5abfa3\": rpc error: code = NotFound desc = could not find container \"5bb90dd2328cdcc3489f76ad409132d09a640aa13d8303c41f707abe8d5abfa3\": container with ID starting with 5bb90dd2328cdcc3489f76ad409132d09a640aa13d8303c41f707abe8d5abfa3 not found: ID does not exist" Nov 22 04:14:52 crc kubenswrapper[4922]: I1122 04:14:52.897021 4922 scope.go:117] "RemoveContainer" containerID="8877465fd39fd17e81c4b19e6ac75334b1e49496ac9a3894f29175bf7f53553e" Nov 22 04:14:52 crc kubenswrapper[4922]: E1122 04:14:52.897476 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8877465fd39fd17e81c4b19e6ac75334b1e49496ac9a3894f29175bf7f53553e\": container with ID starting with 8877465fd39fd17e81c4b19e6ac75334b1e49496ac9a3894f29175bf7f53553e not found: ID does not exist" containerID="8877465fd39fd17e81c4b19e6ac75334b1e49496ac9a3894f29175bf7f53553e" Nov 22 04:14:52 crc kubenswrapper[4922]: I1122 04:14:52.897496 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8877465fd39fd17e81c4b19e6ac75334b1e49496ac9a3894f29175bf7f53553e"} err="failed to get container status \"8877465fd39fd17e81c4b19e6ac75334b1e49496ac9a3894f29175bf7f53553e\": rpc error: code = NotFound desc = could not find container \"8877465fd39fd17e81c4b19e6ac75334b1e49496ac9a3894f29175bf7f53553e\": container with ID starting with 8877465fd39fd17e81c4b19e6ac75334b1e49496ac9a3894f29175bf7f53553e not found: ID does not exist" Nov 22 04:14:52 crc kubenswrapper[4922]: I1122 04:14:52.897511 4922 scope.go:117] "RemoveContainer" containerID="ae14983978eccc9e6193a57b100bf31d364725c0e1e05341675849ffe5aff486" Nov 22 04:14:52 crc kubenswrapper[4922]: E1122 04:14:52.897947 4922 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"ae14983978eccc9e6193a57b100bf31d364725c0e1e05341675849ffe5aff486\": container with ID starting with ae14983978eccc9e6193a57b100bf31d364725c0e1e05341675849ffe5aff486 not found: ID does not exist" containerID="ae14983978eccc9e6193a57b100bf31d364725c0e1e05341675849ffe5aff486" Nov 22 04:14:52 crc kubenswrapper[4922]: I1122 04:14:52.898006 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae14983978eccc9e6193a57b100bf31d364725c0e1e05341675849ffe5aff486"} err="failed to get container status \"ae14983978eccc9e6193a57b100bf31d364725c0e1e05341675849ffe5aff486\": rpc error: code = NotFound desc = could not find container \"ae14983978eccc9e6193a57b100bf31d364725c0e1e05341675849ffe5aff486\": container with ID starting with ae14983978eccc9e6193a57b100bf31d364725c0e1e05341675849ffe5aff486 not found: ID does not exist" Nov 22 04:14:53 crc kubenswrapper[4922]: I1122 04:14:53.312311 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e2370d6-32ca-4d12-a8f4-657e5ca7040f" path="/var/lib/kubelet/pods/4e2370d6-32ca-4d12-a8f4-657e5ca7040f/volumes" Nov 22 04:15:00 crc kubenswrapper[4922]: I1122 04:15:00.175809 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396415-trczf"] Nov 22 04:15:00 crc kubenswrapper[4922]: E1122 04:15:00.179421 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e2370d6-32ca-4d12-a8f4-657e5ca7040f" containerName="registry-server" Nov 22 04:15:00 crc kubenswrapper[4922]: I1122 04:15:00.179595 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e2370d6-32ca-4d12-a8f4-657e5ca7040f" containerName="registry-server" Nov 22 04:15:00 crc kubenswrapper[4922]: E1122 04:15:00.179784 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e2370d6-32ca-4d12-a8f4-657e5ca7040f" containerName="extract-utilities" Nov 22 04:15:00 crc kubenswrapper[4922]: I1122 04:15:00.179977 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e2370d6-32ca-4d12-a8f4-657e5ca7040f" containerName="extract-utilities" Nov 22 04:15:00 crc kubenswrapper[4922]: E1122 04:15:00.180106 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e2370d6-32ca-4d12-a8f4-657e5ca7040f" containerName="extract-content" Nov 22 04:15:00 crc kubenswrapper[4922]: I1122 04:15:00.180240 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e2370d6-32ca-4d12-a8f4-657e5ca7040f" containerName="extract-content" Nov 22 04:15:00 crc kubenswrapper[4922]: I1122 04:15:00.180690 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e2370d6-32ca-4d12-a8f4-657e5ca7040f" containerName="registry-server" Nov 22 04:15:00 crc kubenswrapper[4922]: I1122 04:15:00.182215 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-trczf" Nov 22 04:15:00 crc kubenswrapper[4922]: I1122 04:15:00.186418 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396415-trczf"] Nov 22 04:15:00 crc kubenswrapper[4922]: I1122 04:15:00.191004 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 22 04:15:00 crc kubenswrapper[4922]: I1122 04:15:00.191065 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 22 04:15:00 crc kubenswrapper[4922]: I1122 04:15:00.371386 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/987c5299-105f-4f32-97c0-dac7a83d530b-secret-volume\") pod \"collect-profiles-29396415-trczf\" (UID: \"987c5299-105f-4f32-97c0-dac7a83d530b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-trczf" Nov 22 04:15:00 crc kubenswrapper[4922]: I1122 04:15:00.372047 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7h7n\" (UniqueName: \"kubernetes.io/projected/987c5299-105f-4f32-97c0-dac7a83d530b-kube-api-access-d7h7n\") pod \"collect-profiles-29396415-trczf\" (UID: \"987c5299-105f-4f32-97c0-dac7a83d530b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-trczf" Nov 22 04:15:00 crc kubenswrapper[4922]: I1122 04:15:00.372139 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/987c5299-105f-4f32-97c0-dac7a83d530b-config-volume\") pod \"collect-profiles-29396415-trczf\" (UID: \"987c5299-105f-4f32-97c0-dac7a83d530b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-trczf" Nov 22 04:15:00 crc kubenswrapper[4922]: I1122 04:15:00.474091 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/987c5299-105f-4f32-97c0-dac7a83d530b-secret-volume\") pod \"collect-profiles-29396415-trczf\" (UID: \"987c5299-105f-4f32-97c0-dac7a83d530b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-trczf" Nov 22 04:15:00 crc kubenswrapper[4922]: I1122 04:15:00.478784 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7h7n\" (UniqueName: \"kubernetes.io/projected/987c5299-105f-4f32-97c0-dac7a83d530b-kube-api-access-d7h7n\") pod \"collect-profiles-29396415-trczf\" (UID: \"987c5299-105f-4f32-97c0-dac7a83d530b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-trczf" Nov 22 04:15:00 crc kubenswrapper[4922]: I1122 04:15:00.478906 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/987c5299-105f-4f32-97c0-dac7a83d530b-config-volume\") pod \"collect-profiles-29396415-trczf\" (UID: \"987c5299-105f-4f32-97c0-dac7a83d530b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-trczf" Nov 22 04:15:00 crc kubenswrapper[4922]: I1122 04:15:00.479776 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/987c5299-105f-4f32-97c0-dac7a83d530b-config-volume\") pod 
\"collect-profiles-29396415-trczf\" (UID: \"987c5299-105f-4f32-97c0-dac7a83d530b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-trczf" Nov 22 04:15:00 crc kubenswrapper[4922]: I1122 04:15:00.491314 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/987c5299-105f-4f32-97c0-dac7a83d530b-secret-volume\") pod \"collect-profiles-29396415-trczf\" (UID: \"987c5299-105f-4f32-97c0-dac7a83d530b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-trczf" Nov 22 04:15:00 crc kubenswrapper[4922]: I1122 04:15:00.506422 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7h7n\" (UniqueName: \"kubernetes.io/projected/987c5299-105f-4f32-97c0-dac7a83d530b-kube-api-access-d7h7n\") pod \"collect-profiles-29396415-trczf\" (UID: \"987c5299-105f-4f32-97c0-dac7a83d530b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-trczf" Nov 22 04:15:00 crc kubenswrapper[4922]: I1122 04:15:00.806224 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-trczf" Nov 22 04:15:01 crc kubenswrapper[4922]: I1122 04:15:01.273074 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396415-trczf"] Nov 22 04:15:01 crc kubenswrapper[4922]: I1122 04:15:01.923303 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-trczf" event={"ID":"987c5299-105f-4f32-97c0-dac7a83d530b","Type":"ContainerStarted","Data":"3f2eaa8b0b70ca677dc0b6f9af7aad4886b0a240177af16d2699c569de06f262"} Nov 22 04:15:01 crc kubenswrapper[4922]: I1122 04:15:01.923391 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-trczf" event={"ID":"987c5299-105f-4f32-97c0-dac7a83d530b","Type":"ContainerStarted","Data":"3f1fbe29000fd22cdcd9bfbb95300c3dc0a7ea8278f31adb6617490fb96fea1b"} Nov 22 04:15:02 crc kubenswrapper[4922]: I1122 04:15:02.939541 4922 generic.go:334] "Generic (PLEG): container finished" podID="987c5299-105f-4f32-97c0-dac7a83d530b" containerID="3f2eaa8b0b70ca677dc0b6f9af7aad4886b0a240177af16d2699c569de06f262" exitCode=0 Nov 22 04:15:02 crc kubenswrapper[4922]: I1122 04:15:02.939855 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-trczf" event={"ID":"987c5299-105f-4f32-97c0-dac7a83d530b","Type":"ContainerDied","Data":"3f2eaa8b0b70ca677dc0b6f9af7aad4886b0a240177af16d2699c569de06f262"} Nov 22 04:15:04 crc kubenswrapper[4922]: I1122 04:15:04.424809 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-trczf" Nov 22 04:15:04 crc kubenswrapper[4922]: I1122 04:15:04.578139 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7h7n\" (UniqueName: \"kubernetes.io/projected/987c5299-105f-4f32-97c0-dac7a83d530b-kube-api-access-d7h7n\") pod \"987c5299-105f-4f32-97c0-dac7a83d530b\" (UID: \"987c5299-105f-4f32-97c0-dac7a83d530b\") " Nov 22 04:15:04 crc kubenswrapper[4922]: I1122 04:15:04.578337 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/987c5299-105f-4f32-97c0-dac7a83d530b-secret-volume\") pod \"987c5299-105f-4f32-97c0-dac7a83d530b\" (UID: \"987c5299-105f-4f32-97c0-dac7a83d530b\") " Nov 22 04:15:04 crc kubenswrapper[4922]: I1122 04:15:04.578465 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/987c5299-105f-4f32-97c0-dac7a83d530b-config-volume\") pod \"987c5299-105f-4f32-97c0-dac7a83d530b\" (UID: \"987c5299-105f-4f32-97c0-dac7a83d530b\") " Nov 22 04:15:04 crc kubenswrapper[4922]: I1122 04:15:04.579219 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/987c5299-105f-4f32-97c0-dac7a83d530b-config-volume" (OuterVolumeSpecName: "config-volume") pod "987c5299-105f-4f32-97c0-dac7a83d530b" (UID: "987c5299-105f-4f32-97c0-dac7a83d530b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:15:04 crc kubenswrapper[4922]: I1122 04:15:04.584821 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/987c5299-105f-4f32-97c0-dac7a83d530b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "987c5299-105f-4f32-97c0-dac7a83d530b" (UID: "987c5299-105f-4f32-97c0-dac7a83d530b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:15:04 crc kubenswrapper[4922]: I1122 04:15:04.587835 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/987c5299-105f-4f32-97c0-dac7a83d530b-kube-api-access-d7h7n" (OuterVolumeSpecName: "kube-api-access-d7h7n") pod "987c5299-105f-4f32-97c0-dac7a83d530b" (UID: "987c5299-105f-4f32-97c0-dac7a83d530b"). InnerVolumeSpecName "kube-api-access-d7h7n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:15:04 crc kubenswrapper[4922]: I1122 04:15:04.681520 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7h7n\" (UniqueName: \"kubernetes.io/projected/987c5299-105f-4f32-97c0-dac7a83d530b-kube-api-access-d7h7n\") on node \"crc\" DevicePath \"\"" Nov 22 04:15:04 crc kubenswrapper[4922]: I1122 04:15:04.681577 4922 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/987c5299-105f-4f32-97c0-dac7a83d530b-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 22 04:15:04 crc kubenswrapper[4922]: I1122 04:15:04.681596 4922 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/987c5299-105f-4f32-97c0-dac7a83d530b-config-volume\") on node \"crc\" DevicePath \"\"" Nov 22 04:15:04 crc kubenswrapper[4922]: I1122 04:15:04.961394 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-trczf" event={"ID":"987c5299-105f-4f32-97c0-dac7a83d530b","Type":"ContainerDied","Data":"3f1fbe29000fd22cdcd9bfbb95300c3dc0a7ea8278f31adb6617490fb96fea1b"} Nov 22 04:15:04 crc kubenswrapper[4922]: I1122 04:15:04.961439 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f1fbe29000fd22cdcd9bfbb95300c3dc0a7ea8278f31adb6617490fb96fea1b" Nov 22 04:15:04 crc kubenswrapper[4922]: I1122 04:15:04.961442 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-trczf" Nov 22 04:15:05 crc kubenswrapper[4922]: I1122 04:15:05.525687 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396370-xgdhm"] Nov 22 04:15:05 crc kubenswrapper[4922]: I1122 04:15:05.533992 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396370-xgdhm"] Nov 22 04:15:07 crc kubenswrapper[4922]: I1122 04:15:07.323040 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13f70de8-814c-42f0-900d-ed7ade87ecc8" path="/var/lib/kubelet/pods/13f70de8-814c-42f0-900d-ed7ade87ecc8/volumes" Nov 22 04:15:08 crc kubenswrapper[4922]: I1122 04:15:08.518250 4922 scope.go:117] "RemoveContainer" containerID="6319c0ad4dfc39e8cc40f08ae6ff8943d8c1ee7bab1690fa3ff6761fd77051c7" Nov 22 04:15:11 crc kubenswrapper[4922]: I1122 04:15:11.109970 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:15:11 crc kubenswrapper[4922]: I1122 04:15:11.110634 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:15:11 crc kubenswrapper[4922]: I1122 04:15:11.110697 4922 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" Nov 22 04:15:11 crc kubenswrapper[4922]: I1122 04:15:11.111789 4922 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e1ed9e2d5f82f333d68ec19db77109001150ddfdec3bd44077036c7c173bdde5"} pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 04:15:11 crc kubenswrapper[4922]: I1122 04:15:11.111941 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" containerID="cri-o://e1ed9e2d5f82f333d68ec19db77109001150ddfdec3bd44077036c7c173bdde5" gracePeriod=600 Nov 22 04:15:11 crc kubenswrapper[4922]: E1122 04:15:11.749058 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 04:15:12 crc kubenswrapper[4922]: I1122 04:15:12.040802 4922 generic.go:334] "Generic (PLEG): container finished" podID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerID="e1ed9e2d5f82f333d68ec19db77109001150ddfdec3bd44077036c7c173bdde5" exitCode=0 Nov 22 04:15:12 crc kubenswrapper[4922]: I1122 04:15:12.040888 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" event={"ID":"402683b1-a29f-4a79-a36c-daf6e8068d0d","Type":"ContainerDied","Data":"e1ed9e2d5f82f333d68ec19db77109001150ddfdec3bd44077036c7c173bdde5"} Nov 22 04:15:12 crc kubenswrapper[4922]: I1122 04:15:12.040982 4922 scope.go:117] "RemoveContainer" containerID="ecf9aaa0256d3bc955d91a21b7f89502668827fabefcf5955efb01bf6dd1a0ec" Nov 22 04:15:12 crc kubenswrapper[4922]: I1122 04:15:12.042025 4922 scope.go:117] "RemoveContainer" containerID="e1ed9e2d5f82f333d68ec19db77109001150ddfdec3bd44077036c7c173bdde5" Nov 22 04:15:12 crc kubenswrapper[4922]: E1122 04:15:12.042440 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 04:15:16 crc kubenswrapper[4922]: I1122 04:15:16.547324 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-f47g7/must-gather-fpdpf"] Nov 22 04:15:16 crc kubenswrapper[4922]: E1122 04:15:16.548366 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="987c5299-105f-4f32-97c0-dac7a83d530b" containerName="collect-profiles" Nov 22 04:15:16 crc kubenswrapper[4922]: I1122 04:15:16.548384 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="987c5299-105f-4f32-97c0-dac7a83d530b" containerName="collect-profiles" Nov 22 04:15:16 crc kubenswrapper[4922]: I1122 04:15:16.548625 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="987c5299-105f-4f32-97c0-dac7a83d530b" containerName="collect-profiles" Nov 22 04:15:16 crc kubenswrapper[4922]: I1122 04:15:16.549636 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f47g7/must-gather-fpdpf" Nov 22 04:15:16 crc kubenswrapper[4922]: I1122 04:15:16.559416 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-f47g7"/"openshift-service-ca.crt" Nov 22 04:15:16 crc kubenswrapper[4922]: I1122 04:15:16.559483 4922 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-f47g7"/"kube-root-ca.crt" Nov 22 04:15:16 crc kubenswrapper[4922]: I1122 04:15:16.559415 4922 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-f47g7"/"default-dockercfg-jlvfw" Nov 22 04:15:16 crc kubenswrapper[4922]: I1122 04:15:16.560304 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-f47g7/must-gather-fpdpf"] Nov 22 04:15:16 crc kubenswrapper[4922]: I1122 04:15:16.652711 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jhw6\" (UniqueName: \"kubernetes.io/projected/30b7ec5b-cd88-4c4e-944f-415cbf9241ae-kube-api-access-2jhw6\") pod \"must-gather-fpdpf\" (UID: \"30b7ec5b-cd88-4c4e-944f-415cbf9241ae\") " pod="openshift-must-gather-f47g7/must-gather-fpdpf" Nov 22 04:15:16 crc kubenswrapper[4922]: I1122 04:15:16.652912 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/30b7ec5b-cd88-4c4e-944f-415cbf9241ae-must-gather-output\") pod \"must-gather-fpdpf\" (UID: \"30b7ec5b-cd88-4c4e-944f-415cbf9241ae\") " pod="openshift-must-gather-f47g7/must-gather-fpdpf" Nov 22 04:15:16 crc kubenswrapper[4922]: I1122 04:15:16.753834 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/30b7ec5b-cd88-4c4e-944f-415cbf9241ae-must-gather-output\") pod \"must-gather-fpdpf\" (UID: \"30b7ec5b-cd88-4c4e-944f-415cbf9241ae\") " pod="openshift-must-gather-f47g7/must-gather-fpdpf" Nov 22 04:15:16 crc kubenswrapper[4922]: I1122 04:15:16.754018 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jhw6\" (UniqueName: \"kubernetes.io/projected/30b7ec5b-cd88-4c4e-944f-415cbf9241ae-kube-api-access-2jhw6\") pod \"must-gather-fpdpf\" (UID: \"30b7ec5b-cd88-4c4e-944f-415cbf9241ae\") " pod="openshift-must-gather-f47g7/must-gather-fpdpf" Nov 22 04:15:16 crc kubenswrapper[4922]: I1122 04:15:16.754215 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/30b7ec5b-cd88-4c4e-944f-415cbf9241ae-must-gather-output\") pod \"must-gather-fpdpf\" (UID: \"30b7ec5b-cd88-4c4e-944f-415cbf9241ae\") " pod="openshift-must-gather-f47g7/must-gather-fpdpf" Nov 22 04:15:16 crc kubenswrapper[4922]: I1122 04:15:16.776221 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jhw6\" (UniqueName: \"kubernetes.io/projected/30b7ec5b-cd88-4c4e-944f-415cbf9241ae-kube-api-access-2jhw6\") pod \"must-gather-fpdpf\" (UID: \"30b7ec5b-cd88-4c4e-944f-415cbf9241ae\") " pod="openshift-must-gather-f47g7/must-gather-fpdpf" Nov 22 04:15:16 crc kubenswrapper[4922]: I1122 04:15:16.876859 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f47g7/must-gather-fpdpf" Nov 22 04:15:17 crc kubenswrapper[4922]: I1122 04:15:17.529597 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-f47g7/must-gather-fpdpf"] Nov 22 04:15:18 crc kubenswrapper[4922]: I1122 04:15:18.112395 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f47g7/must-gather-fpdpf" event={"ID":"30b7ec5b-cd88-4c4e-944f-415cbf9241ae","Type":"ContainerStarted","Data":"9ad5c6a205d2acd795588405cd121253f8837e0193e7e57a6c4d756359e2db76"} Nov 22 04:15:25 crc kubenswrapper[4922]: I1122 04:15:25.178192 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f47g7/must-gather-fpdpf" event={"ID":"30b7ec5b-cd88-4c4e-944f-415cbf9241ae","Type":"ContainerStarted","Data":"8e97539363d1859f985c06cbe010c7e6a6f7fe506b1e37e1a4aacef226ded886"} Nov 22 04:15:25 crc kubenswrapper[4922]: I1122 04:15:25.179004 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f47g7/must-gather-fpdpf" event={"ID":"30b7ec5b-cd88-4c4e-944f-415cbf9241ae","Type":"ContainerStarted","Data":"cf9b921ee169ae6034d2f1ce11897e8fa947f4add68de67466d39d15cde25534"} Nov 22 04:15:25 crc kubenswrapper[4922]: I1122 04:15:25.204295 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-f47g7/must-gather-fpdpf" podStartSLOduration=2.6953410079999998 podStartE2EDuration="9.204265393s" podCreationTimestamp="2025-11-22 04:15:16 +0000 UTC" firstStartedPulling="2025-11-22 04:15:17.537097774 +0000 UTC m=+4953.575619666" lastFinishedPulling="2025-11-22 04:15:24.046022159 +0000 UTC m=+4960.084544051" observedRunningTime="2025-11-22 04:15:25.197311426 +0000 UTC m=+4961.235833358" watchObservedRunningTime="2025-11-22 04:15:25.204265393 +0000 UTC m=+4961.242787315" Nov 22 04:15:26 crc kubenswrapper[4922]: I1122 04:15:26.301392 4922 scope.go:117] "RemoveContainer" containerID="e1ed9e2d5f82f333d68ec19db77109001150ddfdec3bd44077036c7c173bdde5" Nov 22 04:15:26 crc kubenswrapper[4922]: E1122 04:15:26.302085 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 04:15:29 crc kubenswrapper[4922]: I1122 04:15:29.994331 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-f47g7/crc-debug-tq8rh"] Nov 22 04:15:29 crc kubenswrapper[4922]: I1122 04:15:29.995947 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f47g7/crc-debug-tq8rh" Nov 22 04:15:30 crc kubenswrapper[4922]: I1122 04:15:30.136327 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9c84f1ea-b8e3-42e7-9fc0-50c7e8722357-host\") pod \"crc-debug-tq8rh\" (UID: \"9c84f1ea-b8e3-42e7-9fc0-50c7e8722357\") " pod="openshift-must-gather-f47g7/crc-debug-tq8rh" Nov 22 04:15:30 crc kubenswrapper[4922]: I1122 04:15:30.136753 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbtc5\" (UniqueName: \"kubernetes.io/projected/9c84f1ea-b8e3-42e7-9fc0-50c7e8722357-kube-api-access-wbtc5\") pod \"crc-debug-tq8rh\" (UID: \"9c84f1ea-b8e3-42e7-9fc0-50c7e8722357\") " pod="openshift-must-gather-f47g7/crc-debug-tq8rh" Nov 22 04:15:30 crc kubenswrapper[4922]: I1122 04:15:30.238452 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9c84f1ea-b8e3-42e7-9fc0-50c7e8722357-host\") pod \"crc-debug-tq8rh\" (UID: \"9c84f1ea-b8e3-42e7-9fc0-50c7e8722357\") " pod="openshift-must-gather-f47g7/crc-debug-tq8rh" Nov 22 04:15:30 crc kubenswrapper[4922]: I1122 04:15:30.238572 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbtc5\" (UniqueName: \"kubernetes.io/projected/9c84f1ea-b8e3-42e7-9fc0-50c7e8722357-kube-api-access-wbtc5\") pod \"crc-debug-tq8rh\" (UID: \"9c84f1ea-b8e3-42e7-9fc0-50c7e8722357\") " pod="openshift-must-gather-f47g7/crc-debug-tq8rh" Nov 22 04:15:30 crc kubenswrapper[4922]: I1122 04:15:30.238611 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9c84f1ea-b8e3-42e7-9fc0-50c7e8722357-host\") pod \"crc-debug-tq8rh\" (UID: \"9c84f1ea-b8e3-42e7-9fc0-50c7e8722357\") " pod="openshift-must-gather-f47g7/crc-debug-tq8rh" Nov 22 04:15:30 crc kubenswrapper[4922]: I1122 04:15:30.265958 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbtc5\" (UniqueName: \"kubernetes.io/projected/9c84f1ea-b8e3-42e7-9fc0-50c7e8722357-kube-api-access-wbtc5\") pod \"crc-debug-tq8rh\" (UID: \"9c84f1ea-b8e3-42e7-9fc0-50c7e8722357\") " pod="openshift-must-gather-f47g7/crc-debug-tq8rh" Nov 22 04:15:30 crc kubenswrapper[4922]: I1122 04:15:30.321274 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f47g7/crc-debug-tq8rh" Nov 22 04:15:31 crc kubenswrapper[4922]: I1122 04:15:31.265676 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f47g7/crc-debug-tq8rh" event={"ID":"9c84f1ea-b8e3-42e7-9fc0-50c7e8722357","Type":"ContainerStarted","Data":"a9bf85deb0ac523c16cb283bf4393e01c733a39d58942e5605f7d15874e5df97"} Nov 22 04:15:37 crc kubenswrapper[4922]: I1122 04:15:37.300879 4922 scope.go:117] "RemoveContainer" containerID="e1ed9e2d5f82f333d68ec19db77109001150ddfdec3bd44077036c7c173bdde5" Nov 22 04:15:37 crc kubenswrapper[4922]: E1122 04:15:37.301506 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 04:15:48 crc kubenswrapper[4922]: I1122 04:15:48.300935 4922 scope.go:117] "RemoveContainer" containerID="e1ed9e2d5f82f333d68ec19db77109001150ddfdec3bd44077036c7c173bdde5" Nov 22 04:15:48 crc kubenswrapper[4922]: E1122 04:15:48.302099 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 04:15:51 crc kubenswrapper[4922]: I1122 04:15:51.916286 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-k6d6p"] Nov 22 04:15:51 crc kubenswrapper[4922]: I1122 04:15:51.921528 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k6d6p" Nov 22 04:15:51 crc kubenswrapper[4922]: I1122 04:15:51.926531 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zlbns"] Nov 22 04:15:51 crc kubenswrapper[4922]: I1122 04:15:51.929057 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zlbns" Nov 22 04:15:51 crc kubenswrapper[4922]: I1122 04:15:51.946790 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zlbns"] Nov 22 04:15:51 crc kubenswrapper[4922]: I1122 04:15:51.962493 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k6d6p"] Nov 22 04:15:52 crc kubenswrapper[4922]: I1122 04:15:52.090395 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b76b3b01-0899-47e6-b7e7-0bd4421686b3-utilities\") pod \"certified-operators-k6d6p\" (UID: \"b76b3b01-0899-47e6-b7e7-0bd4421686b3\") " pod="openshift-marketplace/certified-operators-k6d6p" Nov 22 04:15:52 crc kubenswrapper[4922]: I1122 04:15:52.090444 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8323c7d4-0974-48c0-8e32-22f1a5299ec4-utilities\") pod \"redhat-marketplace-zlbns\" (UID: \"8323c7d4-0974-48c0-8e32-22f1a5299ec4\") " pod="openshift-marketplace/redhat-marketplace-zlbns" Nov 22 04:15:52 crc kubenswrapper[4922]: I1122 04:15:52.090482 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcszv\" (UniqueName: \"kubernetes.io/projected/b76b3b01-0899-47e6-b7e7-0bd4421686b3-kube-api-access-vcszv\") pod \"certified-operators-k6d6p\" (UID: \"b76b3b01-0899-47e6-b7e7-0bd4421686b3\") " pod="openshift-marketplace/certified-operators-k6d6p" Nov 22 04:15:52 crc kubenswrapper[4922]: I1122 04:15:52.090522 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8323c7d4-0974-48c0-8e32-22f1a5299ec4-catalog-content\") pod \"redhat-marketplace-zlbns\" (UID: \"8323c7d4-0974-48c0-8e32-22f1a5299ec4\") " pod="openshift-marketplace/redhat-marketplace-zlbns" Nov 22 04:15:52 crc kubenswrapper[4922]: I1122 04:15:52.090588 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b76b3b01-0899-47e6-b7e7-0bd4421686b3-catalog-content\") pod \"certified-operators-k6d6p\" (UID: \"b76b3b01-0899-47e6-b7e7-0bd4421686b3\") " pod="openshift-marketplace/certified-operators-k6d6p" Nov 22 04:15:52 crc kubenswrapper[4922]: I1122 04:15:52.090646 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gt2mm\" (UniqueName: \"kubernetes.io/projected/8323c7d4-0974-48c0-8e32-22f1a5299ec4-kube-api-access-gt2mm\") pod \"redhat-marketplace-zlbns\" (UID: \"8323c7d4-0974-48c0-8e32-22f1a5299ec4\") " pod="openshift-marketplace/redhat-marketplace-zlbns" Nov 22 04:15:52 crc kubenswrapper[4922]: I1122 04:15:52.192043 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b76b3b01-0899-47e6-b7e7-0bd4421686b3-utilities\") pod \"certified-operators-k6d6p\" (UID: \"b76b3b01-0899-47e6-b7e7-0bd4421686b3\") " pod="openshift-marketplace/certified-operators-k6d6p" Nov 22 04:15:52 crc kubenswrapper[4922]: I1122 04:15:52.192085 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8323c7d4-0974-48c0-8e32-22f1a5299ec4-utilities\") pod 
\"redhat-marketplace-zlbns\" (UID: \"8323c7d4-0974-48c0-8e32-22f1a5299ec4\") " pod="openshift-marketplace/redhat-marketplace-zlbns" Nov 22 04:15:52 crc kubenswrapper[4922]: I1122 04:15:52.192111 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcszv\" (UniqueName: \"kubernetes.io/projected/b76b3b01-0899-47e6-b7e7-0bd4421686b3-kube-api-access-vcszv\") pod \"certified-operators-k6d6p\" (UID: \"b76b3b01-0899-47e6-b7e7-0bd4421686b3\") " pod="openshift-marketplace/certified-operators-k6d6p" Nov 22 04:15:52 crc kubenswrapper[4922]: I1122 04:15:52.192142 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8323c7d4-0974-48c0-8e32-22f1a5299ec4-catalog-content\") pod \"redhat-marketplace-zlbns\" (UID: \"8323c7d4-0974-48c0-8e32-22f1a5299ec4\") " pod="openshift-marketplace/redhat-marketplace-zlbns" Nov 22 04:15:52 crc kubenswrapper[4922]: I1122 04:15:52.192193 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b76b3b01-0899-47e6-b7e7-0bd4421686b3-catalog-content\") pod \"certified-operators-k6d6p\" (UID: \"b76b3b01-0899-47e6-b7e7-0bd4421686b3\") " pod="openshift-marketplace/certified-operators-k6d6p" Nov 22 04:15:52 crc kubenswrapper[4922]: I1122 04:15:52.192239 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gt2mm\" (UniqueName: \"kubernetes.io/projected/8323c7d4-0974-48c0-8e32-22f1a5299ec4-kube-api-access-gt2mm\") pod \"redhat-marketplace-zlbns\" (UID: \"8323c7d4-0974-48c0-8e32-22f1a5299ec4\") " pod="openshift-marketplace/redhat-marketplace-zlbns" Nov 22 04:15:52 crc kubenswrapper[4922]: I1122 04:15:52.192650 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b76b3b01-0899-47e6-b7e7-0bd4421686b3-utilities\") pod \"certified-operators-k6d6p\" (UID: \"b76b3b01-0899-47e6-b7e7-0bd4421686b3\") " pod="openshift-marketplace/certified-operators-k6d6p" Nov 22 04:15:52 crc kubenswrapper[4922]: I1122 04:15:52.192686 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8323c7d4-0974-48c0-8e32-22f1a5299ec4-catalog-content\") pod \"redhat-marketplace-zlbns\" (UID: \"8323c7d4-0974-48c0-8e32-22f1a5299ec4\") " pod="openshift-marketplace/redhat-marketplace-zlbns" Nov 22 04:15:52 crc kubenswrapper[4922]: I1122 04:15:52.192698 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b76b3b01-0899-47e6-b7e7-0bd4421686b3-catalog-content\") pod \"certified-operators-k6d6p\" (UID: \"b76b3b01-0899-47e6-b7e7-0bd4421686b3\") " pod="openshift-marketplace/certified-operators-k6d6p" Nov 22 04:15:52 crc kubenswrapper[4922]: I1122 04:15:52.193010 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8323c7d4-0974-48c0-8e32-22f1a5299ec4-utilities\") pod \"redhat-marketplace-zlbns\" (UID: \"8323c7d4-0974-48c0-8e32-22f1a5299ec4\") " pod="openshift-marketplace/redhat-marketplace-zlbns" Nov 22 04:15:52 crc kubenswrapper[4922]: I1122 04:15:52.211579 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcszv\" (UniqueName: \"kubernetes.io/projected/b76b3b01-0899-47e6-b7e7-0bd4421686b3-kube-api-access-vcszv\") pod 
\"certified-operators-k6d6p\" (UID: \"b76b3b01-0899-47e6-b7e7-0bd4421686b3\") " pod="openshift-marketplace/certified-operators-k6d6p" Nov 22 04:15:52 crc kubenswrapper[4922]: I1122 04:15:52.211590 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt2mm\" (UniqueName: \"kubernetes.io/projected/8323c7d4-0974-48c0-8e32-22f1a5299ec4-kube-api-access-gt2mm\") pod \"redhat-marketplace-zlbns\" (UID: \"8323c7d4-0974-48c0-8e32-22f1a5299ec4\") " pod="openshift-marketplace/redhat-marketplace-zlbns" Nov 22 04:15:52 crc kubenswrapper[4922]: I1122 04:15:52.244145 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k6d6p" Nov 22 04:15:52 crc kubenswrapper[4922]: I1122 04:15:52.257386 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zlbns" Nov 22 04:15:57 crc kubenswrapper[4922]: I1122 04:15:57.727352 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k6d6p"] Nov 22 04:15:57 crc kubenswrapper[4922]: I1122 04:15:57.832301 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zlbns"] Nov 22 04:15:58 crc kubenswrapper[4922]: I1122 04:15:58.554356 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zlbns" event={"ID":"8323c7d4-0974-48c0-8e32-22f1a5299ec4","Type":"ContainerStarted","Data":"15928092bf1bb1452e0c4bda6f95597a6bb7f43260161c4ad4f9285411a4baf0"} Nov 22 04:15:58 crc kubenswrapper[4922]: I1122 04:15:58.555982 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k6d6p" event={"ID":"b76b3b01-0899-47e6-b7e7-0bd4421686b3","Type":"ContainerStarted","Data":"1d03834248a4cbaee6c15100b4ac73cb9a60b343971a1219e0cc1864ffe561b6"} Nov 22 04:16:00 crc kubenswrapper[4922]: I1122 04:16:00.575596 4922 generic.go:334] "Generic (PLEG): container finished" podID="8323c7d4-0974-48c0-8e32-22f1a5299ec4" containerID="213ecc4dd993c1a04e7a19c4321df756341e09cc7b0fd491d9511b1fe8731a1e" exitCode=0 Nov 22 04:16:00 crc kubenswrapper[4922]: I1122 04:16:00.575665 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zlbns" event={"ID":"8323c7d4-0974-48c0-8e32-22f1a5299ec4","Type":"ContainerDied","Data":"213ecc4dd993c1a04e7a19c4321df756341e09cc7b0fd491d9511b1fe8731a1e"} Nov 22 04:16:00 crc kubenswrapper[4922]: I1122 04:16:00.577981 4922 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 04:16:00 crc kubenswrapper[4922]: I1122 04:16:00.578243 4922 generic.go:334] "Generic (PLEG): container finished" podID="b76b3b01-0899-47e6-b7e7-0bd4421686b3" containerID="87dc0df46b0a6cb6b1f4d7903a81b965d0869aeac3b0cb1ed5f46668ee368023" exitCode=0 Nov 22 04:16:00 crc kubenswrapper[4922]: I1122 04:16:00.578270 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k6d6p" event={"ID":"b76b3b01-0899-47e6-b7e7-0bd4421686b3","Type":"ContainerDied","Data":"87dc0df46b0a6cb6b1f4d7903a81b965d0869aeac3b0cb1ed5f46668ee368023"} Nov 22 04:16:01 crc kubenswrapper[4922]: I1122 04:16:01.300335 4922 scope.go:117] "RemoveContainer" containerID="e1ed9e2d5f82f333d68ec19db77109001150ddfdec3bd44077036c7c173bdde5" Nov 22 04:16:01 crc kubenswrapper[4922]: E1122 04:16:01.300575 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 04:16:11 crc kubenswrapper[4922]: E1122 04:16:11.836718 4922 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296" Nov 22 04:16:11 crc kubenswrapper[4922]: E1122 04:16:11.837704 4922 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:container-00,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296,Command:[chroot /host bash -c echo 'TOOLBOX_NAME=toolbox-osp' > /root/.toolboxrc ; rm -rf \"/var/tmp/sos-osp\" && mkdir -p \"/var/tmp/sos-osp\" && sudo podman rm --force toolbox-osp; sudo --preserve-env podman pull --authfile /var/lib/kubelet/config.json registry.redhat.io/rhel9/support-tools && toolbox sos report --batch --all-logs --only-plugins block,cifs,crio,devicemapper,devices,firewall_tables,firewalld,iscsi,lvm2,memory,multipath,nfs,nis,nvme,podman,process,processor,selinux,scsi,udev,logs,crypto --tmp-dir=\"/var/tmp/sos-osp\" && if [[ \"$(ls /var/log/pods/*/{*.log.*,*/*.log.*} 2>/dev/null)\" != '' ]]; then tar --ignore-failed-read --warning=no-file-changed -cJf \"/var/tmp/sos-osp/podlogs.tar.xz\" --transform 's,^,podlogs/,' /var/log/pods/*/{*.log.*,*/*.log.*} || true; fi],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:TMOUT,Value:900,ValueFrom:nil,},EnvVar{Name:HOST,Value:/host,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host,ReadOnly:false,MountPath:/host,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wbtc5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod crc-debug-tq8rh_openshift-must-gather-f47g7(9c84f1ea-b8e3-42e7-9fc0-50c7e8722357): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 22 04:16:11 crc kubenswrapper[4922]: E1122 04:16:11.839575 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"container-00\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-must-gather-f47g7/crc-debug-tq8rh" podUID="9c84f1ea-b8e3-42e7-9fc0-50c7e8722357" Nov 22 04:16:12 crc kubenswrapper[4922]: I1122 04:16:12.300444 4922 
scope.go:117] "RemoveContainer" containerID="e1ed9e2d5f82f333d68ec19db77109001150ddfdec3bd44077036c7c173bdde5" Nov 22 04:16:12 crc kubenswrapper[4922]: E1122 04:16:12.301047 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 04:16:13 crc kubenswrapper[4922]: E1122 04:16:13.967196 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"container-00\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296\\\"\"" pod="openshift-must-gather-f47g7/crc-debug-tq8rh" podUID="9c84f1ea-b8e3-42e7-9fc0-50c7e8722357" Nov 22 04:16:17 crc kubenswrapper[4922]: I1122 04:16:17.788604 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zlbns" event={"ID":"8323c7d4-0974-48c0-8e32-22f1a5299ec4","Type":"ContainerStarted","Data":"c4d8494500731759956e7b2b580c4dd078d90b1554a6372b11d8c86d75b35788"} Nov 22 04:16:17 crc kubenswrapper[4922]: I1122 04:16:17.790808 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k6d6p" event={"ID":"b76b3b01-0899-47e6-b7e7-0bd4421686b3","Type":"ContainerStarted","Data":"46924f913af18e0c39558c84613f90c368572433a282383788c92e165bce7642"} Nov 22 04:16:19 crc kubenswrapper[4922]: I1122 04:16:19.810653 4922 generic.go:334] "Generic (PLEG): container finished" podID="8323c7d4-0974-48c0-8e32-22f1a5299ec4" containerID="c4d8494500731759956e7b2b580c4dd078d90b1554a6372b11d8c86d75b35788" exitCode=0 Nov 22 04:16:19 crc kubenswrapper[4922]: I1122 04:16:19.810715 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zlbns" event={"ID":"8323c7d4-0974-48c0-8e32-22f1a5299ec4","Type":"ContainerDied","Data":"c4d8494500731759956e7b2b580c4dd078d90b1554a6372b11d8c86d75b35788"} Nov 22 04:16:20 crc kubenswrapper[4922]: I1122 04:16:20.836952 4922 generic.go:334] "Generic (PLEG): container finished" podID="b76b3b01-0899-47e6-b7e7-0bd4421686b3" containerID="46924f913af18e0c39558c84613f90c368572433a282383788c92e165bce7642" exitCode=0 Nov 22 04:16:20 crc kubenswrapper[4922]: I1122 04:16:20.837616 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k6d6p" event={"ID":"b76b3b01-0899-47e6-b7e7-0bd4421686b3","Type":"ContainerDied","Data":"46924f913af18e0c39558c84613f90c368572433a282383788c92e165bce7642"} Nov 22 04:16:20 crc kubenswrapper[4922]: I1122 04:16:20.848204 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zlbns" event={"ID":"8323c7d4-0974-48c0-8e32-22f1a5299ec4","Type":"ContainerStarted","Data":"4cff3fa4cb94f2664cd4d00f8f7ac7a029b8cd46519f76f16d6cd0b0eb880576"} Nov 22 04:16:20 crc kubenswrapper[4922]: I1122 04:16:20.884294 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zlbns" podStartSLOduration=11.210352801 podStartE2EDuration="30.884269222s" podCreationTimestamp="2025-11-22 04:15:50 +0000 UTC" firstStartedPulling="2025-11-22 04:16:00.577706518 +0000 UTC 
m=+4996.616228430" lastFinishedPulling="2025-11-22 04:16:20.251622959 +0000 UTC m=+5016.290144851" observedRunningTime="2025-11-22 04:16:20.881338672 +0000 UTC m=+5016.919860564" watchObservedRunningTime="2025-11-22 04:16:20.884269222 +0000 UTC m=+5016.922791124" Nov 22 04:16:22 crc kubenswrapper[4922]: I1122 04:16:22.260099 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zlbns" Nov 22 04:16:22 crc kubenswrapper[4922]: I1122 04:16:22.260717 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zlbns" Nov 22 04:16:22 crc kubenswrapper[4922]: I1122 04:16:22.312876 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zlbns" Nov 22 04:16:22 crc kubenswrapper[4922]: I1122 04:16:22.871592 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k6d6p" event={"ID":"b76b3b01-0899-47e6-b7e7-0bd4421686b3","Type":"ContainerStarted","Data":"9b25dd5e881b7411b4630c980a6fd4b3d2d152767e785028971c568c7091d928"} Nov 22 04:16:22 crc kubenswrapper[4922]: I1122 04:16:22.896312 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-k6d6p" podStartSLOduration=12.909721747 podStartE2EDuration="33.896293653s" podCreationTimestamp="2025-11-22 04:15:49 +0000 UTC" firstStartedPulling="2025-11-22 04:16:00.581425607 +0000 UTC m=+4996.619947509" lastFinishedPulling="2025-11-22 04:16:21.567997523 +0000 UTC m=+5017.606519415" observedRunningTime="2025-11-22 04:16:22.888133837 +0000 UTC m=+5018.926655729" watchObservedRunningTime="2025-11-22 04:16:22.896293653 +0000 UTC m=+5018.934815545" Nov 22 04:16:25 crc kubenswrapper[4922]: I1122 04:16:25.311962 4922 scope.go:117] "RemoveContainer" containerID="e1ed9e2d5f82f333d68ec19db77109001150ddfdec3bd44077036c7c173bdde5" Nov 22 04:16:25 crc kubenswrapper[4922]: E1122 04:16:25.313058 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 04:16:30 crc kubenswrapper[4922]: I1122 04:16:30.950757 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f47g7/crc-debug-tq8rh" event={"ID":"9c84f1ea-b8e3-42e7-9fc0-50c7e8722357","Type":"ContainerStarted","Data":"b1c2b8c23d344aaf282dcdefb4ea2786b54e5e5d3d0db1b71021365b2ee9364a"} Nov 22 04:16:30 crc kubenswrapper[4922]: I1122 04:16:30.970414 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-f47g7/crc-debug-tq8rh" podStartSLOduration=2.324759774 podStartE2EDuration="1m1.970394771s" podCreationTimestamp="2025-11-22 04:15:29 +0000 UTC" firstStartedPulling="2025-11-22 04:15:30.349556899 +0000 UTC m=+4966.388078791" lastFinishedPulling="2025-11-22 04:16:29.995191896 +0000 UTC m=+5026.033713788" observedRunningTime="2025-11-22 04:16:30.966047096 +0000 UTC m=+5027.004568998" watchObservedRunningTime="2025-11-22 04:16:30.970394771 +0000 UTC m=+5027.008916663" Nov 22 04:16:32 crc kubenswrapper[4922]: I1122 04:16:32.244255 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-k6d6p" Nov 22 04:16:32 crc kubenswrapper[4922]: I1122 04:16:32.246185 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-k6d6p" Nov 22 04:16:32 crc kubenswrapper[4922]: I1122 04:16:32.329505 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-k6d6p" Nov 22 04:16:32 crc kubenswrapper[4922]: I1122 04:16:32.332383 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zlbns" Nov 22 04:16:32 crc kubenswrapper[4922]: I1122 04:16:32.568280 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zlbns"] Nov 22 04:16:32 crc kubenswrapper[4922]: I1122 04:16:32.970550 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zlbns" podUID="8323c7d4-0974-48c0-8e32-22f1a5299ec4" containerName="registry-server" containerID="cri-o://4cff3fa4cb94f2664cd4d00f8f7ac7a029b8cd46519f76f16d6cd0b0eb880576" gracePeriod=2 Nov 22 04:16:33 crc kubenswrapper[4922]: I1122 04:16:33.462273 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-k6d6p" Nov 22 04:16:33 crc kubenswrapper[4922]: I1122 04:16:33.985156 4922 generic.go:334] "Generic (PLEG): container finished" podID="8323c7d4-0974-48c0-8e32-22f1a5299ec4" containerID="4cff3fa4cb94f2664cd4d00f8f7ac7a029b8cd46519f76f16d6cd0b0eb880576" exitCode=0 Nov 22 04:16:33 crc kubenswrapper[4922]: I1122 04:16:33.985360 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zlbns" event={"ID":"8323c7d4-0974-48c0-8e32-22f1a5299ec4","Type":"ContainerDied","Data":"4cff3fa4cb94f2664cd4d00f8f7ac7a029b8cd46519f76f16d6cd0b0eb880576"} Nov 22 04:16:34 crc kubenswrapper[4922]: I1122 04:16:34.138343 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zlbns" Nov 22 04:16:34 crc kubenswrapper[4922]: I1122 04:16:34.183463 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gt2mm\" (UniqueName: \"kubernetes.io/projected/8323c7d4-0974-48c0-8e32-22f1a5299ec4-kube-api-access-gt2mm\") pod \"8323c7d4-0974-48c0-8e32-22f1a5299ec4\" (UID: \"8323c7d4-0974-48c0-8e32-22f1a5299ec4\") " Nov 22 04:16:34 crc kubenswrapper[4922]: I1122 04:16:34.183596 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8323c7d4-0974-48c0-8e32-22f1a5299ec4-catalog-content\") pod \"8323c7d4-0974-48c0-8e32-22f1a5299ec4\" (UID: \"8323c7d4-0974-48c0-8e32-22f1a5299ec4\") " Nov 22 04:16:34 crc kubenswrapper[4922]: I1122 04:16:34.183706 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8323c7d4-0974-48c0-8e32-22f1a5299ec4-utilities\") pod \"8323c7d4-0974-48c0-8e32-22f1a5299ec4\" (UID: \"8323c7d4-0974-48c0-8e32-22f1a5299ec4\") " Nov 22 04:16:34 crc kubenswrapper[4922]: I1122 04:16:34.185052 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8323c7d4-0974-48c0-8e32-22f1a5299ec4-utilities" (OuterVolumeSpecName: "utilities") pod "8323c7d4-0974-48c0-8e32-22f1a5299ec4" (UID: "8323c7d4-0974-48c0-8e32-22f1a5299ec4"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:16:34 crc kubenswrapper[4922]: I1122 04:16:34.193115 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8323c7d4-0974-48c0-8e32-22f1a5299ec4-kube-api-access-gt2mm" (OuterVolumeSpecName: "kube-api-access-gt2mm") pod "8323c7d4-0974-48c0-8e32-22f1a5299ec4" (UID: "8323c7d4-0974-48c0-8e32-22f1a5299ec4"). InnerVolumeSpecName "kube-api-access-gt2mm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:16:34 crc kubenswrapper[4922]: I1122 04:16:34.208961 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8323c7d4-0974-48c0-8e32-22f1a5299ec4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8323c7d4-0974-48c0-8e32-22f1a5299ec4" (UID: "8323c7d4-0974-48c0-8e32-22f1a5299ec4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:16:34 crc kubenswrapper[4922]: I1122 04:16:34.285770 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8323c7d4-0974-48c0-8e32-22f1a5299ec4-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:16:34 crc kubenswrapper[4922]: I1122 04:16:34.285883 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gt2mm\" (UniqueName: \"kubernetes.io/projected/8323c7d4-0974-48c0-8e32-22f1a5299ec4-kube-api-access-gt2mm\") on node \"crc\" DevicePath \"\"" Nov 22 04:16:34 crc kubenswrapper[4922]: I1122 04:16:34.285895 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8323c7d4-0974-48c0-8e32-22f1a5299ec4-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:16:35 crc kubenswrapper[4922]: I1122 04:16:35.002380 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zlbns" Nov 22 04:16:35 crc kubenswrapper[4922]: I1122 04:16:35.002508 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zlbns" event={"ID":"8323c7d4-0974-48c0-8e32-22f1a5299ec4","Type":"ContainerDied","Data":"15928092bf1bb1452e0c4bda6f95597a6bb7f43260161c4ad4f9285411a4baf0"} Nov 22 04:16:35 crc kubenswrapper[4922]: I1122 04:16:35.002759 4922 scope.go:117] "RemoveContainer" containerID="4cff3fa4cb94f2664cd4d00f8f7ac7a029b8cd46519f76f16d6cd0b0eb880576" Nov 22 04:16:35 crc kubenswrapper[4922]: I1122 04:16:35.037192 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zlbns"] Nov 22 04:16:35 crc kubenswrapper[4922]: I1122 04:16:35.045518 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zlbns"] Nov 22 04:16:35 crc kubenswrapper[4922]: I1122 04:16:35.311613 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8323c7d4-0974-48c0-8e32-22f1a5299ec4" path="/var/lib/kubelet/pods/8323c7d4-0974-48c0-8e32-22f1a5299ec4/volumes" Nov 22 04:16:35 crc kubenswrapper[4922]: I1122 04:16:35.770242 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k6d6p"] Nov 22 04:16:36 crc kubenswrapper[4922]: I1122 04:16:36.011394 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-k6d6p" podUID="b76b3b01-0899-47e6-b7e7-0bd4421686b3" containerName="registry-server" containerID="cri-o://9b25dd5e881b7411b4630c980a6fd4b3d2d152767e785028971c568c7091d928" gracePeriod=2 Nov 22 04:16:36 crc kubenswrapper[4922]: E1122 04:16:36.355926 4922 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb76b3b01_0899_47e6_b7e7_0bd4421686b3.slice/crio-conmon-9b25dd5e881b7411b4630c980a6fd4b3d2d152767e785028971c568c7091d928.scope\": RecentStats: unable to find data in memory cache]" Nov 22 04:16:37 crc kubenswrapper[4922]: I1122 04:16:37.023773 4922 generic.go:334] "Generic (PLEG): container finished" podID="b76b3b01-0899-47e6-b7e7-0bd4421686b3" containerID="9b25dd5e881b7411b4630c980a6fd4b3d2d152767e785028971c568c7091d928" exitCode=0 Nov 22 04:16:37 crc kubenswrapper[4922]: I1122 04:16:37.023869 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k6d6p" event={"ID":"b76b3b01-0899-47e6-b7e7-0bd4421686b3","Type":"ContainerDied","Data":"9b25dd5e881b7411b4630c980a6fd4b3d2d152767e785028971c568c7091d928"} Nov 22 04:16:40 crc kubenswrapper[4922]: I1122 04:16:40.301088 4922 scope.go:117] "RemoveContainer" containerID="e1ed9e2d5f82f333d68ec19db77109001150ddfdec3bd44077036c7c173bdde5" Nov 22 04:16:40 crc kubenswrapper[4922]: E1122 04:16:40.301713 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 04:16:42 crc kubenswrapper[4922]: E1122 04:16:42.249894 4922 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container 
is not created or running: checking if PID of 9b25dd5e881b7411b4630c980a6fd4b3d2d152767e785028971c568c7091d928 is running failed: container process not found" containerID="9b25dd5e881b7411b4630c980a6fd4b3d2d152767e785028971c568c7091d928" cmd=["grpc_health_probe","-addr=:50051"] Nov 22 04:16:42 crc kubenswrapper[4922]: E1122 04:16:42.254050 4922 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9b25dd5e881b7411b4630c980a6fd4b3d2d152767e785028971c568c7091d928 is running failed: container process not found" containerID="9b25dd5e881b7411b4630c980a6fd4b3d2d152767e785028971c568c7091d928" cmd=["grpc_health_probe","-addr=:50051"] Nov 22 04:16:42 crc kubenswrapper[4922]: E1122 04:16:42.254571 4922 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9b25dd5e881b7411b4630c980a6fd4b3d2d152767e785028971c568c7091d928 is running failed: container process not found" containerID="9b25dd5e881b7411b4630c980a6fd4b3d2d152767e785028971c568c7091d928" cmd=["grpc_health_probe","-addr=:50051"] Nov 22 04:16:42 crc kubenswrapper[4922]: E1122 04:16:42.254617 4922 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9b25dd5e881b7411b4630c980a6fd4b3d2d152767e785028971c568c7091d928 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-k6d6p" podUID="b76b3b01-0899-47e6-b7e7-0bd4421686b3" containerName="registry-server" Nov 22 04:16:43 crc kubenswrapper[4922]: I1122 04:16:43.340913 4922 scope.go:117] "RemoveContainer" containerID="c4d8494500731759956e7b2b580c4dd078d90b1554a6372b11d8c86d75b35788" Nov 22 04:16:43 crc kubenswrapper[4922]: I1122 04:16:43.833081 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k6d6p" Nov 22 04:16:43 crc kubenswrapper[4922]: I1122 04:16:43.866073 4922 scope.go:117] "RemoveContainer" containerID="213ecc4dd993c1a04e7a19c4321df756341e09cc7b0fd491d9511b1fe8731a1e" Nov 22 04:16:43 crc kubenswrapper[4922]: I1122 04:16:43.993383 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcszv\" (UniqueName: \"kubernetes.io/projected/b76b3b01-0899-47e6-b7e7-0bd4421686b3-kube-api-access-vcszv\") pod \"b76b3b01-0899-47e6-b7e7-0bd4421686b3\" (UID: \"b76b3b01-0899-47e6-b7e7-0bd4421686b3\") " Nov 22 04:16:43 crc kubenswrapper[4922]: I1122 04:16:43.993540 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b76b3b01-0899-47e6-b7e7-0bd4421686b3-utilities\") pod \"b76b3b01-0899-47e6-b7e7-0bd4421686b3\" (UID: \"b76b3b01-0899-47e6-b7e7-0bd4421686b3\") " Nov 22 04:16:43 crc kubenswrapper[4922]: I1122 04:16:43.993632 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b76b3b01-0899-47e6-b7e7-0bd4421686b3-catalog-content\") pod \"b76b3b01-0899-47e6-b7e7-0bd4421686b3\" (UID: \"b76b3b01-0899-47e6-b7e7-0bd4421686b3\") " Nov 22 04:16:43 crc kubenswrapper[4922]: I1122 04:16:43.994474 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b76b3b01-0899-47e6-b7e7-0bd4421686b3-utilities" (OuterVolumeSpecName: "utilities") pod "b76b3b01-0899-47e6-b7e7-0bd4421686b3" (UID: "b76b3b01-0899-47e6-b7e7-0bd4421686b3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:16:44 crc kubenswrapper[4922]: I1122 04:16:44.001991 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b76b3b01-0899-47e6-b7e7-0bd4421686b3-kube-api-access-vcszv" (OuterVolumeSpecName: "kube-api-access-vcszv") pod "b76b3b01-0899-47e6-b7e7-0bd4421686b3" (UID: "b76b3b01-0899-47e6-b7e7-0bd4421686b3"). InnerVolumeSpecName "kube-api-access-vcszv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:16:44 crc kubenswrapper[4922]: I1122 04:16:44.058514 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b76b3b01-0899-47e6-b7e7-0bd4421686b3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b76b3b01-0899-47e6-b7e7-0bd4421686b3" (UID: "b76b3b01-0899-47e6-b7e7-0bd4421686b3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:16:44 crc kubenswrapper[4922]: I1122 04:16:44.093265 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k6d6p" event={"ID":"b76b3b01-0899-47e6-b7e7-0bd4421686b3","Type":"ContainerDied","Data":"1d03834248a4cbaee6c15100b4ac73cb9a60b343971a1219e0cc1864ffe561b6"} Nov 22 04:16:44 crc kubenswrapper[4922]: I1122 04:16:44.093319 4922 scope.go:117] "RemoveContainer" containerID="9b25dd5e881b7411b4630c980a6fd4b3d2d152767e785028971c568c7091d928" Nov 22 04:16:44 crc kubenswrapper[4922]: I1122 04:16:44.093608 4922 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k6d6p" Nov 22 04:16:44 crc kubenswrapper[4922]: I1122 04:16:44.095306 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b76b3b01-0899-47e6-b7e7-0bd4421686b3-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:16:44 crc kubenswrapper[4922]: I1122 04:16:44.095332 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b76b3b01-0899-47e6-b7e7-0bd4421686b3-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:16:44 crc kubenswrapper[4922]: I1122 04:16:44.095344 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcszv\" (UniqueName: \"kubernetes.io/projected/b76b3b01-0899-47e6-b7e7-0bd4421686b3-kube-api-access-vcszv\") on node \"crc\" DevicePath \"\"" Nov 22 04:16:44 crc kubenswrapper[4922]: I1122 04:16:44.128571 4922 scope.go:117] "RemoveContainer" containerID="46924f913af18e0c39558c84613f90c368572433a282383788c92e165bce7642" Nov 22 04:16:44 crc kubenswrapper[4922]: I1122 04:16:44.130916 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k6d6p"] Nov 22 04:16:44 crc kubenswrapper[4922]: I1122 04:16:44.139494 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-k6d6p"] Nov 22 04:16:44 crc kubenswrapper[4922]: I1122 04:16:44.149196 4922 scope.go:117] "RemoveContainer" containerID="87dc0df46b0a6cb6b1f4d7903a81b965d0869aeac3b0cb1ed5f46668ee368023" Nov 22 04:16:45 crc kubenswrapper[4922]: I1122 04:16:45.313059 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b76b3b01-0899-47e6-b7e7-0bd4421686b3" path="/var/lib/kubelet/pods/b76b3b01-0899-47e6-b7e7-0bd4421686b3/volumes" Nov 22 04:16:55 crc kubenswrapper[4922]: I1122 04:16:55.314177 4922 scope.go:117] "RemoveContainer" containerID="e1ed9e2d5f82f333d68ec19db77109001150ddfdec3bd44077036c7c173bdde5" Nov 22 04:16:55 crc kubenswrapper[4922]: E1122 04:16:55.315750 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 04:17:06 crc kubenswrapper[4922]: I1122 04:17:06.301688 4922 scope.go:117] "RemoveContainer" containerID="e1ed9e2d5f82f333d68ec19db77109001150ddfdec3bd44077036c7c173bdde5" Nov 22 04:17:06 crc kubenswrapper[4922]: E1122 04:17:06.302981 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 04:17:17 crc kubenswrapper[4922]: I1122 04:17:17.301060 4922 scope.go:117] "RemoveContainer" containerID="e1ed9e2d5f82f333d68ec19db77109001150ddfdec3bd44077036c7c173bdde5" Nov 22 04:17:17 crc kubenswrapper[4922]: E1122 04:17:17.302234 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 04:17:23 crc kubenswrapper[4922]: I1122 04:17:23.564252 4922 generic.go:334] "Generic (PLEG): container finished" podID="9c84f1ea-b8e3-42e7-9fc0-50c7e8722357" containerID="b1c2b8c23d344aaf282dcdefb4ea2786b54e5e5d3d0db1b71021365b2ee9364a" exitCode=0 Nov 22 04:17:23 crc kubenswrapper[4922]: I1122 04:17:23.564915 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f47g7/crc-debug-tq8rh" event={"ID":"9c84f1ea-b8e3-42e7-9fc0-50c7e8722357","Type":"ContainerDied","Data":"b1c2b8c23d344aaf282dcdefb4ea2786b54e5e5d3d0db1b71021365b2ee9364a"} Nov 22 04:17:24 crc kubenswrapper[4922]: I1122 04:17:24.715573 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f47g7/crc-debug-tq8rh" Nov 22 04:17:24 crc kubenswrapper[4922]: I1122 04:17:24.754624 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-f47g7/crc-debug-tq8rh"] Nov 22 04:17:24 crc kubenswrapper[4922]: I1122 04:17:24.763692 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-f47g7/crc-debug-tq8rh"] Nov 22 04:17:24 crc kubenswrapper[4922]: I1122 04:17:24.816620 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbtc5\" (UniqueName: \"kubernetes.io/projected/9c84f1ea-b8e3-42e7-9fc0-50c7e8722357-kube-api-access-wbtc5\") pod \"9c84f1ea-b8e3-42e7-9fc0-50c7e8722357\" (UID: \"9c84f1ea-b8e3-42e7-9fc0-50c7e8722357\") " Nov 22 04:17:24 crc kubenswrapper[4922]: I1122 04:17:24.817318 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9c84f1ea-b8e3-42e7-9fc0-50c7e8722357-host\") pod \"9c84f1ea-b8e3-42e7-9fc0-50c7e8722357\" (UID: \"9c84f1ea-b8e3-42e7-9fc0-50c7e8722357\") " Nov 22 04:17:24 crc kubenswrapper[4922]: I1122 04:17:24.817446 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9c84f1ea-b8e3-42e7-9fc0-50c7e8722357-host" (OuterVolumeSpecName: "host") pod "9c84f1ea-b8e3-42e7-9fc0-50c7e8722357" (UID: "9c84f1ea-b8e3-42e7-9fc0-50c7e8722357"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 04:17:24 crc kubenswrapper[4922]: I1122 04:17:24.818268 4922 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9c84f1ea-b8e3-42e7-9fc0-50c7e8722357-host\") on node \"crc\" DevicePath \"\"" Nov 22 04:17:24 crc kubenswrapper[4922]: I1122 04:17:24.823157 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c84f1ea-b8e3-42e7-9fc0-50c7e8722357-kube-api-access-wbtc5" (OuterVolumeSpecName: "kube-api-access-wbtc5") pod "9c84f1ea-b8e3-42e7-9fc0-50c7e8722357" (UID: "9c84f1ea-b8e3-42e7-9fc0-50c7e8722357"). InnerVolumeSpecName "kube-api-access-wbtc5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:17:24 crc kubenswrapper[4922]: I1122 04:17:24.919788 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbtc5\" (UniqueName: \"kubernetes.io/projected/9c84f1ea-b8e3-42e7-9fc0-50c7e8722357-kube-api-access-wbtc5\") on node \"crc\" DevicePath \"\"" Nov 22 04:17:25 crc kubenswrapper[4922]: I1122 04:17:25.322084 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c84f1ea-b8e3-42e7-9fc0-50c7e8722357" path="/var/lib/kubelet/pods/9c84f1ea-b8e3-42e7-9fc0-50c7e8722357/volumes" Nov 22 04:17:25 crc kubenswrapper[4922]: I1122 04:17:25.595383 4922 scope.go:117] "RemoveContainer" containerID="b1c2b8c23d344aaf282dcdefb4ea2786b54e5e5d3d0db1b71021365b2ee9364a" Nov 22 04:17:25 crc kubenswrapper[4922]: I1122 04:17:25.595533 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f47g7/crc-debug-tq8rh" Nov 22 04:17:25 crc kubenswrapper[4922]: I1122 04:17:25.978150 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-f47g7/crc-debug-rns9l"] Nov 22 04:17:25 crc kubenswrapper[4922]: E1122 04:17:25.978492 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8323c7d4-0974-48c0-8e32-22f1a5299ec4" containerName="extract-content" Nov 22 04:17:25 crc kubenswrapper[4922]: I1122 04:17:25.978504 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="8323c7d4-0974-48c0-8e32-22f1a5299ec4" containerName="extract-content" Nov 22 04:17:25 crc kubenswrapper[4922]: E1122 04:17:25.978515 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b76b3b01-0899-47e6-b7e7-0bd4421686b3" containerName="extract-utilities" Nov 22 04:17:25 crc kubenswrapper[4922]: I1122 04:17:25.978521 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="b76b3b01-0899-47e6-b7e7-0bd4421686b3" containerName="extract-utilities" Nov 22 04:17:25 crc kubenswrapper[4922]: E1122 04:17:25.978537 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b76b3b01-0899-47e6-b7e7-0bd4421686b3" containerName="extract-content" Nov 22 04:17:25 crc kubenswrapper[4922]: I1122 04:17:25.978542 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="b76b3b01-0899-47e6-b7e7-0bd4421686b3" containerName="extract-content" Nov 22 04:17:25 crc kubenswrapper[4922]: E1122 04:17:25.978551 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8323c7d4-0974-48c0-8e32-22f1a5299ec4" containerName="registry-server" Nov 22 04:17:25 crc kubenswrapper[4922]: I1122 04:17:25.978557 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="8323c7d4-0974-48c0-8e32-22f1a5299ec4" containerName="registry-server" Nov 22 04:17:25 crc kubenswrapper[4922]: E1122 04:17:25.978585 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c84f1ea-b8e3-42e7-9fc0-50c7e8722357" containerName="container-00" Nov 22 04:17:25 crc kubenswrapper[4922]: I1122 04:17:25.978591 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c84f1ea-b8e3-42e7-9fc0-50c7e8722357" containerName="container-00" Nov 22 04:17:25 crc kubenswrapper[4922]: E1122 04:17:25.978599 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b76b3b01-0899-47e6-b7e7-0bd4421686b3" containerName="registry-server" Nov 22 04:17:25 crc kubenswrapper[4922]: I1122 04:17:25.978605 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="b76b3b01-0899-47e6-b7e7-0bd4421686b3" containerName="registry-server" Nov 22 04:17:25 crc kubenswrapper[4922]: E1122 
04:17:25.978618 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8323c7d4-0974-48c0-8e32-22f1a5299ec4" containerName="extract-utilities" Nov 22 04:17:25 crc kubenswrapper[4922]: I1122 04:17:25.978623 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="8323c7d4-0974-48c0-8e32-22f1a5299ec4" containerName="extract-utilities" Nov 22 04:17:25 crc kubenswrapper[4922]: I1122 04:17:25.978827 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="8323c7d4-0974-48c0-8e32-22f1a5299ec4" containerName="registry-server" Nov 22 04:17:25 crc kubenswrapper[4922]: I1122 04:17:25.978852 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="b76b3b01-0899-47e6-b7e7-0bd4421686b3" containerName="registry-server" Nov 22 04:17:25 crc kubenswrapper[4922]: I1122 04:17:25.978864 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c84f1ea-b8e3-42e7-9fc0-50c7e8722357" containerName="container-00" Nov 22 04:17:25 crc kubenswrapper[4922]: I1122 04:17:25.979479 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f47g7/crc-debug-rns9l" Nov 22 04:17:26 crc kubenswrapper[4922]: I1122 04:17:26.043764 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqkp2\" (UniqueName: \"kubernetes.io/projected/aecc36e9-e540-4eff-94c6-ff9e1994d61a-kube-api-access-bqkp2\") pod \"crc-debug-rns9l\" (UID: \"aecc36e9-e540-4eff-94c6-ff9e1994d61a\") " pod="openshift-must-gather-f47g7/crc-debug-rns9l" Nov 22 04:17:26 crc kubenswrapper[4922]: I1122 04:17:26.043954 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aecc36e9-e540-4eff-94c6-ff9e1994d61a-host\") pod \"crc-debug-rns9l\" (UID: \"aecc36e9-e540-4eff-94c6-ff9e1994d61a\") " pod="openshift-must-gather-f47g7/crc-debug-rns9l" Nov 22 04:17:26 crc kubenswrapper[4922]: I1122 04:17:26.146565 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqkp2\" (UniqueName: \"kubernetes.io/projected/aecc36e9-e540-4eff-94c6-ff9e1994d61a-kube-api-access-bqkp2\") pod \"crc-debug-rns9l\" (UID: \"aecc36e9-e540-4eff-94c6-ff9e1994d61a\") " pod="openshift-must-gather-f47g7/crc-debug-rns9l" Nov 22 04:17:26 crc kubenswrapper[4922]: I1122 04:17:26.146781 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aecc36e9-e540-4eff-94c6-ff9e1994d61a-host\") pod \"crc-debug-rns9l\" (UID: \"aecc36e9-e540-4eff-94c6-ff9e1994d61a\") " pod="openshift-must-gather-f47g7/crc-debug-rns9l" Nov 22 04:17:26 crc kubenswrapper[4922]: I1122 04:17:26.146975 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aecc36e9-e540-4eff-94c6-ff9e1994d61a-host\") pod \"crc-debug-rns9l\" (UID: \"aecc36e9-e540-4eff-94c6-ff9e1994d61a\") " pod="openshift-must-gather-f47g7/crc-debug-rns9l" Nov 22 04:17:26 crc kubenswrapper[4922]: I1122 04:17:26.182153 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqkp2\" (UniqueName: \"kubernetes.io/projected/aecc36e9-e540-4eff-94c6-ff9e1994d61a-kube-api-access-bqkp2\") pod \"crc-debug-rns9l\" (UID: \"aecc36e9-e540-4eff-94c6-ff9e1994d61a\") " pod="openshift-must-gather-f47g7/crc-debug-rns9l" Nov 22 04:17:26 crc kubenswrapper[4922]: I1122 04:17:26.299292 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f47g7/crc-debug-rns9l" Nov 22 04:17:26 crc kubenswrapper[4922]: I1122 04:17:26.611580 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f47g7/crc-debug-rns9l" event={"ID":"aecc36e9-e540-4eff-94c6-ff9e1994d61a","Type":"ContainerStarted","Data":"b0f0cd837dfb9912ddb80af43739b236cd6f2b65cf81e03c9a7a0882ef27011a"} Nov 22 04:17:26 crc kubenswrapper[4922]: I1122 04:17:26.611738 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f47g7/crc-debug-rns9l" event={"ID":"aecc36e9-e540-4eff-94c6-ff9e1994d61a","Type":"ContainerStarted","Data":"923ff231fc135a0cc46f5c68a66cbc64eafe07eb385103355cd36c5a0c199e0a"} Nov 22 04:17:26 crc kubenswrapper[4922]: I1122 04:17:26.635986 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-f47g7/crc-debug-rns9l" podStartSLOduration=1.635961473 podStartE2EDuration="1.635961473s" podCreationTimestamp="2025-11-22 04:17:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:17:26.629134669 +0000 UTC m=+5082.667656561" watchObservedRunningTime="2025-11-22 04:17:26.635961473 +0000 UTC m=+5082.674483365" Nov 22 04:17:27 crc kubenswrapper[4922]: I1122 04:17:27.624738 4922 generic.go:334] "Generic (PLEG): container finished" podID="aecc36e9-e540-4eff-94c6-ff9e1994d61a" containerID="b0f0cd837dfb9912ddb80af43739b236cd6f2b65cf81e03c9a7a0882ef27011a" exitCode=0 Nov 22 04:17:27 crc kubenswrapper[4922]: I1122 04:17:27.624987 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f47g7/crc-debug-rns9l" event={"ID":"aecc36e9-e540-4eff-94c6-ff9e1994d61a","Type":"ContainerDied","Data":"b0f0cd837dfb9912ddb80af43739b236cd6f2b65cf81e03c9a7a0882ef27011a"} Nov 22 04:17:28 crc kubenswrapper[4922]: I1122 04:17:28.731787 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f47g7/crc-debug-rns9l" Nov 22 04:17:28 crc kubenswrapper[4922]: I1122 04:17:28.791706 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqkp2\" (UniqueName: \"kubernetes.io/projected/aecc36e9-e540-4eff-94c6-ff9e1994d61a-kube-api-access-bqkp2\") pod \"aecc36e9-e540-4eff-94c6-ff9e1994d61a\" (UID: \"aecc36e9-e540-4eff-94c6-ff9e1994d61a\") " Nov 22 04:17:28 crc kubenswrapper[4922]: I1122 04:17:28.791799 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aecc36e9-e540-4eff-94c6-ff9e1994d61a-host\") pod \"aecc36e9-e540-4eff-94c6-ff9e1994d61a\" (UID: \"aecc36e9-e540-4eff-94c6-ff9e1994d61a\") " Nov 22 04:17:28 crc kubenswrapper[4922]: I1122 04:17:28.792003 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aecc36e9-e540-4eff-94c6-ff9e1994d61a-host" (OuterVolumeSpecName: "host") pod "aecc36e9-e540-4eff-94c6-ff9e1994d61a" (UID: "aecc36e9-e540-4eff-94c6-ff9e1994d61a"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 04:17:28 crc kubenswrapper[4922]: I1122 04:17:28.792457 4922 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aecc36e9-e540-4eff-94c6-ff9e1994d61a-host\") on node \"crc\" DevicePath \"\"" Nov 22 04:17:28 crc kubenswrapper[4922]: I1122 04:17:28.797055 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aecc36e9-e540-4eff-94c6-ff9e1994d61a-kube-api-access-bqkp2" (OuterVolumeSpecName: "kube-api-access-bqkp2") pod "aecc36e9-e540-4eff-94c6-ff9e1994d61a" (UID: "aecc36e9-e540-4eff-94c6-ff9e1994d61a"). InnerVolumeSpecName "kube-api-access-bqkp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:17:28 crc kubenswrapper[4922]: I1122 04:17:28.893864 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqkp2\" (UniqueName: \"kubernetes.io/projected/aecc36e9-e540-4eff-94c6-ff9e1994d61a-kube-api-access-bqkp2\") on node \"crc\" DevicePath \"\"" Nov 22 04:17:29 crc kubenswrapper[4922]: I1122 04:17:29.304379 4922 scope.go:117] "RemoveContainer" containerID="e1ed9e2d5f82f333d68ec19db77109001150ddfdec3bd44077036c7c173bdde5" Nov 22 04:17:29 crc kubenswrapper[4922]: E1122 04:17:29.305043 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 04:17:29 crc kubenswrapper[4922]: I1122 04:17:29.540898 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-f47g7/crc-debug-rns9l"] Nov 22 04:17:29 crc kubenswrapper[4922]: I1122 04:17:29.546526 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-f47g7/crc-debug-rns9l"] Nov 22 04:17:29 crc kubenswrapper[4922]: I1122 04:17:29.645140 4922 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="923ff231fc135a0cc46f5c68a66cbc64eafe07eb385103355cd36c5a0c199e0a" Nov 22 04:17:29 crc kubenswrapper[4922]: I1122 04:17:29.645236 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f47g7/crc-debug-rns9l" Nov 22 04:17:30 crc kubenswrapper[4922]: I1122 04:17:30.769381 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-f47g7/crc-debug-4z2nc"] Nov 22 04:17:30 crc kubenswrapper[4922]: E1122 04:17:30.769826 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aecc36e9-e540-4eff-94c6-ff9e1994d61a" containerName="container-00" Nov 22 04:17:30 crc kubenswrapper[4922]: I1122 04:17:30.769840 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="aecc36e9-e540-4eff-94c6-ff9e1994d61a" containerName="container-00" Nov 22 04:17:30 crc kubenswrapper[4922]: I1122 04:17:30.770111 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="aecc36e9-e540-4eff-94c6-ff9e1994d61a" containerName="container-00" Nov 22 04:17:30 crc kubenswrapper[4922]: I1122 04:17:30.770990 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f47g7/crc-debug-4z2nc" Nov 22 04:17:30 crc kubenswrapper[4922]: I1122 04:17:30.834095 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dp4b\" (UniqueName: \"kubernetes.io/projected/9b49d89b-f754-4012-bbf0-7e2cbdb40fc3-kube-api-access-8dp4b\") pod \"crc-debug-4z2nc\" (UID: \"9b49d89b-f754-4012-bbf0-7e2cbdb40fc3\") " pod="openshift-must-gather-f47g7/crc-debug-4z2nc" Nov 22 04:17:30 crc kubenswrapper[4922]: I1122 04:17:30.834433 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9b49d89b-f754-4012-bbf0-7e2cbdb40fc3-host\") pod \"crc-debug-4z2nc\" (UID: \"9b49d89b-f754-4012-bbf0-7e2cbdb40fc3\") " pod="openshift-must-gather-f47g7/crc-debug-4z2nc" Nov 22 04:17:30 crc kubenswrapper[4922]: I1122 04:17:30.937045 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9b49d89b-f754-4012-bbf0-7e2cbdb40fc3-host\") pod \"crc-debug-4z2nc\" (UID: \"9b49d89b-f754-4012-bbf0-7e2cbdb40fc3\") " pod="openshift-must-gather-f47g7/crc-debug-4z2nc" Nov 22 04:17:30 crc kubenswrapper[4922]: I1122 04:17:30.937219 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9b49d89b-f754-4012-bbf0-7e2cbdb40fc3-host\") pod \"crc-debug-4z2nc\" (UID: \"9b49d89b-f754-4012-bbf0-7e2cbdb40fc3\") " pod="openshift-must-gather-f47g7/crc-debug-4z2nc" Nov 22 04:17:30 crc kubenswrapper[4922]: I1122 04:17:30.937251 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dp4b\" (UniqueName: \"kubernetes.io/projected/9b49d89b-f754-4012-bbf0-7e2cbdb40fc3-kube-api-access-8dp4b\") pod \"crc-debug-4z2nc\" (UID: \"9b49d89b-f754-4012-bbf0-7e2cbdb40fc3\") " pod="openshift-must-gather-f47g7/crc-debug-4z2nc" Nov 22 04:17:30 crc kubenswrapper[4922]: I1122 04:17:30.968122 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dp4b\" (UniqueName: \"kubernetes.io/projected/9b49d89b-f754-4012-bbf0-7e2cbdb40fc3-kube-api-access-8dp4b\") pod \"crc-debug-4z2nc\" (UID: \"9b49d89b-f754-4012-bbf0-7e2cbdb40fc3\") " pod="openshift-must-gather-f47g7/crc-debug-4z2nc" Nov 22 04:17:31 crc kubenswrapper[4922]: I1122 04:17:31.097999 4922 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f47g7/crc-debug-4z2nc" Nov 22 04:17:31 crc kubenswrapper[4922]: W1122 04:17:31.157273 4922 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b49d89b_f754_4012_bbf0_7e2cbdb40fc3.slice/crio-af4feff90240d62309f7e348bd8c12e0184d19b05c4bb2761455d18b17c4f08f WatchSource:0}: Error finding container af4feff90240d62309f7e348bd8c12e0184d19b05c4bb2761455d18b17c4f08f: Status 404 returned error can't find the container with id af4feff90240d62309f7e348bd8c12e0184d19b05c4bb2761455d18b17c4f08f Nov 22 04:17:31 crc kubenswrapper[4922]: I1122 04:17:31.316803 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aecc36e9-e540-4eff-94c6-ff9e1994d61a" path="/var/lib/kubelet/pods/aecc36e9-e540-4eff-94c6-ff9e1994d61a/volumes" Nov 22 04:17:31 crc kubenswrapper[4922]: I1122 04:17:31.670525 4922 generic.go:334] "Generic (PLEG): container finished" podID="9b49d89b-f754-4012-bbf0-7e2cbdb40fc3" containerID="ed5848ef4827d8ab51ed68b3b196178b5d9a26a20d3247ecbd2042db9375c8e1" exitCode=0 Nov 22 04:17:31 crc kubenswrapper[4922]: I1122 04:17:31.670626 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f47g7/crc-debug-4z2nc" event={"ID":"9b49d89b-f754-4012-bbf0-7e2cbdb40fc3","Type":"ContainerDied","Data":"ed5848ef4827d8ab51ed68b3b196178b5d9a26a20d3247ecbd2042db9375c8e1"} Nov 22 04:17:31 crc kubenswrapper[4922]: I1122 04:17:31.671247 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f47g7/crc-debug-4z2nc" event={"ID":"9b49d89b-f754-4012-bbf0-7e2cbdb40fc3","Type":"ContainerStarted","Data":"af4feff90240d62309f7e348bd8c12e0184d19b05c4bb2761455d18b17c4f08f"} Nov 22 04:17:31 crc kubenswrapper[4922]: I1122 04:17:31.725065 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-f47g7/crc-debug-4z2nc"] Nov 22 04:17:31 crc kubenswrapper[4922]: I1122 04:17:31.734451 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-f47g7/crc-debug-4z2nc"] Nov 22 04:17:32 crc kubenswrapper[4922]: I1122 04:17:32.919208 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f47g7/crc-debug-4z2nc" Nov 22 04:17:32 crc kubenswrapper[4922]: I1122 04:17:32.982004 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9b49d89b-f754-4012-bbf0-7e2cbdb40fc3-host\") pod \"9b49d89b-f754-4012-bbf0-7e2cbdb40fc3\" (UID: \"9b49d89b-f754-4012-bbf0-7e2cbdb40fc3\") " Nov 22 04:17:32 crc kubenswrapper[4922]: I1122 04:17:32.982123 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dp4b\" (UniqueName: \"kubernetes.io/projected/9b49d89b-f754-4012-bbf0-7e2cbdb40fc3-kube-api-access-8dp4b\") pod \"9b49d89b-f754-4012-bbf0-7e2cbdb40fc3\" (UID: \"9b49d89b-f754-4012-bbf0-7e2cbdb40fc3\") " Nov 22 04:17:32 crc kubenswrapper[4922]: I1122 04:17:32.982144 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9b49d89b-f754-4012-bbf0-7e2cbdb40fc3-host" (OuterVolumeSpecName: "host") pod "9b49d89b-f754-4012-bbf0-7e2cbdb40fc3" (UID: "9b49d89b-f754-4012-bbf0-7e2cbdb40fc3"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 04:17:32 crc kubenswrapper[4922]: I1122 04:17:32.982620 4922 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9b49d89b-f754-4012-bbf0-7e2cbdb40fc3-host\") on node \"crc\" DevicePath \"\"" Nov 22 04:17:32 crc kubenswrapper[4922]: I1122 04:17:32.988044 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b49d89b-f754-4012-bbf0-7e2cbdb40fc3-kube-api-access-8dp4b" (OuterVolumeSpecName: "kube-api-access-8dp4b") pod "9b49d89b-f754-4012-bbf0-7e2cbdb40fc3" (UID: "9b49d89b-f754-4012-bbf0-7e2cbdb40fc3"). InnerVolumeSpecName "kube-api-access-8dp4b". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:17:33 crc kubenswrapper[4922]: I1122 04:17:33.084210 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dp4b\" (UniqueName: \"kubernetes.io/projected/9b49d89b-f754-4012-bbf0-7e2cbdb40fc3-kube-api-access-8dp4b\") on node \"crc\" DevicePath \"\"" Nov 22 04:17:33 crc kubenswrapper[4922]: I1122 04:17:33.313784 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b49d89b-f754-4012-bbf0-7e2cbdb40fc3" path="/var/lib/kubelet/pods/9b49d89b-f754-4012-bbf0-7e2cbdb40fc3/volumes" Nov 22 04:17:33 crc kubenswrapper[4922]: I1122 04:17:33.694666 4922 scope.go:117] "RemoveContainer" containerID="ed5848ef4827d8ab51ed68b3b196178b5d9a26a20d3247ecbd2042db9375c8e1" Nov 22 04:17:33 crc kubenswrapper[4922]: I1122 04:17:33.694708 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f47g7/crc-debug-4z2nc" Nov 22 04:17:42 crc kubenswrapper[4922]: I1122 04:17:42.299986 4922 scope.go:117] "RemoveContainer" containerID="e1ed9e2d5f82f333d68ec19db77109001150ddfdec3bd44077036c7c173bdde5" Nov 22 04:17:42 crc kubenswrapper[4922]: E1122 04:17:42.300675 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 04:17:49 crc kubenswrapper[4922]: I1122 04:17:49.066471 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-65c4db8978-gcb6d_78c94acf-643a-4c78-8d2d-525a0d9432cc/barbican-api/0.log" Nov 22 04:17:49 crc kubenswrapper[4922]: I1122 04:17:49.275254 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-65c4db8978-gcb6d_78c94acf-643a-4c78-8d2d-525a0d9432cc/barbican-api-log/0.log" Nov 22 04:17:49 crc kubenswrapper[4922]: I1122 04:17:49.328649 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6b7d698c78-tqjxq_87ad9c28-b442-49bb-b474-005766f23004/barbican-keystone-listener/0.log" Nov 22 04:17:49 crc kubenswrapper[4922]: I1122 04:17:49.519529 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-cf5c7c74c-qqjc7_ec603ae2-8cba-4c61-8733-b448f538780a/barbican-worker/0.log" Nov 22 04:17:49 crc kubenswrapper[4922]: I1122 04:17:49.540909 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-cf5c7c74c-qqjc7_ec603ae2-8cba-4c61-8733-b448f538780a/barbican-worker-log/0.log" Nov 22 04:17:49 crc kubenswrapper[4922]: 
I1122 04:17:49.545406 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6b7d698c78-tqjxq_87ad9c28-b442-49bb-b474-005766f23004/barbican-keystone-listener-log/0.log" Nov 22 04:17:49 crc kubenswrapper[4922]: I1122 04:17:49.737516 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-rlm7s_9182213e-b8af-4a1b-96a8-ba3439d98d8a/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 04:17:49 crc kubenswrapper[4922]: I1122 04:17:49.793532 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_085686e3-eda2-407d-8131-076777ea14af/ceilometer-central-agent/0.log" Nov 22 04:17:49 crc kubenswrapper[4922]: I1122 04:17:49.939463 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_085686e3-eda2-407d-8131-076777ea14af/ceilometer-notification-agent/0.log" Nov 22 04:17:49 crc kubenswrapper[4922]: I1122 04:17:49.983074 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_085686e3-eda2-407d-8131-076777ea14af/proxy-httpd/0.log" Nov 22 04:17:50 crc kubenswrapper[4922]: I1122 04:17:49.999884 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_085686e3-eda2-407d-8131-076777ea14af/sg-core/0.log" Nov 22 04:17:50 crc kubenswrapper[4922]: I1122 04:17:50.168477 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-k2jtw_6844c188-29eb-4010-a96a-427689e010e9/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 04:17:50 crc kubenswrapper[4922]: I1122 04:17:50.212688 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-j9xqs_e2f8ad2b-4101-4bfb-b181-bcbff8f80498/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 04:17:50 crc kubenswrapper[4922]: I1122 04:17:50.690318 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5/probe/0.log" Nov 22 04:17:51 crc kubenswrapper[4922]: I1122 04:17:51.016444 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_8a1ed907-da19-4420-b5d8-3523a3020796/cinder-api/0.log" Nov 22 04:17:51 crc kubenswrapper[4922]: I1122 04:17:51.097285 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_8a1ed907-da19-4420-b5d8-3523a3020796/cinder-api-log/0.log" Nov 22 04:17:51 crc kubenswrapper[4922]: I1122 04:17:51.146593 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_85809174-7801-455c-8ce6-82f34307147b/cinder-scheduler/0.log" Nov 22 04:17:51 crc kubenswrapper[4922]: I1122 04:17:51.273534 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_85809174-7801-455c-8ce6-82f34307147b/probe/0.log" Nov 22 04:17:51 crc kubenswrapper[4922]: I1122 04:17:51.536818 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_743b2d95-20b8-4677-a1fd-6a5eb808628d/probe/0.log" Nov 22 04:17:51 crc kubenswrapper[4922]: I1122 04:17:51.773921 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-2wgdh_152a7129-aafa-4856-b959-18e7fb0d45e4/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 04:17:51 crc kubenswrapper[4922]: I1122 04:17:51.994338 4922 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-mrnbm_da35a869-b48f-4972-9a3f-703498998c6d/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 04:17:52 crc kubenswrapper[4922]: I1122 04:17:52.218869 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_c3acc4b9-6a1c-4f5c-bc12-8a73b721f5e5/cinder-backup/0.log" Nov 22 04:17:52 crc kubenswrapper[4922]: I1122 04:17:52.221511 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-76b5fdb995-9hspz_b009e973-c6d1-4eca-a06a-ed15c5ec10ad/init/0.log" Nov 22 04:17:52 crc kubenswrapper[4922]: I1122 04:17:52.481217 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-76b5fdb995-9hspz_b009e973-c6d1-4eca-a06a-ed15c5ec10ad/init/0.log" Nov 22 04:17:52 crc kubenswrapper[4922]: I1122 04:17:52.553650 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-76b5fdb995-9hspz_b009e973-c6d1-4eca-a06a-ed15c5ec10ad/dnsmasq-dns/0.log" Nov 22 04:17:52 crc kubenswrapper[4922]: I1122 04:17:52.677877 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_4f334bb7-e931-428b-a3b6-c576b9106f7d/glance-log/0.log" Nov 22 04:17:52 crc kubenswrapper[4922]: I1122 04:17:52.689551 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_4f334bb7-e931-428b-a3b6-c576b9106f7d/glance-httpd/0.log" Nov 22 04:17:52 crc kubenswrapper[4922]: I1122 04:17:52.817862 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_161577ba-f585-44ce-9a0e-cf06d8e134f4/glance-httpd/0.log" Nov 22 04:17:52 crc kubenswrapper[4922]: I1122 04:17:52.870413 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_161577ba-f585-44ce-9a0e-cf06d8e134f4/glance-log/0.log" Nov 22 04:17:52 crc kubenswrapper[4922]: I1122 04:17:52.989832 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-89cb6b448-l5wz8_c0424cb6-8fea-4f3e-a293-27d3d2477c2f/horizon/0.log" Nov 22 04:17:53 crc kubenswrapper[4922]: I1122 04:17:53.638016 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-j66gg_fd3a2ee0-d131-4a4f-a0bd-f651cf461b8c/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 04:17:53 crc kubenswrapper[4922]: I1122 04:17:53.866485 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-mcptc_6a294154-9cc0-47ea-86d5-58e825a739e4/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 04:17:53 crc kubenswrapper[4922]: I1122 04:17:53.946742 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29396401-4fvwk_dd803d93-aec0-495c-888c-69bd472ee7b6/keystone-cron/0.log" Nov 22 04:17:54 crc kubenswrapper[4922]: I1122 04:17:54.142193 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-89cb6b448-l5wz8_c0424cb6-8fea-4f3e-a293-27d3d2477c2f/horizon-log/0.log" Nov 22 04:17:54 crc kubenswrapper[4922]: I1122 04:17:54.310427 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_a1e5684d-314d-4ad1-940c-96696265b505/kube-state-metrics/0.log" Nov 22 04:17:54 crc kubenswrapper[4922]: I1122 04:17:54.505740 4922 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-7lcvc_f3e6467e-b9e0-4e3f-a718-244e44628def/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 04:17:54 crc kubenswrapper[4922]: I1122 04:17:54.800192 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_dbe6f97b-70c1-4581-9367-058568f425b5/manila-api-log/0.log" Nov 22 04:17:54 crc kubenswrapper[4922]: I1122 04:17:54.912456 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_dbe6f97b-70c1-4581-9367-058568f425b5/manila-api/0.log" Nov 22 04:17:54 crc kubenswrapper[4922]: I1122 04:17:54.997381 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-d6fddd8cd-k2kd6_629884e5-288f-4eda-a710-d6935610a2ad/keystone-api/0.log" Nov 22 04:17:55 crc kubenswrapper[4922]: I1122 04:17:55.413030 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_21de88bc-c66e-4f93-afbd-b9354b1d7857/probe/0.log" Nov 22 04:17:55 crc kubenswrapper[4922]: I1122 04:17:55.552909 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_21de88bc-c66e-4f93-afbd-b9354b1d7857/manila-scheduler/0.log" Nov 22 04:17:55 crc kubenswrapper[4922]: I1122 04:17:55.683775 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_28375d16-6e26-4490-a1a5-e90290f09e19/manila-share/0.log" Nov 22 04:17:55 crc kubenswrapper[4922]: I1122 04:17:55.732025 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_28375d16-6e26-4490-a1a5-e90290f09e19/probe/0.log" Nov 22 04:17:56 crc kubenswrapper[4922]: I1122 04:17:56.328615 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-655bcfccf7-54vbt_bb592899-6bd7-4d6b-a54d-132c5166df85/neutron-httpd/0.log" Nov 22 04:17:56 crc kubenswrapper[4922]: I1122 04:17:56.434711 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-gx2zt_1054bc07-486a-48ac-9199-49bae8794a90/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 04:17:56 crc kubenswrapper[4922]: I1122 04:17:56.526342 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-655bcfccf7-54vbt_bb592899-6bd7-4d6b-a54d-132c5166df85/neutron-api/0.log" Nov 22 04:17:57 crc kubenswrapper[4922]: I1122 04:17:57.300214 4922 scope.go:117] "RemoveContainer" containerID="e1ed9e2d5f82f333d68ec19db77109001150ddfdec3bd44077036c7c173bdde5" Nov 22 04:17:57 crc kubenswrapper[4922]: E1122 04:17:57.300514 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 04:17:57 crc kubenswrapper[4922]: I1122 04:17:57.344181 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_de63ff2e-da3d-4eb5-97a2-4cf6c0b272d3/nova-cell0-conductor-conductor/0.log" Nov 22 04:17:57 crc kubenswrapper[4922]: I1122 04:17:57.678322 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_0bccfb44-be98-4a57-9b1d-4e6d11cee15d/nova-api-log/0.log" Nov 22 04:17:57 crc kubenswrapper[4922]: I1122 04:17:57.979220 
4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_18b44985-01b3-4806-9cc0-bec502d417e4/nova-cell1-conductor-conductor/0.log" Nov 22 04:17:58 crc kubenswrapper[4922]: I1122 04:17:58.226384 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_d5aef40f-433d-4415-9c69-50ab337097f0/nova-cell1-novncproxy-novncproxy/0.log" Nov 22 04:17:58 crc kubenswrapper[4922]: I1122 04:17:58.290726 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_0bccfb44-be98-4a57-9b1d-4e6d11cee15d/nova-api-api/0.log" Nov 22 04:17:58 crc kubenswrapper[4922]: I1122 04:17:58.517709 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-n8mpg_451dc20b-5cce-4f72-821b-f08403bed351/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 04:17:58 crc kubenswrapper[4922]: I1122 04:17:58.602631 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_c4673860-e095-4170-92b7-cbd2ffdff114/memcached/0.log" Nov 22 04:17:58 crc kubenswrapper[4922]: I1122 04:17:58.637290 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_4f4f2580-fec5-4d83-80a3-225f0bcc355a/nova-metadata-log/0.log" Nov 22 04:17:59 crc kubenswrapper[4922]: I1122 04:17:59.072946 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_cd673a7b-3830-4c59-bf58-9d6d675f6e40/nova-scheduler-scheduler/0.log" Nov 22 04:17:59 crc kubenswrapper[4922]: I1122 04:17:59.104087 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_60c32179-6b0e-4e8b-a101-81ca49be2034/mysql-bootstrap/0.log" Nov 22 04:17:59 crc kubenswrapper[4922]: I1122 04:17:59.298893 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_60c32179-6b0e-4e8b-a101-81ca49be2034/mysql-bootstrap/0.log" Nov 22 04:17:59 crc kubenswrapper[4922]: I1122 04:17:59.364906 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_60c32179-6b0e-4e8b-a101-81ca49be2034/galera/0.log" Nov 22 04:17:59 crc kubenswrapper[4922]: I1122 04:17:59.640182 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_674794c8-5fae-461e-91a1-f3f44a088e55/mysql-bootstrap/0.log" Nov 22 04:17:59 crc kubenswrapper[4922]: I1122 04:17:59.786429 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_743b2d95-20b8-4677-a1fd-6a5eb808628d/cinder-volume/0.log" Nov 22 04:17:59 crc kubenswrapper[4922]: I1122 04:17:59.852913 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_674794c8-5fae-461e-91a1-f3f44a088e55/mysql-bootstrap/0.log" Nov 22 04:17:59 crc kubenswrapper[4922]: I1122 04:17:59.875308 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_674794c8-5fae-461e-91a1-f3f44a088e55/galera/0.log" Nov 22 04:18:00 crc kubenswrapper[4922]: I1122 04:18:00.025922 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_8d23126d-97a4-4ed0-a589-0ef607e832ed/openstackclient/0.log" Nov 22 04:18:00 crc kubenswrapper[4922]: I1122 04:18:00.122561 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-hdkg5_5043af03-d8a1-4437-9ab8-78907d742588/openstack-network-exporter/0.log" Nov 22 04:18:00 crc kubenswrapper[4922]: 
I1122 04:18:00.264250 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-nlzww_5c038335-42ee-4618-a14b-b32bc0f1d53a/ovn-controller/0.log" Nov 22 04:18:00 crc kubenswrapper[4922]: I1122 04:18:00.321605 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4cpkr_3882abf6-0110-46fd-b498-de1d56838fc8/ovsdb-server-init/0.log" Nov 22 04:18:00 crc kubenswrapper[4922]: I1122 04:18:00.352749 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_4f4f2580-fec5-4d83-80a3-225f0bcc355a/nova-metadata-metadata/0.log" Nov 22 04:18:00 crc kubenswrapper[4922]: I1122 04:18:00.526300 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4cpkr_3882abf6-0110-46fd-b498-de1d56838fc8/ovs-vswitchd/0.log" Nov 22 04:18:00 crc kubenswrapper[4922]: I1122 04:18:00.552676 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4cpkr_3882abf6-0110-46fd-b498-de1d56838fc8/ovsdb-server/0.log" Nov 22 04:18:00 crc kubenswrapper[4922]: I1122 04:18:00.577518 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-4cpkr_3882abf6-0110-46fd-b498-de1d56838fc8/ovsdb-server-init/0.log" Nov 22 04:18:00 crc kubenswrapper[4922]: I1122 04:18:00.614744 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-6l4jc_c5159e8a-3369-45c5-b6e1-a45e3e49a228/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Nov 22 04:18:00 crc kubenswrapper[4922]: I1122 04:18:00.776394 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_59de3215-8a02-4c17-9ccc-395c94f69512/openstack-network-exporter/0.log" Nov 22 04:18:00 crc kubenswrapper[4922]: I1122 04:18:00.816102 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_59de3215-8a02-4c17-9ccc-395c94f69512/ovn-northd/0.log" Nov 22 04:18:00 crc kubenswrapper[4922]: I1122 04:18:00.884906 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_0cb9d221-16e8-421b-b044-a416405d01c1/openstack-network-exporter/0.log" Nov 22 04:18:01 crc kubenswrapper[4922]: I1122 04:18:01.014744 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_0cb9d221-16e8-421b-b044-a416405d01c1/ovsdbserver-nb/0.log" Nov 22 04:18:01 crc kubenswrapper[4922]: I1122 04:18:01.035554 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_67a38fd3-5643-4795-8a88-f21d3ff7b43a/openstack-network-exporter/0.log" Nov 22 04:18:01 crc kubenswrapper[4922]: I1122 04:18:01.042610 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_67a38fd3-5643-4795-8a88-f21d3ff7b43a/ovsdbserver-sb/0.log" Nov 22 04:18:01 crc kubenswrapper[4922]: I1122 04:18:01.316524 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_57a51c5d-616f-49ef-b320-e3ad9238cf44/setup-container/0.log" Nov 22 04:18:01 crc kubenswrapper[4922]: I1122 04:18:01.328745 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-56c6fc5546-zz2lj_3db3f190-2b55-424f-bbe9-d52042f900ef/placement-api/0.log" Nov 22 04:18:01 crc kubenswrapper[4922]: I1122 04:18:01.357258 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-56c6fc5546-zz2lj_3db3f190-2b55-424f-bbe9-d52042f900ef/placement-log/0.log" Nov 22 04:18:01 crc 
kubenswrapper[4922]: I1122 04:18:01.494204 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_57a51c5d-616f-49ef-b320-e3ad9238cf44/rabbitmq/0.log"
Nov 22 04:18:01 crc kubenswrapper[4922]: I1122 04:18:01.505661 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_57a51c5d-616f-49ef-b320-e3ad9238cf44/setup-container/0.log"
Nov 22 04:18:01 crc kubenswrapper[4922]: I1122 04:18:01.558408 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_22048c2c-fb84-4f52-9868-c6f6074fab42/setup-container/0.log"
Nov 22 04:18:01 crc kubenswrapper[4922]: I1122 04:18:01.731615 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_22048c2c-fb84-4f52-9868-c6f6074fab42/setup-container/0.log"
Nov 22 04:18:01 crc kubenswrapper[4922]: I1122 04:18:01.749617 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-npqqk_b00860c0-4679-4cee-9185-94524381a6da/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log"
Nov 22 04:18:01 crc kubenswrapper[4922]: I1122 04:18:01.756467 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_22048c2c-fb84-4f52-9868-c6f6074fab42/rabbitmq/0.log"
Nov 22 04:18:01 crc kubenswrapper[4922]: I1122 04:18:01.921086 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-4skwt_a86d36d0-b5ca-4b97-89b6-3af2f942140f/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log"
Nov 22 04:18:01 crc kubenswrapper[4922]: I1122 04:18:01.960374 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-hnnvm_85670af6-7929-40f9-8cb6-ef764c147917/run-os-edpm-deployment-openstack-edpm-ipam/0.log"
Nov 22 04:18:02 crc kubenswrapper[4922]: I1122 04:18:02.473578 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-b5522_4010b4bb-bf3c-401b-ab08-bb238b56934a/ssh-known-hosts-edpm-deployment/0.log"
Nov 22 04:18:02 crc kubenswrapper[4922]: I1122 04:18:02.548243 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_65f03af2-87a1-4f4f-b09c-00fe2a3d4943/tempest-tests-tempest-tests-runner/0.log"
Nov 22 04:18:02 crc kubenswrapper[4922]: I1122 04:18:02.857053 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_5459cbc2-5aa8-462f-aa76-8c6c55369173/test-operator-logs-container/0.log"
Nov 22 04:18:02 crc kubenswrapper[4922]: I1122 04:18:02.940720 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-b4t92_ffcbca43-e19c-4f81-8712-18271858ace9/validate-network-edpm-deployment-openstack-edpm-ipam/0.log"
Nov 22 04:18:09 crc kubenswrapper[4922]: I1122 04:18:09.301711 4922 scope.go:117] "RemoveContainer" containerID="e1ed9e2d5f82f333d68ec19db77109001150ddfdec3bd44077036c7c173bdde5"
Nov 22 04:18:09 crc kubenswrapper[4922]: E1122 04:18:09.302917 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d"
Nov 22 04:18:23 crc kubenswrapper[4922]: I1122 04:18:23.301797 4922 scope.go:117] "RemoveContainer" containerID="e1ed9e2d5f82f333d68ec19db77109001150ddfdec3bd44077036c7c173bdde5"
Nov 22 04:18:23 crc kubenswrapper[4922]: E1122 04:18:23.302922 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d"
Nov 22 04:18:25 crc kubenswrapper[4922]: I1122 04:18:25.291233 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1ef56ccb6d133047ae178fe17bc2d57b31708a59c7aeccefc8f9c21e077nwnp_4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b/util/0.log"
Nov 22 04:18:25 crc kubenswrapper[4922]: I1122 04:18:25.462463 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1ef56ccb6d133047ae178fe17bc2d57b31708a59c7aeccefc8f9c21e077nwnp_4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b/util/0.log"
Nov 22 04:18:25 crc kubenswrapper[4922]: I1122 04:18:25.475043 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1ef56ccb6d133047ae178fe17bc2d57b31708a59c7aeccefc8f9c21e077nwnp_4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b/pull/0.log"
Nov 22 04:18:25 crc kubenswrapper[4922]: I1122 04:18:25.481441 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1ef56ccb6d133047ae178fe17bc2d57b31708a59c7aeccefc8f9c21e077nwnp_4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b/pull/0.log"
Nov 22 04:18:25 crc kubenswrapper[4922]: I1122 04:18:25.702144 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1ef56ccb6d133047ae178fe17bc2d57b31708a59c7aeccefc8f9c21e077nwnp_4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b/pull/0.log"
Nov 22 04:18:25 crc kubenswrapper[4922]: I1122 04:18:25.730405 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1ef56ccb6d133047ae178fe17bc2d57b31708a59c7aeccefc8f9c21e077nwnp_4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b/extract/0.log"
Nov 22 04:18:25 crc kubenswrapper[4922]: I1122 04:18:25.731066 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1ef56ccb6d133047ae178fe17bc2d57b31708a59c7aeccefc8f9c21e077nwnp_4ae603e3-d217-41c7-9b5f-d3a6e4d8ab5b/util/0.log"
Nov 22 04:18:25 crc kubenswrapper[4922]: I1122 04:18:25.864098 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-5bfbbb859d-4lfxl_181257ea-c4b9-4370-80b5-7f52ed557c33/kube-rbac-proxy/0.log"
Nov 22 04:18:25 crc kubenswrapper[4922]: I1122 04:18:25.938966 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-864d88ccf8-vc6p7_8b206938-2e76-40b1-b39c-ff333430e8f6/kube-rbac-proxy/0.log"
Nov 22 04:18:25 crc kubenswrapper[4922]: I1122 04:18:25.983461 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-5bfbbb859d-4lfxl_181257ea-c4b9-4370-80b5-7f52ed557c33/manager/0.log"
Nov 22 04:18:26 crc kubenswrapper[4922]: I1122 04:18:26.102769 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-864d88ccf8-vc6p7_8b206938-2e76-40b1-b39c-ff333430e8f6/manager/0.log"
Nov 22 04:18:26 crc kubenswrapper[4922]: I1122 04:18:26.118653 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6788cc6d75-dkfth_5db39a57-6021-466f-84e0-1fc2e8cf0da9/kube-rbac-proxy/0.log"
Nov 22 04:18:26 crc kubenswrapper[4922]: I1122 04:18:26.178253 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6788cc6d75-dkfth_5db39a57-6021-466f-84e0-1fc2e8cf0da9/manager/0.log"
Nov 22 04:18:26 crc kubenswrapper[4922]: I1122 04:18:26.321700 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-6bd966bbd4-g9fbb_1cc9e6ed-fd1a-4280-8867-c8fbd326ca14/kube-rbac-proxy/0.log"
Nov 22 04:18:26 crc kubenswrapper[4922]: I1122 04:18:26.392501 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-6bd966bbd4-g9fbb_1cc9e6ed-fd1a-4280-8867-c8fbd326ca14/manager/0.log"
Nov 22 04:18:26 crc kubenswrapper[4922]: I1122 04:18:26.526046 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-698d6fd7d6-rxzs6_3e52f719-cfcb-48d8-a83f-1bcddb08e6bd/kube-rbac-proxy/0.log"
Nov 22 04:18:26 crc kubenswrapper[4922]: I1122 04:18:26.576904 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-698d6fd7d6-rxzs6_3e52f719-cfcb-48d8-a83f-1bcddb08e6bd/manager/0.log"
Nov 22 04:18:26 crc kubenswrapper[4922]: I1122 04:18:26.637109 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-7d5d9fd47f-v6hmr_8262c40a-af33-42fb-9347-e5d84f97a20d/kube-rbac-proxy/0.log"
Nov 22 04:18:26 crc kubenswrapper[4922]: I1122 04:18:26.715458 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-7d5d9fd47f-v6hmr_8262c40a-af33-42fb-9347-e5d84f97a20d/manager/0.log"
Nov 22 04:18:26 crc kubenswrapper[4922]: I1122 04:18:26.786501 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-6c55d8d69b-twhcc_ec6a579d-cf65-4b02-a891-ca17161e6585/kube-rbac-proxy/0.log"
Nov 22 04:18:26 crc kubenswrapper[4922]: I1122 04:18:26.942084 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-54485f899-vg7qw_0f9c4cd6-8ab3-4895-ab12-74dce3828cf8/kube-rbac-proxy/0.log"
Nov 22 04:18:26 crc kubenswrapper[4922]: I1122 04:18:26.994664 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-54485f899-vg7qw_0f9c4cd6-8ab3-4895-ab12-74dce3828cf8/manager/0.log"
Nov 22 04:18:26 crc kubenswrapper[4922]: I1122 04:18:26.996190 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-6c55d8d69b-twhcc_ec6a579d-cf65-4b02-a891-ca17161e6585/manager/0.log"
Nov 22 04:18:27 crc kubenswrapper[4922]: I1122 04:18:27.196143 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7d6f5d799-5szbr_4bc61f3e-c538-4a90-84da-7cd4760621f1/kube-rbac-proxy/0.log"
Nov 22 04:18:27 crc kubenswrapper[4922]: I1122 04:18:27.270940 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7d6f5d799-5szbr_4bc61f3e-c538-4a90-84da-7cd4760621f1/manager/0.log"
parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7d6f5d799-5szbr_4bc61f3e-c538-4a90-84da-7cd4760621f1/manager/0.log" Nov 22 04:18:27 crc kubenswrapper[4922]: I1122 04:18:27.361954 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-646fd589f9-f7bz2_ddb44566-f024-43bd-8bc8-7b497606baa7/kube-rbac-proxy/0.log" Nov 22 04:18:27 crc kubenswrapper[4922]: I1122 04:18:27.459225 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-64d7c556cd-h7jpp_459b1df7-6ed9-4ef4-bb71-aa7e82001d5a/kube-rbac-proxy/0.log" Nov 22 04:18:27 crc kubenswrapper[4922]: I1122 04:18:27.478105 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-646fd589f9-f7bz2_ddb44566-f024-43bd-8bc8-7b497606baa7/manager/0.log" Nov 22 04:18:27 crc kubenswrapper[4922]: I1122 04:18:27.580378 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-64d7c556cd-h7jpp_459b1df7-6ed9-4ef4-bb71-aa7e82001d5a/manager/0.log" Nov 22 04:18:27 crc kubenswrapper[4922]: I1122 04:18:27.633818 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6b6c55ffd5-nlwm2_c4755f19-1e55-41bb-be1e-b4b868c48cc1/kube-rbac-proxy/0.log" Nov 22 04:18:27 crc kubenswrapper[4922]: I1122 04:18:27.752766 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6b6c55ffd5-nlwm2_c4755f19-1e55-41bb-be1e-b4b868c48cc1/manager/0.log" Nov 22 04:18:27 crc kubenswrapper[4922]: I1122 04:18:27.866516 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-79d658b66d-2ldtk_cf5db45c-dbc8-4774-ad28-dc4aaa24b9fd/kube-rbac-proxy/0.log" Nov 22 04:18:27 crc kubenswrapper[4922]: I1122 04:18:27.936686 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-79d658b66d-2ldtk_cf5db45c-dbc8-4774-ad28-dc4aaa24b9fd/manager/0.log" Nov 22 04:18:27 crc kubenswrapper[4922]: I1122 04:18:27.976449 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7979c68bc7-7fxg5_225f6b3a-93d7-46d9-99a1-d9787b4921fb/kube-rbac-proxy/0.log" Nov 22 04:18:28 crc kubenswrapper[4922]: I1122 04:18:28.073249 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7979c68bc7-7fxg5_225f6b3a-93d7-46d9-99a1-d9787b4921fb/manager/0.log" Nov 22 04:18:28 crc kubenswrapper[4922]: I1122 04:18:28.122282 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-77868f484-dsmmt_4134b0be-83c2-452c-a09e-6a699543d2c0/kube-rbac-proxy/0.log" Nov 22 04:18:28 crc kubenswrapper[4922]: I1122 04:18:28.127113 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-77868f484-dsmmt_4134b0be-83c2-452c-a09e-6a699543d2c0/manager/0.log" Nov 22 04:18:28 crc kubenswrapper[4922]: I1122 04:18:28.917169 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-748d5b5d8d-h2b67_302e7f75-5537-48f3-9a19-0540310929da/kube-rbac-proxy/0.log" Nov 22 04:18:28 crc kubenswrapper[4922]: 
I1122 04:18:28.944820 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-8495cbd6cf-sz4cl_0c4effce-3b89-4b2b-bf5b-e8d6139b7d6b/kube-rbac-proxy/0.log" Nov 22 04:18:29 crc kubenswrapper[4922]: I1122 04:18:29.176736 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-tnkdx_b3a11ecb-39ed-423e-b79b-19694d816305/registry-server/0.log" Nov 22 04:18:29 crc kubenswrapper[4922]: I1122 04:18:29.186144 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-8495cbd6cf-sz4cl_0c4effce-3b89-4b2b-bf5b-e8d6139b7d6b/operator/0.log" Nov 22 04:18:29 crc kubenswrapper[4922]: I1122 04:18:29.398882 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-5b67cfc8fb-vkmzq_301d0455-5622-4810-847e-b354cf6f9c00/kube-rbac-proxy/0.log" Nov 22 04:18:29 crc kubenswrapper[4922]: I1122 04:18:29.415315 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-5b67cfc8fb-vkmzq_301d0455-5622-4810-847e-b354cf6f9c00/manager/0.log" Nov 22 04:18:29 crc kubenswrapper[4922]: I1122 04:18:29.462182 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-867d87977b-5fntg_562a15b4-c659-490c-88a4-1db388e0224f/kube-rbac-proxy/0.log" Nov 22 04:18:29 crc kubenswrapper[4922]: I1122 04:18:29.691284 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-867d87977b-5fntg_562a15b4-c659-490c-88a4-1db388e0224f/manager/0.log" Nov 22 04:18:29 crc kubenswrapper[4922]: I1122 04:18:29.727510 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-cd74r_261901bb-e399-412c-a57f-4fefa2a3bfc0/operator/0.log" Nov 22 04:18:29 crc kubenswrapper[4922]: I1122 04:18:29.969311 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-cc9f5bc5c-h7svq_e92bf172-fec5-4847-9ba1-9e3ddc58c7c3/kube-rbac-proxy/0.log" Nov 22 04:18:29 crc kubenswrapper[4922]: I1122 04:18:29.982207 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-58487d9bf4-tg4ph_ef5e108a-748a-47ab-b0fc-0a3e303a09ba/kube-rbac-proxy/0.log" Nov 22 04:18:30 crc kubenswrapper[4922]: I1122 04:18:30.016420 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-cc9f5bc5c-h7svq_e92bf172-fec5-4847-9ba1-9e3ddc58c7c3/manager/0.log" Nov 22 04:18:30 crc kubenswrapper[4922]: I1122 04:18:30.239960 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-748d5b5d8d-h2b67_302e7f75-5537-48f3-9a19-0540310929da/manager/0.log" Nov 22 04:18:30 crc kubenswrapper[4922]: I1122 04:18:30.552530 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-58487d9bf4-tg4ph_ef5e108a-748a-47ab-b0fc-0a3e303a09ba/manager/0.log" Nov 22 04:18:30 crc kubenswrapper[4922]: I1122 04:18:30.564428 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-77db6bf9c-hn2fj_b5d110f6-5ffb-46bc-b263-e7142f463974/manager/0.log" Nov 22 04:18:30 crc kubenswrapper[4922]: 
I1122 04:18:30.621161 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-77db6bf9c-hn2fj_b5d110f6-5ffb-46bc-b263-e7142f463974/kube-rbac-proxy/0.log" Nov 22 04:18:30 crc kubenswrapper[4922]: I1122 04:18:30.750964 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6b56b8849f-m5xsr_0bcf2061-04d3-4819-b07e-0eaaf4bb6287/manager/0.log" Nov 22 04:18:30 crc kubenswrapper[4922]: I1122 04:18:30.752241 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6b56b8849f-m5xsr_0bcf2061-04d3-4819-b07e-0eaaf4bb6287/kube-rbac-proxy/0.log" Nov 22 04:18:35 crc kubenswrapper[4922]: I1122 04:18:35.306524 4922 scope.go:117] "RemoveContainer" containerID="e1ed9e2d5f82f333d68ec19db77109001150ddfdec3bd44077036c7c173bdde5" Nov 22 04:18:35 crc kubenswrapper[4922]: E1122 04:18:35.307360 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 04:18:47 crc kubenswrapper[4922]: I1122 04:18:47.242753 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-xlb7z_62cd80a8-6faf-48b7-bf44-3b181afd66c6/control-plane-machine-set-operator/0.log" Nov 22 04:18:47 crc kubenswrapper[4922]: I1122 04:18:47.301104 4922 scope.go:117] "RemoveContainer" containerID="e1ed9e2d5f82f333d68ec19db77109001150ddfdec3bd44077036c7c173bdde5" Nov 22 04:18:47 crc kubenswrapper[4922]: E1122 04:18:47.301372 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 04:18:47 crc kubenswrapper[4922]: I1122 04:18:47.391028 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-qqgzd_9109383b-40f8-49d7-a601-1d048c4d8686/kube-rbac-proxy/0.log" Nov 22 04:18:47 crc kubenswrapper[4922]: I1122 04:18:47.453617 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-qqgzd_9109383b-40f8-49d7-a601-1d048c4d8686/machine-api-operator/0.log" Nov 22 04:19:01 crc kubenswrapper[4922]: I1122 04:19:01.126869 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-zm8rg_45c21c53-9955-4d09-8b9c-668a96ecab5a/cert-manager-controller/0.log" Nov 22 04:19:01 crc kubenswrapper[4922]: I1122 04:19:01.301885 4922 scope.go:117] "RemoveContainer" containerID="e1ed9e2d5f82f333d68ec19db77109001150ddfdec3bd44077036c7c173bdde5" Nov 22 04:19:01 crc kubenswrapper[4922]: E1122 04:19:01.302351 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
Nov 22 04:19:01 crc kubenswrapper[4922]: I1122 04:19:01.324702 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-x94nm_451f8b82-3178-4e4c-b134-32bea43520e0/cert-manager-webhook/0.log"
Nov 22 04:19:01 crc kubenswrapper[4922]: I1122 04:19:01.335504 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-gndzc_b14967da-15c3-4419-93fb-5bbc85265835/cert-manager-cainjector/0.log"
Nov 22 04:19:16 crc kubenswrapper[4922]: I1122 04:19:16.301117 4922 scope.go:117] "RemoveContainer" containerID="e1ed9e2d5f82f333d68ec19db77109001150ddfdec3bd44077036c7c173bdde5"
Nov 22 04:19:16 crc kubenswrapper[4922]: E1122 04:19:16.301812 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d"
Nov 22 04:19:16 crc kubenswrapper[4922]: I1122 04:19:16.366197 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5874bd7bc5-bb9qb_24a6482e-6e30-42ca-9c56-ca6bb2772d41/nmstate-console-plugin/0.log"
Nov 22 04:19:16 crc kubenswrapper[4922]: I1122 04:19:16.502697 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-f2mdj_352d437e-9254-4db9-a771-fae8060c3c84/nmstate-handler/0.log"
Nov 22 04:19:16 crc kubenswrapper[4922]: I1122 04:19:16.567862 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-5dcf9c57c5-rb9pl_6d67e2c3-dc43-4809-bde2-1252e775b32d/kube-rbac-proxy/0.log"
Nov 22 04:19:16 crc kubenswrapper[4922]: I1122 04:19:16.650066 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-5dcf9c57c5-rb9pl_6d67e2c3-dc43-4809-bde2-1252e775b32d/nmstate-metrics/0.log"
Nov 22 04:19:16 crc kubenswrapper[4922]: I1122 04:19:16.734651 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-557fdffb88-xksq7_21bed758-674a-4d6f-9909-62147fd6d1b9/nmstate-operator/0.log"
Nov 22 04:19:16 crc kubenswrapper[4922]: I1122 04:19:16.874377 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6b89b748d8-9d8bx_9cdd2d56-0dc2-4e74-81ed-d22f94a88db9/nmstate-webhook/0.log"
Nov 22 04:19:31 crc kubenswrapper[4922]: I1122 04:19:31.300537 4922 scope.go:117] "RemoveContainer" containerID="e1ed9e2d5f82f333d68ec19db77109001150ddfdec3bd44077036c7c173bdde5"
Nov 22 04:19:31 crc kubenswrapper[4922]: E1122 04:19:31.301567 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d"
Nov 22 04:19:31 crc kubenswrapper[4922]: I1122 04:19:31.889294 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6c7b4b5f48-h2kgl_513892d3-9d82-48cf-911c-857d8c2a8a95/kube-rbac-proxy/0.log"
Nov 22 04:19:31 crc kubenswrapper[4922]: I1122 04:19:31.971243 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6c7b4b5f48-h2kgl_513892d3-9d82-48cf-911c-857d8c2a8a95/controller/0.log"
Nov 22 04:19:32 crc kubenswrapper[4922]: I1122 04:19:32.062840 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bjqlk_85602553-aa6e-40ba-b92b-96b851a002ca/cp-frr-files/0.log"
Nov 22 04:19:32 crc kubenswrapper[4922]: I1122 04:19:32.275214 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bjqlk_85602553-aa6e-40ba-b92b-96b851a002ca/cp-frr-files/0.log"
Nov 22 04:19:32 crc kubenswrapper[4922]: I1122 04:19:32.295861 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bjqlk_85602553-aa6e-40ba-b92b-96b851a002ca/cp-metrics/0.log"
Nov 22 04:19:32 crc kubenswrapper[4922]: I1122 04:19:32.300759 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bjqlk_85602553-aa6e-40ba-b92b-96b851a002ca/cp-reloader/0.log"
Nov 22 04:19:32 crc kubenswrapper[4922]: I1122 04:19:32.336507 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bjqlk_85602553-aa6e-40ba-b92b-96b851a002ca/cp-reloader/0.log"
Nov 22 04:19:32 crc kubenswrapper[4922]: I1122 04:19:32.457869 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bjqlk_85602553-aa6e-40ba-b92b-96b851a002ca/cp-frr-files/0.log"
Nov 22 04:19:32 crc kubenswrapper[4922]: I1122 04:19:32.501914 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bjqlk_85602553-aa6e-40ba-b92b-96b851a002ca/cp-metrics/0.log"
Nov 22 04:19:32 crc kubenswrapper[4922]: I1122 04:19:32.513448 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bjqlk_85602553-aa6e-40ba-b92b-96b851a002ca/cp-metrics/0.log"
Nov 22 04:19:32 crc kubenswrapper[4922]: I1122 04:19:32.513804 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bjqlk_85602553-aa6e-40ba-b92b-96b851a002ca/cp-reloader/0.log"
Nov 22 04:19:32 crc kubenswrapper[4922]: I1122 04:19:32.665923 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bjqlk_85602553-aa6e-40ba-b92b-96b851a002ca/cp-frr-files/0.log"
Nov 22 04:19:32 crc kubenswrapper[4922]: I1122 04:19:32.674477 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bjqlk_85602553-aa6e-40ba-b92b-96b851a002ca/cp-metrics/0.log"
Nov 22 04:19:32 crc kubenswrapper[4922]: I1122 04:19:32.694715 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bjqlk_85602553-aa6e-40ba-b92b-96b851a002ca/cp-reloader/0.log"
Nov 22 04:19:32 crc kubenswrapper[4922]: I1122 04:19:32.741388 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bjqlk_85602553-aa6e-40ba-b92b-96b851a002ca/controller/0.log"
Nov 22 04:19:32 crc kubenswrapper[4922]: I1122 04:19:32.894360 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bjqlk_85602553-aa6e-40ba-b92b-96b851a002ca/frr-metrics/0.log"
Nov 22 04:19:32 crc kubenswrapper[4922]: I1122 04:19:32.919554 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bjqlk_85602553-aa6e-40ba-b92b-96b851a002ca/kube-rbac-proxy/0.log"
Nov 22 04:19:32 crc kubenswrapper[4922]: I1122 04:19:32.931453 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bjqlk_85602553-aa6e-40ba-b92b-96b851a002ca/kube-rbac-proxy-frr/0.log"
Nov 22 04:19:33 crc kubenswrapper[4922]: I1122 04:19:33.091933 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bjqlk_85602553-aa6e-40ba-b92b-96b851a002ca/reloader/0.log"
Nov 22 04:19:33 crc kubenswrapper[4922]: I1122 04:19:33.142117 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-6998585d5-8zlpf_f06b9437-16f5-486b-88a2-1c475e99e21a/frr-k8s-webhook-server/0.log"
Nov 22 04:19:33 crc kubenswrapper[4922]: I1122 04:19:33.319465 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7d967d77f6-659pj_621d00a4-49ba-4725-b00b-72e6ed7521ad/manager/0.log"
Nov 22 04:19:33 crc kubenswrapper[4922]: I1122 04:19:33.523812 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-748dcb78f6-gcbtx_a7a3b01f-9dd6-43b2-8ef3-8a1443c1bfc9/webhook-server/0.log"
Nov 22 04:19:33 crc kubenswrapper[4922]: I1122 04:19:33.640978 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-tl5mq_f607a020-54f7-4888-8a09-6caede7a160c/kube-rbac-proxy/0.log"
Nov 22 04:19:34 crc kubenswrapper[4922]: I1122 04:19:34.267323 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-tl5mq_f607a020-54f7-4888-8a09-6caede7a160c/speaker/0.log"
Nov 22 04:19:34 crc kubenswrapper[4922]: I1122 04:19:34.489486 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bjqlk_85602553-aa6e-40ba-b92b-96b851a002ca/frr/0.log"
Nov 22 04:19:46 crc kubenswrapper[4922]: I1122 04:19:46.300663 4922 scope.go:117] "RemoveContainer" containerID="e1ed9e2d5f82f333d68ec19db77109001150ddfdec3bd44077036c7c173bdde5"
Nov 22 04:19:46 crc kubenswrapper[4922]: E1122 04:19:46.301740 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d"
Nov 22 04:19:48 crc kubenswrapper[4922]: I1122 04:19:48.860582 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e24wcx_54673aca-5f82-42ac-91d8-036b789061dc/util/0.log"
Nov 22 04:19:49 crc kubenswrapper[4922]: I1122 04:19:49.519374 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e24wcx_54673aca-5f82-42ac-91d8-036b789061dc/util/0.log"
Nov 22 04:19:49 crc kubenswrapper[4922]: I1122 04:19:49.623943 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e24wcx_54673aca-5f82-42ac-91d8-036b789061dc/pull/0.log"
Nov 22 04:19:49 crc kubenswrapper[4922]: I1122 04:19:49.692757 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e24wcx_54673aca-5f82-42ac-91d8-036b789061dc/pull/0.log"
path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e24wcx_54673aca-5f82-42ac-91d8-036b789061dc/pull/0.log" Nov 22 04:19:49 crc kubenswrapper[4922]: I1122 04:19:49.948485 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e24wcx_54673aca-5f82-42ac-91d8-036b789061dc/util/0.log" Nov 22 04:19:49 crc kubenswrapper[4922]: I1122 04:19:49.949502 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e24wcx_54673aca-5f82-42ac-91d8-036b789061dc/pull/0.log" Nov 22 04:19:50 crc kubenswrapper[4922]: I1122 04:19:50.007552 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e24wcx_54673aca-5f82-42ac-91d8-036b789061dc/extract/0.log" Nov 22 04:19:50 crc kubenswrapper[4922]: I1122 04:19:50.106282 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-b5xxh_4b1933fa-b2c4-4fbd-855a-d434ffdfd8fe/extract-utilities/0.log" Nov 22 04:19:50 crc kubenswrapper[4922]: I1122 04:19:50.301322 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-b5xxh_4b1933fa-b2c4-4fbd-855a-d434ffdfd8fe/extract-utilities/0.log" Nov 22 04:19:50 crc kubenswrapper[4922]: I1122 04:19:50.339479 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-b5xxh_4b1933fa-b2c4-4fbd-855a-d434ffdfd8fe/extract-content/0.log" Nov 22 04:19:50 crc kubenswrapper[4922]: I1122 04:19:50.339600 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-b5xxh_4b1933fa-b2c4-4fbd-855a-d434ffdfd8fe/extract-content/0.log" Nov 22 04:19:50 crc kubenswrapper[4922]: I1122 04:19:50.500528 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-b5xxh_4b1933fa-b2c4-4fbd-855a-d434ffdfd8fe/extract-content/0.log" Nov 22 04:19:50 crc kubenswrapper[4922]: I1122 04:19:50.549582 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-b5xxh_4b1933fa-b2c4-4fbd-855a-d434ffdfd8fe/extract-utilities/0.log" Nov 22 04:19:50 crc kubenswrapper[4922]: I1122 04:19:50.685030 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-llzn6_b8263434-57ca-4230-850a-ae927db99cb8/extract-utilities/0.log" Nov 22 04:19:50 crc kubenswrapper[4922]: I1122 04:19:50.954693 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-llzn6_b8263434-57ca-4230-850a-ae927db99cb8/extract-utilities/0.log" Nov 22 04:19:50 crc kubenswrapper[4922]: I1122 04:19:50.964828 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-llzn6_b8263434-57ca-4230-850a-ae927db99cb8/extract-content/0.log" Nov 22 04:19:51 crc kubenswrapper[4922]: I1122 04:19:51.024943 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-llzn6_b8263434-57ca-4230-850a-ae927db99cb8/extract-content/0.log" Nov 22 04:19:51 crc kubenswrapper[4922]: I1122 04:19:51.188976 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-llzn6_b8263434-57ca-4230-850a-ae927db99cb8/extract-utilities/0.log" Nov 22 
04:19:51 crc kubenswrapper[4922]: I1122 04:19:51.219933 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-b5xxh_4b1933fa-b2c4-4fbd-855a-d434ffdfd8fe/registry-server/0.log" Nov 22 04:19:51 crc kubenswrapper[4922]: I1122 04:19:51.229449 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-llzn6_b8263434-57ca-4230-850a-ae927db99cb8/extract-content/0.log" Nov 22 04:19:51 crc kubenswrapper[4922]: I1122 04:19:51.405811 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6m9cm4_7405653e-5aef-4e96-9140-a011670ace50/util/0.log" Nov 22 04:19:51 crc kubenswrapper[4922]: I1122 04:19:51.604808 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6m9cm4_7405653e-5aef-4e96-9140-a011670ace50/util/0.log" Nov 22 04:19:51 crc kubenswrapper[4922]: I1122 04:19:51.640920 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6m9cm4_7405653e-5aef-4e96-9140-a011670ace50/pull/0.log" Nov 22 04:19:51 crc kubenswrapper[4922]: I1122 04:19:51.694208 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6m9cm4_7405653e-5aef-4e96-9140-a011670ace50/pull/0.log" Nov 22 04:19:51 crc kubenswrapper[4922]: I1122 04:19:51.914741 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6m9cm4_7405653e-5aef-4e96-9140-a011670ace50/pull/0.log" Nov 22 04:19:52 crc kubenswrapper[4922]: I1122 04:19:52.015388 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6m9cm4_7405653e-5aef-4e96-9140-a011670ace50/extract/0.log" Nov 22 04:19:52 crc kubenswrapper[4922]: I1122 04:19:52.036933 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6m9cm4_7405653e-5aef-4e96-9140-a011670ace50/util/0.log" Nov 22 04:19:52 crc kubenswrapper[4922]: I1122 04:19:52.059592 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-llzn6_b8263434-57ca-4230-850a-ae927db99cb8/registry-server/0.log" Nov 22 04:19:52 crc kubenswrapper[4922]: I1122 04:19:52.227616 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-fjwv8_2464e274-acb6-4ae6-aafb-c76c1a3a9ef0/marketplace-operator/0.log" Nov 22 04:19:52 crc kubenswrapper[4922]: I1122 04:19:52.237954 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pdqrh_5d6b66b3-7949-46a0-9242-2ce57ca56ecd/extract-utilities/0.log" Nov 22 04:19:52 crc kubenswrapper[4922]: I1122 04:19:52.439155 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pdqrh_5d6b66b3-7949-46a0-9242-2ce57ca56ecd/extract-utilities/0.log" Nov 22 04:19:52 crc kubenswrapper[4922]: I1122 04:19:52.440051 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pdqrh_5d6b66b3-7949-46a0-9242-2ce57ca56ecd/extract-content/0.log" Nov 22 04:19:52 crc kubenswrapper[4922]: I1122 04:19:52.444591 
4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pdqrh_5d6b66b3-7949-46a0-9242-2ce57ca56ecd/extract-content/0.log" Nov 22 04:19:52 crc kubenswrapper[4922]: I1122 04:19:52.591205 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pdqrh_5d6b66b3-7949-46a0-9242-2ce57ca56ecd/extract-utilities/0.log" Nov 22 04:19:52 crc kubenswrapper[4922]: I1122 04:19:52.618081 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pdqrh_5d6b66b3-7949-46a0-9242-2ce57ca56ecd/extract-content/0.log" Nov 22 04:19:52 crc kubenswrapper[4922]: I1122 04:19:52.777410 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rhmvs_01170b3c-6c7d-4aee-9016-518e2d155464/extract-utilities/0.log" Nov 22 04:19:52 crc kubenswrapper[4922]: I1122 04:19:52.796644 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pdqrh_5d6b66b3-7949-46a0-9242-2ce57ca56ecd/registry-server/0.log" Nov 22 04:19:52 crc kubenswrapper[4922]: I1122 04:19:52.956438 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rhmvs_01170b3c-6c7d-4aee-9016-518e2d155464/extract-content/0.log" Nov 22 04:19:52 crc kubenswrapper[4922]: I1122 04:19:52.975810 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rhmvs_01170b3c-6c7d-4aee-9016-518e2d155464/extract-utilities/0.log" Nov 22 04:19:52 crc kubenswrapper[4922]: I1122 04:19:52.988517 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rhmvs_01170b3c-6c7d-4aee-9016-518e2d155464/extract-content/0.log" Nov 22 04:19:53 crc kubenswrapper[4922]: I1122 04:19:53.137010 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rhmvs_01170b3c-6c7d-4aee-9016-518e2d155464/extract-utilities/0.log" Nov 22 04:19:53 crc kubenswrapper[4922]: I1122 04:19:53.178158 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rhmvs_01170b3c-6c7d-4aee-9016-518e2d155464/extract-content/0.log" Nov 22 04:19:53 crc kubenswrapper[4922]: I1122 04:19:53.649627 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rhmvs_01170b3c-6c7d-4aee-9016-518e2d155464/registry-server/0.log" Nov 22 04:20:00 crc kubenswrapper[4922]: I1122 04:20:00.301237 4922 scope.go:117] "RemoveContainer" containerID="e1ed9e2d5f82f333d68ec19db77109001150ddfdec3bd44077036c7c173bdde5" Nov 22 04:20:00 crc kubenswrapper[4922]: E1122 04:20:00.302090 4922 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-b9j6n_openshift-machine-config-operator(402683b1-a29f-4a79-a36c-daf6e8068d0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" Nov 22 04:20:14 crc kubenswrapper[4922]: I1122 04:20:14.300792 4922 scope.go:117] "RemoveContainer" containerID="e1ed9e2d5f82f333d68ec19db77109001150ddfdec3bd44077036c7c173bdde5" Nov 22 04:20:15 crc kubenswrapper[4922]: I1122 04:20:15.179534 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" 
event={"ID":"402683b1-a29f-4a79-a36c-daf6e8068d0d","Type":"ContainerStarted","Data":"a74e353481fe56864ad7df36675d5d71d34722e65b4b0c9aa8011911015802bd"} Nov 22 04:20:39 crc kubenswrapper[4922]: I1122 04:20:39.201741 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cg56x"] Nov 22 04:20:39 crc kubenswrapper[4922]: E1122 04:20:39.202802 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b49d89b-f754-4012-bbf0-7e2cbdb40fc3" containerName="container-00" Nov 22 04:20:39 crc kubenswrapper[4922]: I1122 04:20:39.202819 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b49d89b-f754-4012-bbf0-7e2cbdb40fc3" containerName="container-00" Nov 22 04:20:39 crc kubenswrapper[4922]: I1122 04:20:39.203173 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b49d89b-f754-4012-bbf0-7e2cbdb40fc3" containerName="container-00" Nov 22 04:20:39 crc kubenswrapper[4922]: I1122 04:20:39.207643 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cg56x" Nov 22 04:20:39 crc kubenswrapper[4922]: I1122 04:20:39.227289 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cg56x"] Nov 22 04:20:39 crc kubenswrapper[4922]: I1122 04:20:39.367460 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/688b3dc1-8a23-4914-83bc-a39360e4fd70-utilities\") pod \"redhat-operators-cg56x\" (UID: \"688b3dc1-8a23-4914-83bc-a39360e4fd70\") " pod="openshift-marketplace/redhat-operators-cg56x" Nov 22 04:20:39 crc kubenswrapper[4922]: I1122 04:20:39.367514 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/688b3dc1-8a23-4914-83bc-a39360e4fd70-catalog-content\") pod \"redhat-operators-cg56x\" (UID: \"688b3dc1-8a23-4914-83bc-a39360e4fd70\") " pod="openshift-marketplace/redhat-operators-cg56x" Nov 22 04:20:39 crc kubenswrapper[4922]: I1122 04:20:39.367882 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvfjf\" (UniqueName: \"kubernetes.io/projected/688b3dc1-8a23-4914-83bc-a39360e4fd70-kube-api-access-rvfjf\") pod \"redhat-operators-cg56x\" (UID: \"688b3dc1-8a23-4914-83bc-a39360e4fd70\") " pod="openshift-marketplace/redhat-operators-cg56x" Nov 22 04:20:39 crc kubenswrapper[4922]: I1122 04:20:39.470192 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvfjf\" (UniqueName: \"kubernetes.io/projected/688b3dc1-8a23-4914-83bc-a39360e4fd70-kube-api-access-rvfjf\") pod \"redhat-operators-cg56x\" (UID: \"688b3dc1-8a23-4914-83bc-a39360e4fd70\") " pod="openshift-marketplace/redhat-operators-cg56x" Nov 22 04:20:39 crc kubenswrapper[4922]: I1122 04:20:39.470306 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/688b3dc1-8a23-4914-83bc-a39360e4fd70-utilities\") pod \"redhat-operators-cg56x\" (UID: \"688b3dc1-8a23-4914-83bc-a39360e4fd70\") " pod="openshift-marketplace/redhat-operators-cg56x" Nov 22 04:20:39 crc kubenswrapper[4922]: I1122 04:20:39.470328 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/688b3dc1-8a23-4914-83bc-a39360e4fd70-catalog-content\") pod 
\"redhat-operators-cg56x\" (UID: \"688b3dc1-8a23-4914-83bc-a39360e4fd70\") " pod="openshift-marketplace/redhat-operators-cg56x" Nov 22 04:20:39 crc kubenswrapper[4922]: I1122 04:20:39.470780 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/688b3dc1-8a23-4914-83bc-a39360e4fd70-catalog-content\") pod \"redhat-operators-cg56x\" (UID: \"688b3dc1-8a23-4914-83bc-a39360e4fd70\") " pod="openshift-marketplace/redhat-operators-cg56x" Nov 22 04:20:39 crc kubenswrapper[4922]: I1122 04:20:39.470970 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/688b3dc1-8a23-4914-83bc-a39360e4fd70-utilities\") pod \"redhat-operators-cg56x\" (UID: \"688b3dc1-8a23-4914-83bc-a39360e4fd70\") " pod="openshift-marketplace/redhat-operators-cg56x" Nov 22 04:20:39 crc kubenswrapper[4922]: I1122 04:20:39.490246 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvfjf\" (UniqueName: \"kubernetes.io/projected/688b3dc1-8a23-4914-83bc-a39360e4fd70-kube-api-access-rvfjf\") pod \"redhat-operators-cg56x\" (UID: \"688b3dc1-8a23-4914-83bc-a39360e4fd70\") " pod="openshift-marketplace/redhat-operators-cg56x" Nov 22 04:20:39 crc kubenswrapper[4922]: I1122 04:20:39.541547 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cg56x" Nov 22 04:20:40 crc kubenswrapper[4922]: I1122 04:20:40.018737 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cg56x"] Nov 22 04:20:40 crc kubenswrapper[4922]: I1122 04:20:40.429543 4922 generic.go:334] "Generic (PLEG): container finished" podID="688b3dc1-8a23-4914-83bc-a39360e4fd70" containerID="723af9f3e7f975fa5c7fc3ec212b4882b5713b26d0e32cd2802b80a69b34af03" exitCode=0 Nov 22 04:20:40 crc kubenswrapper[4922]: I1122 04:20:40.429657 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cg56x" event={"ID":"688b3dc1-8a23-4914-83bc-a39360e4fd70","Type":"ContainerDied","Data":"723af9f3e7f975fa5c7fc3ec212b4882b5713b26d0e32cd2802b80a69b34af03"} Nov 22 04:20:40 crc kubenswrapper[4922]: I1122 04:20:40.429916 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cg56x" event={"ID":"688b3dc1-8a23-4914-83bc-a39360e4fd70","Type":"ContainerStarted","Data":"f5b3740e320f4a66bab10cc97ea98db6ab08d77a9876144b75f3d4962ea69e6d"} Nov 22 04:20:41 crc kubenswrapper[4922]: I1122 04:20:41.440213 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cg56x" event={"ID":"688b3dc1-8a23-4914-83bc-a39360e4fd70","Type":"ContainerStarted","Data":"0d6af5021e9a8a057ac843854a3b4d563d28253cb21f9e3cb0674cd355bee814"} Nov 22 04:20:45 crc kubenswrapper[4922]: I1122 04:20:45.483803 4922 generic.go:334] "Generic (PLEG): container finished" podID="688b3dc1-8a23-4914-83bc-a39360e4fd70" containerID="0d6af5021e9a8a057ac843854a3b4d563d28253cb21f9e3cb0674cd355bee814" exitCode=0 Nov 22 04:20:45 crc kubenswrapper[4922]: I1122 04:20:45.485203 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cg56x" event={"ID":"688b3dc1-8a23-4914-83bc-a39360e4fd70","Type":"ContainerDied","Data":"0d6af5021e9a8a057ac843854a3b4d563d28253cb21f9e3cb0674cd355bee814"} Nov 22 04:20:47 crc kubenswrapper[4922]: I1122 04:20:47.517820 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-cg56x" event={"ID":"688b3dc1-8a23-4914-83bc-a39360e4fd70","Type":"ContainerStarted","Data":"18cfa8cc6d9d8dde4fe00a1175b37e43aa31b903d5aca4a4b2e891f8587d9b3a"} Nov 22 04:20:47 crc kubenswrapper[4922]: I1122 04:20:47.542871 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cg56x" podStartSLOduration=3.07936539 podStartE2EDuration="8.542831752s" podCreationTimestamp="2025-11-22 04:20:39 +0000 UTC" firstStartedPulling="2025-11-22 04:20:40.431340307 +0000 UTC m=+5276.469862199" lastFinishedPulling="2025-11-22 04:20:45.894806659 +0000 UTC m=+5281.933328561" observedRunningTime="2025-11-22 04:20:47.533819986 +0000 UTC m=+5283.572341888" watchObservedRunningTime="2025-11-22 04:20:47.542831752 +0000 UTC m=+5283.581353644" Nov 22 04:20:49 crc kubenswrapper[4922]: I1122 04:20:49.542218 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cg56x" Nov 22 04:20:49 crc kubenswrapper[4922]: I1122 04:20:49.543247 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cg56x" Nov 22 04:20:50 crc kubenswrapper[4922]: I1122 04:20:50.595159 4922 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cg56x" podUID="688b3dc1-8a23-4914-83bc-a39360e4fd70" containerName="registry-server" probeResult="failure" output=< Nov 22 04:20:50 crc kubenswrapper[4922]: timeout: failed to connect service ":50051" within 1s Nov 22 04:20:50 crc kubenswrapper[4922]: > Nov 22 04:20:59 crc kubenswrapper[4922]: I1122 04:20:59.626712 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cg56x" Nov 22 04:20:59 crc kubenswrapper[4922]: I1122 04:20:59.709035 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cg56x" Nov 22 04:20:59 crc kubenswrapper[4922]: I1122 04:20:59.868173 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cg56x"] Nov 22 04:21:00 crc kubenswrapper[4922]: I1122 04:21:00.676456 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cg56x" podUID="688b3dc1-8a23-4914-83bc-a39360e4fd70" containerName="registry-server" containerID="cri-o://18cfa8cc6d9d8dde4fe00a1175b37e43aa31b903d5aca4a4b2e891f8587d9b3a" gracePeriod=2 Nov 22 04:21:01 crc kubenswrapper[4922]: I1122 04:21:01.216598 4922 util.go:48] "No ready sandbox for pod can be found. 
Nov 22 04:21:01 crc kubenswrapper[4922]: I1122 04:21:01.364942 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/688b3dc1-8a23-4914-83bc-a39360e4fd70-catalog-content\") pod \"688b3dc1-8a23-4914-83bc-a39360e4fd70\" (UID: \"688b3dc1-8a23-4914-83bc-a39360e4fd70\") "
Nov 22 04:21:01 crc kubenswrapper[4922]: I1122 04:21:01.365038 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/688b3dc1-8a23-4914-83bc-a39360e4fd70-utilities\") pod \"688b3dc1-8a23-4914-83bc-a39360e4fd70\" (UID: \"688b3dc1-8a23-4914-83bc-a39360e4fd70\") "
Nov 22 04:21:01 crc kubenswrapper[4922]: I1122 04:21:01.365232 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvfjf\" (UniqueName: \"kubernetes.io/projected/688b3dc1-8a23-4914-83bc-a39360e4fd70-kube-api-access-rvfjf\") pod \"688b3dc1-8a23-4914-83bc-a39360e4fd70\" (UID: \"688b3dc1-8a23-4914-83bc-a39360e4fd70\") "
Nov 22 04:21:01 crc kubenswrapper[4922]: I1122 04:21:01.366015 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/688b3dc1-8a23-4914-83bc-a39360e4fd70-utilities" (OuterVolumeSpecName: "utilities") pod "688b3dc1-8a23-4914-83bc-a39360e4fd70" (UID: "688b3dc1-8a23-4914-83bc-a39360e4fd70"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 04:21:01 crc kubenswrapper[4922]: I1122 04:21:01.370698 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/688b3dc1-8a23-4914-83bc-a39360e4fd70-kube-api-access-rvfjf" (OuterVolumeSpecName: "kube-api-access-rvfjf") pod "688b3dc1-8a23-4914-83bc-a39360e4fd70" (UID: "688b3dc1-8a23-4914-83bc-a39360e4fd70"). InnerVolumeSpecName "kube-api-access-rvfjf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 04:21:01 crc kubenswrapper[4922]: I1122 04:21:01.443434 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/688b3dc1-8a23-4914-83bc-a39360e4fd70-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "688b3dc1-8a23-4914-83bc-a39360e4fd70" (UID: "688b3dc1-8a23-4914-83bc-a39360e4fd70"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 04:21:01 crc kubenswrapper[4922]: I1122 04:21:01.467763 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvfjf\" (UniqueName: \"kubernetes.io/projected/688b3dc1-8a23-4914-83bc-a39360e4fd70-kube-api-access-rvfjf\") on node \"crc\" DevicePath \"\""
Nov 22 04:21:01 crc kubenswrapper[4922]: I1122 04:21:01.467795 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/688b3dc1-8a23-4914-83bc-a39360e4fd70-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 22 04:21:01 crc kubenswrapper[4922]: I1122 04:21:01.467806 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/688b3dc1-8a23-4914-83bc-a39360e4fd70-utilities\") on node \"crc\" DevicePath \"\""
Nov 22 04:21:01 crc kubenswrapper[4922]: I1122 04:21:01.689314 4922 generic.go:334] "Generic (PLEG): container finished" podID="688b3dc1-8a23-4914-83bc-a39360e4fd70" containerID="18cfa8cc6d9d8dde4fe00a1175b37e43aa31b903d5aca4a4b2e891f8587d9b3a" exitCode=0
Nov 22 04:21:01 crc kubenswrapper[4922]: I1122 04:21:01.689390 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cg56x"
Nov 22 04:21:01 crc kubenswrapper[4922]: I1122 04:21:01.689390 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cg56x" event={"ID":"688b3dc1-8a23-4914-83bc-a39360e4fd70","Type":"ContainerDied","Data":"18cfa8cc6d9d8dde4fe00a1175b37e43aa31b903d5aca4a4b2e891f8587d9b3a"}
Nov 22 04:21:01 crc kubenswrapper[4922]: I1122 04:21:01.689995 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cg56x" event={"ID":"688b3dc1-8a23-4914-83bc-a39360e4fd70","Type":"ContainerDied","Data":"f5b3740e320f4a66bab10cc97ea98db6ab08d77a9876144b75f3d4962ea69e6d"}
Nov 22 04:21:01 crc kubenswrapper[4922]: I1122 04:21:01.690058 4922 scope.go:117] "RemoveContainer" containerID="18cfa8cc6d9d8dde4fe00a1175b37e43aa31b903d5aca4a4b2e891f8587d9b3a"
Nov 22 04:21:01 crc kubenswrapper[4922]: I1122 04:21:01.727301 4922 scope.go:117] "RemoveContainer" containerID="0d6af5021e9a8a057ac843854a3b4d563d28253cb21f9e3cb0674cd355bee814"
Nov 22 04:21:01 crc kubenswrapper[4922]: I1122 04:21:01.741035 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cg56x"]
Nov 22 04:21:01 crc kubenswrapper[4922]: I1122 04:21:01.754878 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cg56x"]
Nov 22 04:21:01 crc kubenswrapper[4922]: I1122 04:21:01.761438 4922 scope.go:117] "RemoveContainer" containerID="723af9f3e7f975fa5c7fc3ec212b4882b5713b26d0e32cd2802b80a69b34af03"
Nov 22 04:21:01 crc kubenswrapper[4922]: I1122 04:21:01.802817 4922 scope.go:117] "RemoveContainer" containerID="18cfa8cc6d9d8dde4fe00a1175b37e43aa31b903d5aca4a4b2e891f8587d9b3a"
Nov 22 04:21:01 crc kubenswrapper[4922]: E1122 04:21:01.803394 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18cfa8cc6d9d8dde4fe00a1175b37e43aa31b903d5aca4a4b2e891f8587d9b3a\": container with ID starting with 18cfa8cc6d9d8dde4fe00a1175b37e43aa31b903d5aca4a4b2e891f8587d9b3a not found: ID does not exist" containerID="18cfa8cc6d9d8dde4fe00a1175b37e43aa31b903d5aca4a4b2e891f8587d9b3a"
Nov 22 04:21:01 crc kubenswrapper[4922]: I1122 04:21:01.803431 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18cfa8cc6d9d8dde4fe00a1175b37e43aa31b903d5aca4a4b2e891f8587d9b3a"} err="failed to get container status \"18cfa8cc6d9d8dde4fe00a1175b37e43aa31b903d5aca4a4b2e891f8587d9b3a\": rpc error: code = NotFound desc = could not find container \"18cfa8cc6d9d8dde4fe00a1175b37e43aa31b903d5aca4a4b2e891f8587d9b3a\": container with ID starting with 18cfa8cc6d9d8dde4fe00a1175b37e43aa31b903d5aca4a4b2e891f8587d9b3a not found: ID does not exist"
Nov 22 04:21:01 crc kubenswrapper[4922]: I1122 04:21:01.803451 4922 scope.go:117] "RemoveContainer" containerID="0d6af5021e9a8a057ac843854a3b4d563d28253cb21f9e3cb0674cd355bee814"
Nov 22 04:21:01 crc kubenswrapper[4922]: E1122 04:21:01.803887 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d6af5021e9a8a057ac843854a3b4d563d28253cb21f9e3cb0674cd355bee814\": container with ID starting with 0d6af5021e9a8a057ac843854a3b4d563d28253cb21f9e3cb0674cd355bee814 not found: ID does not exist" containerID="0d6af5021e9a8a057ac843854a3b4d563d28253cb21f9e3cb0674cd355bee814"
Nov 22 04:21:01 crc kubenswrapper[4922]: I1122 04:21:01.803905 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d6af5021e9a8a057ac843854a3b4d563d28253cb21f9e3cb0674cd355bee814"} err="failed to get container status \"0d6af5021e9a8a057ac843854a3b4d563d28253cb21f9e3cb0674cd355bee814\": rpc error: code = NotFound desc = could not find container \"0d6af5021e9a8a057ac843854a3b4d563d28253cb21f9e3cb0674cd355bee814\": container with ID starting with 0d6af5021e9a8a057ac843854a3b4d563d28253cb21f9e3cb0674cd355bee814 not found: ID does not exist"
Nov 22 04:21:01 crc kubenswrapper[4922]: I1122 04:21:01.803919 4922 scope.go:117] "RemoveContainer" containerID="723af9f3e7f975fa5c7fc3ec212b4882b5713b26d0e32cd2802b80a69b34af03"
Nov 22 04:21:01 crc kubenswrapper[4922]: E1122 04:21:01.804308 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"723af9f3e7f975fa5c7fc3ec212b4882b5713b26d0e32cd2802b80a69b34af03\": container with ID starting with 723af9f3e7f975fa5c7fc3ec212b4882b5713b26d0e32cd2802b80a69b34af03 not found: ID does not exist" containerID="723af9f3e7f975fa5c7fc3ec212b4882b5713b26d0e32cd2802b80a69b34af03"
Nov 22 04:21:01 crc kubenswrapper[4922]: I1122 04:21:01.804331 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"723af9f3e7f975fa5c7fc3ec212b4882b5713b26d0e32cd2802b80a69b34af03"} err="failed to get container status \"723af9f3e7f975fa5c7fc3ec212b4882b5713b26d0e32cd2802b80a69b34af03\": rpc error: code = NotFound desc = could not find container \"723af9f3e7f975fa5c7fc3ec212b4882b5713b26d0e32cd2802b80a69b34af03\": container with ID starting with 723af9f3e7f975fa5c7fc3ec212b4882b5713b26d0e32cd2802b80a69b34af03 not found: ID does not exist"
Nov 22 04:21:03 crc kubenswrapper[4922]: I1122 04:21:03.311529 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="688b3dc1-8a23-4914-83bc-a39360e4fd70" path="/var/lib/kubelet/pods/688b3dc1-8a23-4914-83bc-a39360e4fd70/volumes"
Nov 22 04:21:51 crc kubenswrapper[4922]: I1122 04:21:51.273667 4922 generic.go:334] "Generic (PLEG): container finished" podID="30b7ec5b-cd88-4c4e-944f-415cbf9241ae" containerID="cf9b921ee169ae6034d2f1ce11897e8fa947f4add68de67466d39d15cde25534" exitCode=0
Nov 22 04:21:51 crc kubenswrapper[4922]: I1122 04:21:51.273806 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f47g7/must-gather-fpdpf" event={"ID":"30b7ec5b-cd88-4c4e-944f-415cbf9241ae","Type":"ContainerDied","Data":"cf9b921ee169ae6034d2f1ce11897e8fa947f4add68de67466d39d15cde25534"}
Nov 22 04:21:51 crc kubenswrapper[4922]: I1122 04:21:51.273667 4922 generic.go:334] "Generic (PLEG): container finished" podID="30b7ec5b-cd88-4c4e-944f-415cbf9241ae" containerID="cf9b921ee169ae6034d2f1ce11897e8fa947f4add68de67466d39d15cde25534" exitCode=0
Nov 22 04:21:51 crc kubenswrapper[4922]: I1122 04:21:51.273806 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f47g7/must-gather-fpdpf" event={"ID":"30b7ec5b-cd88-4c4e-944f-415cbf9241ae","Type":"ContainerDied","Data":"cf9b921ee169ae6034d2f1ce11897e8fa947f4add68de67466d39d15cde25534"}
Nov 22 04:21:51 crc kubenswrapper[4922]: I1122 04:21:51.274826 4922 scope.go:117] "RemoveContainer" containerID="cf9b921ee169ae6034d2f1ce11897e8fa947f4add68de67466d39d15cde25534"
Nov 22 04:21:52 crc kubenswrapper[4922]: I1122 04:21:52.021930 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-f47g7_must-gather-fpdpf_30b7ec5b-cd88-4c4e-944f-415cbf9241ae/gather/0.log"
Nov 22 04:21:59 crc kubenswrapper[4922]: I1122 04:21:59.972740 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-f47g7/must-gather-fpdpf"]
Nov 22 04:21:59 crc kubenswrapper[4922]: I1122 04:21:59.973587 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-f47g7/must-gather-fpdpf" podUID="30b7ec5b-cd88-4c4e-944f-415cbf9241ae" containerName="copy" containerID="cri-o://8e97539363d1859f985c06cbe010c7e6a6f7fe506b1e37e1a4aacef226ded886" gracePeriod=2
Nov 22 04:21:59 crc kubenswrapper[4922]: I1122 04:21:59.982674 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-f47g7/must-gather-fpdpf"]
Nov 22 04:22:00 crc kubenswrapper[4922]: I1122 04:22:00.975647 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-f47g7_must-gather-fpdpf_30b7ec5b-cd88-4c4e-944f-415cbf9241ae/copy/0.log"
Nov 22 04:22:00 crc kubenswrapper[4922]: I1122 04:22:00.977305 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f47g7/must-gather-fpdpf"
Nov 22 04:22:01 crc kubenswrapper[4922]: I1122 04:22:01.080333 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/30b7ec5b-cd88-4c4e-944f-415cbf9241ae-must-gather-output\") pod \"30b7ec5b-cd88-4c4e-944f-415cbf9241ae\" (UID: \"30b7ec5b-cd88-4c4e-944f-415cbf9241ae\") "
Nov 22 04:22:01 crc kubenswrapper[4922]: I1122 04:22:01.080751 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jhw6\" (UniqueName: \"kubernetes.io/projected/30b7ec5b-cd88-4c4e-944f-415cbf9241ae-kube-api-access-2jhw6\") pod \"30b7ec5b-cd88-4c4e-944f-415cbf9241ae\" (UID: \"30b7ec5b-cd88-4c4e-944f-415cbf9241ae\") "
Nov 22 04:22:01 crc kubenswrapper[4922]: I1122 04:22:01.131024 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30b7ec5b-cd88-4c4e-944f-415cbf9241ae-kube-api-access-2jhw6" (OuterVolumeSpecName: "kube-api-access-2jhw6") pod "30b7ec5b-cd88-4c4e-944f-415cbf9241ae" (UID: "30b7ec5b-cd88-4c4e-944f-415cbf9241ae"). InnerVolumeSpecName "kube-api-access-2jhw6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 04:22:01 crc kubenswrapper[4922]: I1122 04:22:01.184106 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jhw6\" (UniqueName: \"kubernetes.io/projected/30b7ec5b-cd88-4c4e-944f-415cbf9241ae-kube-api-access-2jhw6\") on node \"crc\" DevicePath \"\""
Nov 22 04:22:01 crc kubenswrapper[4922]: I1122 04:22:01.359946 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30b7ec5b-cd88-4c4e-944f-415cbf9241ae-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "30b7ec5b-cd88-4c4e-944f-415cbf9241ae" (UID: "30b7ec5b-cd88-4c4e-944f-415cbf9241ae"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 04:22:01 crc kubenswrapper[4922]: I1122 04:22:01.392729 4922 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/30b7ec5b-cd88-4c4e-944f-415cbf9241ae-must-gather-output\") on node \"crc\" DevicePath \"\""
Nov 22 04:22:01 crc kubenswrapper[4922]: I1122 04:22:01.408766 4922 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-f47g7_must-gather-fpdpf_30b7ec5b-cd88-4c4e-944f-415cbf9241ae/copy/0.log"
Nov 22 04:22:01 crc kubenswrapper[4922]: I1122 04:22:01.409659 4922 generic.go:334] "Generic (PLEG): container finished" podID="30b7ec5b-cd88-4c4e-944f-415cbf9241ae" containerID="8e97539363d1859f985c06cbe010c7e6a6f7fe506b1e37e1a4aacef226ded886" exitCode=143
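Note the two exit codes in the must-gather teardown: the gather container finished on its own with exitCode=0, while the copy container, killed above with gracePeriod=2, exited with code 143 = 128 + 15, the conventional encoding for termination by SIGTERM (a SIGKILL after an expired grace period would instead show 137 = 128 + 9).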
Nov 22 04:22:01 crc kubenswrapper[4922]: I1122 04:22:01.409724 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f47g7/must-gather-fpdpf"
Nov 22 04:22:01 crc kubenswrapper[4922]: I1122 04:22:01.409733 4922 scope.go:117] "RemoveContainer" containerID="8e97539363d1859f985c06cbe010c7e6a6f7fe506b1e37e1a4aacef226ded886"
Nov 22 04:22:01 crc kubenswrapper[4922]: I1122 04:22:01.427867 4922 scope.go:117] "RemoveContainer" containerID="cf9b921ee169ae6034d2f1ce11897e8fa947f4add68de67466d39d15cde25534"
Nov 22 04:22:01 crc kubenswrapper[4922]: I1122 04:22:01.500995 4922 scope.go:117] "RemoveContainer" containerID="8e97539363d1859f985c06cbe010c7e6a6f7fe506b1e37e1a4aacef226ded886"
Nov 22 04:22:01 crc kubenswrapper[4922]: E1122 04:22:01.501514 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e97539363d1859f985c06cbe010c7e6a6f7fe506b1e37e1a4aacef226ded886\": container with ID starting with 8e97539363d1859f985c06cbe010c7e6a6f7fe506b1e37e1a4aacef226ded886 not found: ID does not exist" containerID="8e97539363d1859f985c06cbe010c7e6a6f7fe506b1e37e1a4aacef226ded886"
Nov 22 04:22:01 crc kubenswrapper[4922]: I1122 04:22:01.501574 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e97539363d1859f985c06cbe010c7e6a6f7fe506b1e37e1a4aacef226ded886"} err="failed to get container status \"8e97539363d1859f985c06cbe010c7e6a6f7fe506b1e37e1a4aacef226ded886\": rpc error: code = NotFound desc = could not find container \"8e97539363d1859f985c06cbe010c7e6a6f7fe506b1e37e1a4aacef226ded886\": container with ID starting with 8e97539363d1859f985c06cbe010c7e6a6f7fe506b1e37e1a4aacef226ded886 not found: ID does not exist"
Nov 22 04:22:01 crc kubenswrapper[4922]: I1122 04:22:01.501604 4922 scope.go:117] "RemoveContainer" containerID="cf9b921ee169ae6034d2f1ce11897e8fa947f4add68de67466d39d15cde25534"
Nov 22 04:22:01 crc kubenswrapper[4922]: E1122 04:22:01.502090 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf9b921ee169ae6034d2f1ce11897e8fa947f4add68de67466d39d15cde25534\": container with ID starting with cf9b921ee169ae6034d2f1ce11897e8fa947f4add68de67466d39d15cde25534 not found: ID does not exist" containerID="cf9b921ee169ae6034d2f1ce11897e8fa947f4add68de67466d39d15cde25534"
Nov 22 04:22:01 crc kubenswrapper[4922]: I1122 04:22:01.502124 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf9b921ee169ae6034d2f1ce11897e8fa947f4add68de67466d39d15cde25534"} err="failed to get container status \"cf9b921ee169ae6034d2f1ce11897e8fa947f4add68de67466d39d15cde25534\": rpc error: code = NotFound desc = could not find container \"cf9b921ee169ae6034d2f1ce11897e8fa947f4add68de67466d39d15cde25534\": container with ID starting with cf9b921ee169ae6034d2f1ce11897e8fa947f4add68de67466d39d15cde25534 not found: ID does not exist"
Nov 22 04:22:03 crc kubenswrapper[4922]: I1122 04:22:03.315226 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30b7ec5b-cd88-4c4e-944f-415cbf9241ae" path="/var/lib/kubelet/pods/30b7ec5b-cd88-4c4e-944f-415cbf9241ae/volumes"
Nov 22 04:22:41 crc kubenswrapper[4922]: I1122 04:22:41.109328 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 04:22:41 crc kubenswrapper[4922]: I1122 04:22:41.109934 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 04:23:11 crc kubenswrapper[4922]: I1122 04:23:11.109360 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 04:23:11 crc kubenswrapper[4922]: I1122 04:23:11.110073 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 04:23:41 crc kubenswrapper[4922]: I1122 04:23:41.110113 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 04:23:41 crc kubenswrapper[4922]: I1122 04:23:41.110996 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
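Each failure above is the kubelet's HTTP liveness probe getting connection-refused from the machine-config-daemon health endpoint; after enough consecutive failures it kills and restarts the container, as the next entries show. A rough sketch of an equivalent check, assuming kubelet-like semantics (any 2xx/3xx response is healthy); the endpoint comes from the log, while the 30-second period and threshold of 3 are inferred from the log cadence, not read from the pod spec:

```go
// probe_sketch.go - illustrative HTTP liveness check, not kubelet code.
package main

import (
	"fmt"
	"net/http"
	"time"
)

func probe(url string, timeout time.Duration) error {
	client := &http.Client{Timeout: timeout}
	resp, err := client.Get(url)
	if err != nil {
		return err // e.g. "connect: connection refused", as in the log
	}
	defer resp.Body.Close()
	if resp.StatusCode < 200 || resp.StatusCode >= 400 {
		return fmt.Errorf("unhealthy status %d", resp.StatusCode)
	}
	return nil
}

func main() {
	const failureThreshold = 3 // assumed from the three failures before restart
	failures := 0
	for {
		if err := probe("http://127.0.0.1:8798/health", time.Second); err != nil {
			failures++
			fmt.Println("probe failed:", err)
			if failures >= failureThreshold {
				fmt.Println("liveness threshold reached; container would be restarted")
				return
			}
		} else {
			failures = 0 // any success resets the consecutive-failure count
		}
		time.Sleep(30 * time.Second) // matches the ~30 s cadence in the log
	}
}
```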
Nov 22 04:23:41 crc kubenswrapper[4922]: I1122 04:23:41.111088 4922 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n"
Nov 22 04:23:41 crc kubenswrapper[4922]: I1122 04:23:41.112366 4922 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a74e353481fe56864ad7df36675d5d71d34722e65b4b0c9aa8011911015802bd"} pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 22 04:23:41 crc kubenswrapper[4922]: I1122 04:23:41.112468 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" containerID="cri-o://a74e353481fe56864ad7df36675d5d71d34722e65b4b0c9aa8011911015802bd" gracePeriod=600
Nov 22 04:23:41 crc kubenswrapper[4922]: I1122 04:23:41.628125 4922 generic.go:334] "Generic (PLEG): container finished" podID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerID="a74e353481fe56864ad7df36675d5d71d34722e65b4b0c9aa8011911015802bd" exitCode=0
Nov 22 04:23:41 crc kubenswrapper[4922]: I1122 04:23:41.628226 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" event={"ID":"402683b1-a29f-4a79-a36c-daf6e8068d0d","Type":"ContainerDied","Data":"a74e353481fe56864ad7df36675d5d71d34722e65b4b0c9aa8011911015802bd"}
Nov 22 04:23:41 crc kubenswrapper[4922]: I1122 04:23:41.629400 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" event={"ID":"402683b1-a29f-4a79-a36c-daf6e8068d0d","Type":"ContainerStarted","Data":"65ea22420ffd995a6916a12a3b58e1a72287f44205c74f513491277b0a884ab3"}
Nov 22 04:23:41 crc kubenswrapper[4922]: I1122 04:23:41.629468 4922 scope.go:117] "RemoveContainer" containerID="e1ed9e2d5f82f333d68ec19db77109001150ddfdec3bd44077036c7c173bdde5"
Nov 22 04:24:08 crc kubenswrapper[4922]: I1122 04:24:08.982563 4922 scope.go:117] "RemoveContainer" containerID="b0f0cd837dfb9912ddb80af43739b236cd6f2b65cf81e03c9a7a0882ef27011a"
Nov 22 04:25:16 crc kubenswrapper[4922]: I1122 04:25:16.731765 4922 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-l7lrr"]
Nov 22 04:25:16 crc kubenswrapper[4922]: E1122 04:25:16.733279 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="688b3dc1-8a23-4914-83bc-a39360e4fd70" containerName="extract-content"
Nov 22 04:25:16 crc kubenswrapper[4922]: I1122 04:25:16.733297 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="688b3dc1-8a23-4914-83bc-a39360e4fd70" containerName="extract-content"
Nov 22 04:25:16 crc kubenswrapper[4922]: E1122 04:25:16.733308 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="688b3dc1-8a23-4914-83bc-a39360e4fd70" containerName="extract-utilities"
Nov 22 04:25:16 crc kubenswrapper[4922]: I1122 04:25:16.733317 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="688b3dc1-8a23-4914-83bc-a39360e4fd70" containerName="extract-utilities"
Nov 22 04:25:16 crc kubenswrapper[4922]: E1122 04:25:16.733348 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30b7ec5b-cd88-4c4e-944f-415cbf9241ae" containerName="copy"
Nov 22 04:25:16 crc kubenswrapper[4922]: I1122 04:25:16.733355 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="30b7ec5b-cd88-4c4e-944f-415cbf9241ae" containerName="copy"
Nov 22 04:25:16 crc kubenswrapper[4922]: E1122 04:25:16.733383 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="688b3dc1-8a23-4914-83bc-a39360e4fd70" containerName="registry-server"
Nov 22 04:25:16 crc kubenswrapper[4922]: I1122 04:25:16.733391 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="688b3dc1-8a23-4914-83bc-a39360e4fd70" containerName="registry-server"
Nov 22 04:25:16 crc kubenswrapper[4922]: E1122 04:25:16.733410 4922 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30b7ec5b-cd88-4c4e-944f-415cbf9241ae" containerName="gather"
Nov 22 04:25:16 crc kubenswrapper[4922]: I1122 04:25:16.733417 4922 state_mem.go:107] "Deleted CPUSet assignment" podUID="30b7ec5b-cd88-4c4e-944f-415cbf9241ae" containerName="gather"
Nov 22 04:25:16 crc kubenswrapper[4922]: I1122 04:25:16.733593 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="30b7ec5b-cd88-4c4e-944f-415cbf9241ae" containerName="copy"
Nov 22 04:25:16 crc kubenswrapper[4922]: I1122 04:25:16.733617 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="30b7ec5b-cd88-4c4e-944f-415cbf9241ae" containerName="gather"
Nov 22 04:25:16 crc kubenswrapper[4922]: I1122 04:25:16.733633 4922 memory_manager.go:354] "RemoveStaleState removing state" podUID="688b3dc1-8a23-4914-83bc-a39360e4fd70" containerName="registry-server"
Nov 22 04:25:16 crc kubenswrapper[4922]: I1122 04:25:16.735338 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l7lrr"
Nov 22 04:25:16 crc kubenswrapper[4922]: I1122 04:25:16.755503 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l7lrr"]
Nov 22 04:25:16 crc kubenswrapper[4922]: I1122 04:25:16.831337 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eebc5ef7-2a00-455e-977d-5246935e1638-utilities\") pod \"community-operators-l7lrr\" (UID: \"eebc5ef7-2a00-455e-977d-5246935e1638\") " pod="openshift-marketplace/community-operators-l7lrr"
Nov 22 04:25:16 crc kubenswrapper[4922]: I1122 04:25:16.831732 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qkrq\" (UniqueName: \"kubernetes.io/projected/eebc5ef7-2a00-455e-977d-5246935e1638-kube-api-access-9qkrq\") pod \"community-operators-l7lrr\" (UID: \"eebc5ef7-2a00-455e-977d-5246935e1638\") " pod="openshift-marketplace/community-operators-l7lrr"
Nov 22 04:25:16 crc kubenswrapper[4922]: I1122 04:25:16.831819 4922 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eebc5ef7-2a00-455e-977d-5246935e1638-catalog-content\") pod \"community-operators-l7lrr\" (UID: \"eebc5ef7-2a00-455e-977d-5246935e1638\") " pod="openshift-marketplace/community-operators-l7lrr"
Nov 22 04:25:16 crc kubenswrapper[4922]: I1122 04:25:16.932739 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qkrq\" (UniqueName: \"kubernetes.io/projected/eebc5ef7-2a00-455e-977d-5246935e1638-kube-api-access-9qkrq\") pod \"community-operators-l7lrr\" (UID: \"eebc5ef7-2a00-455e-977d-5246935e1638\") " pod="openshift-marketplace/community-operators-l7lrr"
Nov 22 04:25:16 crc kubenswrapper[4922]: I1122 04:25:16.932811 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eebc5ef7-2a00-455e-977d-5246935e1638-catalog-content\") pod \"community-operators-l7lrr\" (UID: \"eebc5ef7-2a00-455e-977d-5246935e1638\") " pod="openshift-marketplace/community-operators-l7lrr"
Nov 22 04:25:16 crc kubenswrapper[4922]: I1122 04:25:16.932942 4922 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eebc5ef7-2a00-455e-977d-5246935e1638-utilities\") pod \"community-operators-l7lrr\" (UID: \"eebc5ef7-2a00-455e-977d-5246935e1638\") " pod="openshift-marketplace/community-operators-l7lrr"
Nov 22 04:25:16 crc kubenswrapper[4922]: I1122 04:25:16.933389 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eebc5ef7-2a00-455e-977d-5246935e1638-catalog-content\") pod \"community-operators-l7lrr\" (UID: \"eebc5ef7-2a00-455e-977d-5246935e1638\") " pod="openshift-marketplace/community-operators-l7lrr"
Nov 22 04:25:16 crc kubenswrapper[4922]: I1122 04:25:16.933490 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eebc5ef7-2a00-455e-977d-5246935e1638-utilities\") pod \"community-operators-l7lrr\" (UID: \"eebc5ef7-2a00-455e-977d-5246935e1638\") " pod="openshift-marketplace/community-operators-l7lrr"
Nov 22 04:25:16 crc kubenswrapper[4922]: I1122 04:25:16.963724 4922 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qkrq\" (UniqueName: \"kubernetes.io/projected/eebc5ef7-2a00-455e-977d-5246935e1638-kube-api-access-9qkrq\") pod \"community-operators-l7lrr\" (UID: \"eebc5ef7-2a00-455e-977d-5246935e1638\") " pod="openshift-marketplace/community-operators-l7lrr"
Nov 22 04:25:17 crc kubenswrapper[4922]: I1122 04:25:17.064671 4922 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l7lrr"
Nov 22 04:25:17 crc kubenswrapper[4922]: I1122 04:25:17.678673 4922 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l7lrr"]
Nov 22 04:25:17 crc kubenswrapper[4922]: I1122 04:25:17.700977 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l7lrr" event={"ID":"eebc5ef7-2a00-455e-977d-5246935e1638","Type":"ContainerStarted","Data":"ec7d660da981347fffd6c1143785ef2cade39cb15561e144358f4b90d97ef765"}
Nov 22 04:25:18 crc kubenswrapper[4922]: I1122 04:25:18.716694 4922 generic.go:334] "Generic (PLEG): container finished" podID="eebc5ef7-2a00-455e-977d-5246935e1638" containerID="cd90eba126ee858ae8a64ffdf6651fa8e32a31a8ba5a6433b66874315acfbee7" exitCode=0
Nov 22 04:25:18 crc kubenswrapper[4922]: I1122 04:25:18.717482 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l7lrr" event={"ID":"eebc5ef7-2a00-455e-977d-5246935e1638","Type":"ContainerDied","Data":"cd90eba126ee858ae8a64ffdf6651fa8e32a31a8ba5a6433b66874315acfbee7"}
Nov 22 04:25:18 crc kubenswrapper[4922]: I1122 04:25:18.721632 4922 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Nov 22 04:25:19 crc kubenswrapper[4922]: I1122 04:25:19.744270 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l7lrr" event={"ID":"eebc5ef7-2a00-455e-977d-5246935e1638","Type":"ContainerStarted","Data":"2b26e3490edfe5a8252fb083f5a2a7edbdd8e6a0bc1b42c687cef27d363bd751"}
Nov 22 04:25:20 crc kubenswrapper[4922]: I1122 04:25:20.756420 4922 generic.go:334] "Generic (PLEG): container finished" podID="eebc5ef7-2a00-455e-977d-5246935e1638" containerID="2b26e3490edfe5a8252fb083f5a2a7edbdd8e6a0bc1b42c687cef27d363bd751" exitCode=0
Nov 22 04:25:20 crc kubenswrapper[4922]: I1122 04:25:20.756522 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l7lrr" event={"ID":"eebc5ef7-2a00-455e-977d-5246935e1638","Type":"ContainerDied","Data":"2b26e3490edfe5a8252fb083f5a2a7edbdd8e6a0bc1b42c687cef27d363bd751"}
Nov 22 04:25:21 crc kubenswrapper[4922]: I1122 04:25:21.770831 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l7lrr" event={"ID":"eebc5ef7-2a00-455e-977d-5246935e1638","Type":"ContainerStarted","Data":"61f8c0e9da46802be3f187b68b79dbb774abfce4fd2ff8858d40c7ab2eca7101"}
Nov 22 04:25:21 crc kubenswrapper[4922]: I1122 04:25:21.813025 4922 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-l7lrr" podStartSLOduration=3.378121415 podStartE2EDuration="5.812998744s" podCreationTimestamp="2025-11-22 04:25:16 +0000 UTC" firstStartedPulling="2025-11-22 04:25:18.720662923 +0000 UTC m=+5554.759184825" lastFinishedPulling="2025-11-22 04:25:21.155540262 +0000 UTC m=+5557.194062154" observedRunningTime="2025-11-22 04:25:21.805153946 +0000 UTC m=+5557.843675848" watchObservedRunningTime="2025-11-22 04:25:21.812998744 +0000 UTC m=+5557.851520666"
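The startup-latency entry above is internally consistent, and the two durations differ by exactly the image-pull window. podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp: 04:25:21.812998744 - 04:25:16 = 5.812998744 s. podStartSLOduration appears to exclude the pull time; using the monotonic (m=) timestamps: 5.812998744 - (5557.194062154 - 5554.759184825) = 5.812998744 - 2.434877329 = 3.378121415 s, matching the logged value to the last digit.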
Nov 22 04:25:27 crc kubenswrapper[4922]: I1122 04:25:27.065917 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-l7lrr"
Nov 22 04:25:27 crc kubenswrapper[4922]: I1122 04:25:27.066555 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-l7lrr"
Nov 22 04:25:27 crc kubenswrapper[4922]: I1122 04:25:27.124577 4922 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-l7lrr"
Nov 22 04:25:27 crc kubenswrapper[4922]: I1122 04:25:27.919205 4922 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-l7lrr"
Nov 22 04:25:27 crc kubenswrapper[4922]: I1122 04:25:27.985140 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l7lrr"]
Nov 22 04:25:29 crc kubenswrapper[4922]: I1122 04:25:29.864533 4922 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-l7lrr" podUID="eebc5ef7-2a00-455e-977d-5246935e1638" containerName="registry-server" containerID="cri-o://61f8c0e9da46802be3f187b68b79dbb774abfce4fd2ff8858d40c7ab2eca7101" gracePeriod=2
Nov 22 04:25:30 crc kubenswrapper[4922]: I1122 04:25:30.430133 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l7lrr"
Nov 22 04:25:30 crc kubenswrapper[4922]: I1122 04:25:30.547828 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eebc5ef7-2a00-455e-977d-5246935e1638-catalog-content\") pod \"eebc5ef7-2a00-455e-977d-5246935e1638\" (UID: \"eebc5ef7-2a00-455e-977d-5246935e1638\") "
Nov 22 04:25:30 crc kubenswrapper[4922]: I1122 04:25:30.547924 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qkrq\" (UniqueName: \"kubernetes.io/projected/eebc5ef7-2a00-455e-977d-5246935e1638-kube-api-access-9qkrq\") pod \"eebc5ef7-2a00-455e-977d-5246935e1638\" (UID: \"eebc5ef7-2a00-455e-977d-5246935e1638\") "
Nov 22 04:25:30 crc kubenswrapper[4922]: I1122 04:25:30.548111 4922 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eebc5ef7-2a00-455e-977d-5246935e1638-utilities\") pod \"eebc5ef7-2a00-455e-977d-5246935e1638\" (UID: \"eebc5ef7-2a00-455e-977d-5246935e1638\") "
Nov 22 04:25:30 crc kubenswrapper[4922]: I1122 04:25:30.549063 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eebc5ef7-2a00-455e-977d-5246935e1638-utilities" (OuterVolumeSpecName: "utilities") pod "eebc5ef7-2a00-455e-977d-5246935e1638" (UID: "eebc5ef7-2a00-455e-977d-5246935e1638"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 04:25:30 crc kubenswrapper[4922]: I1122 04:25:30.553902 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eebc5ef7-2a00-455e-977d-5246935e1638-kube-api-access-9qkrq" (OuterVolumeSpecName: "kube-api-access-9qkrq") pod "eebc5ef7-2a00-455e-977d-5246935e1638" (UID: "eebc5ef7-2a00-455e-977d-5246935e1638"). InnerVolumeSpecName "kube-api-access-9qkrq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 04:25:30 crc kubenswrapper[4922]: I1122 04:25:30.650792 4922 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qkrq\" (UniqueName: \"kubernetes.io/projected/eebc5ef7-2a00-455e-977d-5246935e1638-kube-api-access-9qkrq\") on node \"crc\" DevicePath \"\""
Nov 22 04:25:30 crc kubenswrapper[4922]: I1122 04:25:30.651166 4922 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eebc5ef7-2a00-455e-977d-5246935e1638-utilities\") on node \"crc\" DevicePath \"\""
Nov 22 04:25:30 crc kubenswrapper[4922]: I1122 04:25:30.881769 4922 generic.go:334] "Generic (PLEG): container finished" podID="eebc5ef7-2a00-455e-977d-5246935e1638" containerID="61f8c0e9da46802be3f187b68b79dbb774abfce4fd2ff8858d40c7ab2eca7101" exitCode=0
Nov 22 04:25:30 crc kubenswrapper[4922]: I1122 04:25:30.881888 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l7lrr" event={"ID":"eebc5ef7-2a00-455e-977d-5246935e1638","Type":"ContainerDied","Data":"61f8c0e9da46802be3f187b68b79dbb774abfce4fd2ff8858d40c7ab2eca7101"}
Nov 22 04:25:30 crc kubenswrapper[4922]: I1122 04:25:30.881923 4922 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l7lrr"
Nov 22 04:25:30 crc kubenswrapper[4922]: I1122 04:25:30.881955 4922 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l7lrr" event={"ID":"eebc5ef7-2a00-455e-977d-5246935e1638","Type":"ContainerDied","Data":"ec7d660da981347fffd6c1143785ef2cade39cb15561e144358f4b90d97ef765"}
Nov 22 04:25:30 crc kubenswrapper[4922]: I1122 04:25:30.882012 4922 scope.go:117] "RemoveContainer" containerID="61f8c0e9da46802be3f187b68b79dbb774abfce4fd2ff8858d40c7ab2eca7101"
Nov 22 04:25:30 crc kubenswrapper[4922]: I1122 04:25:30.926036 4922 scope.go:117] "RemoveContainer" containerID="2b26e3490edfe5a8252fb083f5a2a7edbdd8e6a0bc1b42c687cef27d363bd751"
Nov 22 04:25:30 crc kubenswrapper[4922]: I1122 04:25:30.950281 4922 scope.go:117] "RemoveContainer" containerID="cd90eba126ee858ae8a64ffdf6651fa8e32a31a8ba5a6433b66874315acfbee7"
Nov 22 04:25:30 crc kubenswrapper[4922]: I1122 04:25:30.992924 4922 scope.go:117] "RemoveContainer" containerID="61f8c0e9da46802be3f187b68b79dbb774abfce4fd2ff8858d40c7ab2eca7101"
Nov 22 04:25:30 crc kubenswrapper[4922]: E1122 04:25:30.994321 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61f8c0e9da46802be3f187b68b79dbb774abfce4fd2ff8858d40c7ab2eca7101\": container with ID starting with 61f8c0e9da46802be3f187b68b79dbb774abfce4fd2ff8858d40c7ab2eca7101 not found: ID does not exist" containerID="61f8c0e9da46802be3f187b68b79dbb774abfce4fd2ff8858d40c7ab2eca7101"
Nov 22 04:25:30 crc kubenswrapper[4922]: I1122 04:25:30.994355 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61f8c0e9da46802be3f187b68b79dbb774abfce4fd2ff8858d40c7ab2eca7101"} err="failed to get container status \"61f8c0e9da46802be3f187b68b79dbb774abfce4fd2ff8858d40c7ab2eca7101\": rpc error: code = NotFound desc = could not find container \"61f8c0e9da46802be3f187b68b79dbb774abfce4fd2ff8858d40c7ab2eca7101\": container with ID starting with 61f8c0e9da46802be3f187b68b79dbb774abfce4fd2ff8858d40c7ab2eca7101 not found: ID does not exist"
Nov 22 04:25:30 crc kubenswrapper[4922]: I1122 04:25:30.994380 4922 scope.go:117] "RemoveContainer" containerID="2b26e3490edfe5a8252fb083f5a2a7edbdd8e6a0bc1b42c687cef27d363bd751"
Nov 22 04:25:30 crc kubenswrapper[4922]: E1122 04:25:30.994830 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b26e3490edfe5a8252fb083f5a2a7edbdd8e6a0bc1b42c687cef27d363bd751\": container with ID starting with 2b26e3490edfe5a8252fb083f5a2a7edbdd8e6a0bc1b42c687cef27d363bd751 not found: ID does not exist" containerID="2b26e3490edfe5a8252fb083f5a2a7edbdd8e6a0bc1b42c687cef27d363bd751"
Nov 22 04:25:30 crc kubenswrapper[4922]: I1122 04:25:30.994874 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b26e3490edfe5a8252fb083f5a2a7edbdd8e6a0bc1b42c687cef27d363bd751"} err="failed to get container status \"2b26e3490edfe5a8252fb083f5a2a7edbdd8e6a0bc1b42c687cef27d363bd751\": rpc error: code = NotFound desc = could not find container \"2b26e3490edfe5a8252fb083f5a2a7edbdd8e6a0bc1b42c687cef27d363bd751\": container with ID starting with 2b26e3490edfe5a8252fb083f5a2a7edbdd8e6a0bc1b42c687cef27d363bd751 not found: ID does not exist"
Nov 22 04:25:30 crc kubenswrapper[4922]: I1122 04:25:30.994896 4922 scope.go:117] "RemoveContainer" containerID="cd90eba126ee858ae8a64ffdf6651fa8e32a31a8ba5a6433b66874315acfbee7"
Nov 22 04:25:30 crc kubenswrapper[4922]: E1122 04:25:30.995319 4922 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd90eba126ee858ae8a64ffdf6651fa8e32a31a8ba5a6433b66874315acfbee7\": container with ID starting with cd90eba126ee858ae8a64ffdf6651fa8e32a31a8ba5a6433b66874315acfbee7 not found: ID does not exist" containerID="cd90eba126ee858ae8a64ffdf6651fa8e32a31a8ba5a6433b66874315acfbee7"
Nov 22 04:25:30 crc kubenswrapper[4922]: I1122 04:25:30.995404 4922 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd90eba126ee858ae8a64ffdf6651fa8e32a31a8ba5a6433b66874315acfbee7"} err="failed to get container status \"cd90eba126ee858ae8a64ffdf6651fa8e32a31a8ba5a6433b66874315acfbee7\": rpc error: code = NotFound desc = could not find container \"cd90eba126ee858ae8a64ffdf6651fa8e32a31a8ba5a6433b66874315acfbee7\": container with ID starting with cd90eba126ee858ae8a64ffdf6651fa8e32a31a8ba5a6433b66874315acfbee7 not found: ID does not exist"
Nov 22 04:25:31 crc kubenswrapper[4922]: I1122 04:25:31.914645 4922 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eebc5ef7-2a00-455e-977d-5246935e1638-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eebc5ef7-2a00-455e-977d-5246935e1638" (UID: "eebc5ef7-2a00-455e-977d-5246935e1638"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 04:25:31 crc kubenswrapper[4922]: I1122 04:25:31.979069 4922 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eebc5ef7-2a00-455e-977d-5246935e1638-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 22 04:25:32 crc kubenswrapper[4922]: I1122 04:25:32.146129 4922 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l7lrr"]
Nov 22 04:25:32 crc kubenswrapper[4922]: I1122 04:25:32.157117 4922 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-l7lrr"]
Nov 22 04:25:33 crc kubenswrapper[4922]: I1122 04:25:33.317594 4922 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eebc5ef7-2a00-455e-977d-5246935e1638" path="/var/lib/kubelet/pods/eebc5ef7-2a00-455e-977d-5246935e1638/volumes"
Nov 22 04:25:41 crc kubenswrapper[4922]: I1122 04:25:41.110252 4922 patch_prober.go:28] interesting pod/machine-config-daemon-b9j6n container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 04:25:41 crc kubenswrapper[4922]: I1122 04:25:41.110902 4922 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-b9j6n" podUID="402683b1-a29f-4a79-a36c-daf6e8068d0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"